Formal Analysis of Models for the Dynamics of Trust based on Experiences

Catholijn M. Jonker and Jan Treur
Vrije Universiteit Amsterdam, Department of Artificial Intelligence
De Boelelaan 1081, 1081 HV Amsterdam, The Netherlands
Email: {jonker, treur}@cs.vu.nl
URL: http://www.cs.vu.nl/{~jonker,~treur}

Abstract. The aim of this paper is to analyse and formalise the dynamics of trust in the light of experiences. A formal framework is introduced for the analysis and specification of models for trust evolution and trust update. Different properties of these models are formally defined.

1 Introduction

Trust is the attitude an agent has with respect to the dependability or capabilities of some other agent (possibly itself) or with respect to the turn of events. The agent might, for example, trust that the statements made by another agent are true. The agent might trust the commitment of another agent with respect to a certain (joint) goal. The agent might trust that another agent is capable of performing certain tasks, or trust itself to be able to perform some tasks. The agent might trust that the current state of affairs will lead to a state of affairs that is agreeable to its own intentions, goals, commitments, or desires. In [1], [2] the importance of the notion of trust is shown for agents, multi-agent systems, and their foundations. From the viewpoint of the users of agent systems, Ousterhout [10] makes clear that work can only be delegated to such systems if they can be trusted without a constant need for inspection of their work. Elofson [4] states that the reach and effect of trust in the affairs of individuals and organizations is pervasive, but also that trust is somewhat elusive: difficult to define, difficult to create, and difficult to measure. Before focusing on the difficulties regarding the creation and measurement of trust, a brief survey of definitions of trust is given; for more information, see [4], [5].
Trust of an agent in another agent (social trust) is sometimes defined as a kind of binary property; for example, an agent A trusting another agent B means that A believes that B will act in a way that is favorable to A, even though that act might not be the most convenient for B at that moment [5]. A shorter variant is that of Demolombe [3]: “We can understand trust as an attitude of an agent who believes that another agent has a given property.” Another definition describes trust as a subjective probability [5]. Common to these definitions is that the trusting agent A has a specific interest in the actions of the agent B that is trusted by A, and that B will act with respect to this interest even though it might seem that doing so is not