Abstract |
---|
The behaviour of a norm-driven agent is governed by obligations, permissions and prohibitions. Agents joining a society or accepting a contract for the purpose of executing specific collaborative tasks usually have to adopt norms representing certain rules and regulations. Adoption of norms can cause problems: an agent may already hold norms that conflict or are inconsistent with newly adopted ones. How can it be shown that the set of norms is consistent, so that the agent can act according to the ideals the norms specify? In general, the answer to such a question in a real-world situation is not simple. This paper addresses the problem of finding a pragmatic solution to norm consistency checking for practical reasoning agents in the context of the NoA Normative Agent Architecture. |
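The kind of clash the abstract describes can be illustrated with a minimal sketch, assuming a norm is simply a (modality, action) pair and that a conflict arises when one norm obliges or permits an action that another forbids. Both the representation and the conflict rule here are illustrative assumptions, not the NoA formalism itself.

```python
# Illustrative norm representation: (modality, action) pairs.
OBLIGED, PERMITTED, FORBIDDEN = "obliged", "permitted", "forbidden"

def conflicts(norm_a, norm_b):
    """Two norms over the same action conflict if one demands or
    allows the action while the other forbids it (assumed rule)."""
    mod_a, act_a = norm_a
    mod_b, act_b = norm_b
    if act_a != act_b:
        return False
    return {mod_a, mod_b} in ({OBLIGED, FORBIDDEN},
                              {PERMITTED, FORBIDDEN})

def consistent(norms):
    """A norm set is consistent if no pair of its norms conflicts."""
    return not any(conflicts(a, b)
                   for i, a in enumerate(norms)
                   for b in norms[i + 1:])

# An agent holding these norms adopts a contract forbidding data sharing:
held = [(OBLIGED, "deliver_report"), (PERMITTED, "share_data")]
adopted = (FORBIDDEN, "share_data")
print(consistent(held + [adopted]))  # False: permission vs prohibition clash
```

A practical reasoning agent would of course need a richer representation (conditions, deadlines, roles), but even this pairwise check shows why adopting a new norm requires re-checking the whole held set.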
Year | DOI | Venue
---|---|---
2003 | 10.1007/978-3-540-25936-7_9 | Lecture Notes in Artificial Intelligence

Keywords | Field | DocType
---|---|---
agent architecture, practical reasoning, pragmatics, contract, artificial intelligence, regulation | Autonomous agent, Pragmatics, Practical reason, Computer science, Normative, Norm (social), Agent architecture, Contract management, Strong consistency, Distributed computing | Conference

Volume | ISSN | Citations
---|---|---
3067 | 0302-9743 | 8

PageRank | References | Authors
---|---|---
0.69 | 15 | 2
Name | Order | Citations | PageRank
---|---|---|---
Martin J. Kollingbaum | 1 | 390 | 33.38 |
Timothy J. Norman | 2 | 1417 | 140.04 |