A norm trust framework for norms adoption in normative multi-agent systems
Format: Article
Published: Penerbit UTM Press, 2023
Summary: Researchers in normative multi-agent systems have emphasized the importance of equipping agents with the ability to detect and learn the norms of a new environment. They propose active learning approaches and show that agents are capable of detecting norms using these approaches. However, most of their work involves agents that detect a single norm in an event. We argue that such approaches do not help agents decide when coexisting norms are detected in an event. To address this problem, we introduce the concept of norms trust to help agents decide which detected norms are credible in a new environment. In this paper, we propose a conceptual norms trust framework that infers norms trust through a two-tier assessment: credible agent evaluation and norms trust assessment. Norms trust assessment is based on the filter factors of norm adoption ratio, norm adoption risk, and norm salience. The framework assesses a norms trust value for each detected norm, which the agent then uses to decide whether to merely emulate or fully internalize the detected norms. © 2015 Penerbit UTM Press. All rights reserved.
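
As a rough illustration only: the abstract describes the framework conceptually and does not prescribe value ranges, weights, or thresholds, so the sketch below is a hypothetical Python rendering of how the three filter factors (norm adoption ratio, norm adoption risk, norm salience) might be combined into a trust value that drives the emulate-versus-internalize decision. All names, the weighted-sum scoring, and the threshold are assumptions, not the authors' method.

```python
from dataclasses import dataclass


@dataclass
class DetectedNorm:
    """A norm detected in an event, with the filter factors named in the abstract.

    Field names and 0..1 value ranges are hypothetical stand-ins; the paper
    does not specify concrete representations.
    """
    name: str
    adoption_ratio: float  # fraction of credible agents observed following the norm
    adoption_risk: float   # estimated risk of adopting the norm (higher = riskier)
    salience: float        # how salient the norm appears in the environment


def norms_trust_value(norm: DetectedNorm,
                      w_ratio: float = 0.4,
                      w_risk: float = 0.3,
                      w_salience: float = 0.3) -> float:
    """Combine the three filter factors into a single trust value in [0, 1].

    A simple weighted sum is assumed purely for illustration; risk counts
    against trust, so it enters as (1 - adoption_risk).
    """
    return (w_ratio * norm.adoption_ratio
            + w_risk * (1.0 - norm.adoption_risk)
            + w_salience * norm.salience)


def adoption_decision(norm: DetectedNorm, internalize_threshold: float = 0.7) -> str:
    """Decide whether to merely emulate or fully internalize a detected norm.

    The threshold is an assumption; the framework only states that the trust
    value guides this decision.
    """
    return "internalize" if norms_trust_value(norm) >= internalize_threshold else "emulate"


if __name__ == "__main__":
    # Two coexisting norms detected in the same event.
    norms = [
        DetectedNorm("queue-at-counter", adoption_ratio=0.9, adoption_risk=0.1, salience=0.8),
        DetectedNorm("tip-the-server", adoption_ratio=0.4, adoption_risk=0.5, salience=0.3),
    ]
    for n in norms:
        print(n.name, round(norms_trust_value(n), 2), adoption_decision(n))
```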