Development and testing of psychological conflict resolution strategies for assertive robots to resolve human–robot goal conflict
peer-reviewed
First publication
2021-01-26
Authors
Babel, Franziska
Kraus, Johannes Maria
Baumann, Martin
Scientific article
Published in
Frontiers in Robotics and AI; 7 (2021). Art. no. 591448. eISSN 2296-9144
Link to original publication
https://dx.doi.org/10.3389/frobt.2020.591448
Faculties
Fakultät für Ingenieurwissenschaften, Informatik und Psychologie
Institutions
Institut für Psychologie und Pädagogik
Document version
published version (publisher's PDF)
Abstract
As service robots become increasingly autonomous and follow their own task-related
goals, human-robot conflicts seem inevitable, especially in shared spaces. Goal conflicts
can arise in situations ranging from simple trajectory planning to complex task
prioritization. For successful human-robot goal-conflict resolution, humans and robots
need to negotiate their goals and priorities. For this, the robot might be equipped with
conflict resolution strategies that are assertive and effective yet still accepted by the user. In this paper,
conflict resolution strategies for service robots (public cleaning robot, home assistant
robot) are developed by transferring psychological concepts (e.g., negotiation,
cooperation) to HRI. Altogether, fifteen strategies were grouped by the expected
affective outcome (positive, neutral, negative). In two online experiments, the
acceptability of and compliance with these conflict resolution strategies were tested
with humanoid and mechanoid robots in two application contexts (public: n1 = 61;
private: n2 = 93). As a baseline for comparison, the strategies were also applied by a
human agent. As additional outcomes, trust, fear, arousal, and valence, as well as the
perceived politeness of the agent were assessed. The positive/neutral strategies were found to be
more acceptable and effective than negative strategies. Some negative strategies
(i.e., threat, command) even led to reactance and fear. Some strategies were only
positively evaluated and effective for certain agents (human or robot) or only
acceptable in one of the two application contexts (i.e., approach, empathy). In the
public context, predictors of strategy acceptance and compliance were identified:
acceptance was predicted by politeness and trust, whereas compliance was predicted by interpersonal power.
Taken together, psychological conflict resolution strategies can be applied in HRI to
enhance robot task effectiveness. If applied in a robot-specific and context-sensitive
manner, they are accepted by the user. The contribution of this paper is twofold: conflict resolution
strategies based on Human Factors and Social Psychology are introduced and empirically
evaluated in two online studies for two application contexts. Influencing factors and
requirements for the acceptance and effectiveness of robot assertiveness are discussed.
Publication funding
Open access funding by Universität Ulm
Is supplemented by
https://www.frontiersin.org/articles/10.3389/frobt.2020.591448/full#supplementary-material
Subject headings
[GND]: Akzeptanz
[LCSH]: Social acceptance | Trust
[Free subject headings]: HRI strategies | Robot assertiveness | Persuasive robots | User compliance | Acceptance
[DDC subject group]: DDC 620 / Engineering & allied operations
DOI & citation
Please use this identifier to cite or link to this item: http://dx.doi.org/10.18725/OPARU-34980
Babel, Franziska; Kraus, Johannes Maria; Baumann, Martin (2021): Development and testing of psychological conflict resolution strategies for assertive robots to resolve human–robot goal conflict. Open Access Repositorium der Universität Ulm und Technischen Hochschule Ulm. http://dx.doi.org/10.18725/OPARU-34980