What factors affect trust in human-robot interaction?

精选 · 2013.06.13

Researchers in the United States quantitatively assessed how three broad categories of factors (human, robot, and environmental) affect trust in human-robot interaction. They found that factors related to the robot itself, especially its performance (e.g., reliability, false-alarm rate, and failure rate), had a large effect on trust, whereas robot attribute-based factors (e.g., proximity, robot personality, and anthropomorphism) had only comparatively small effects. In addition, environmental factors had a moderate effect on trust, and there was little evidence that human-related factors influence trust.

Factors affecting trust in human-robot interaction

【Article】A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction

Objective: We evaluate and quantify the effects of human, robot, and environmental factors on perceived trust in human-robot interaction (HRI).

Background: To date, reviews of trust in HRI have been qualitative or descriptive. Our quantitative review provides a fundamental empirical foundation to advance both theory and practice.

Method: Meta-analytic methods were applied to the available literature on trust and HRI. A total of 29 empirical studies were collected, of which 10 met the selection criteria for correlational analysis and 11 for experimental analysis. These studies provided 69 correlational and 47 experimental effect sizes.

Results: The overall correlational effect size for trust was r̄ = +0.26, with an experimental effect size of d̄ = +0.71. The effects of human, robot, and environmental characteristics were examined with an especial evaluation of the robot dimensions of performance and attribute-based factors. The robot performance and attributes were the largest contributors to the development of trust in HRI. Environmental factors played only a moderate role.

Conclusion: Factors related to the robot itself, specifically, its performance, had the greatest current association with trust, and environmental factors were moderately associated. There was little evidence for effects of human-related factors.

Application: The findings provide quantitative estimates of human, robot, and environmental factors influencing HRI trust. Specifically, the current summary provides effect size estimates that are useful in establishing design and training guidelines with reference to robot-related factors of HRI trust. Furthermore, results indicate that improper trust calibration may be mitigated by the manipulation of robot design. However, many future research needs are identified.

Keywords: trust, trust development, robotics, human-robot team
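As a rough illustration of the effect-size aggregation reported in the Results above, the Python sketch below shows one common way to pool per-study correlations into a mean effect size such as r̄: transform each correlation with Fisher's z, take a sample-size-weighted mean, and back-transform. The study values here are hypothetical, and this is not necessarily the exact procedure used by the paper's authors.

```python
# Illustrative sketch only: pooling per-study correlations into a mean
# effect size via Fisher's z transform with sample-size weights.
# The (r, n) pairs below are made-up example data, not values from the paper.
import math

# Hypothetical per-study correlations and sample sizes
studies = [(0.30, 40), (0.15, 60), (0.35, 25), (0.22, 80)]

def fisher_z(r):
    """Fisher r-to-z transform."""
    return 0.5 * math.log((1 + r) / (1 - r))

def inverse_fisher_z(z):
    """Back-transform z to r."""
    return (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)

# Weight each study's z by n - 3, the approximate inverse variance of z
num = sum((n - 3) * fisher_z(r) for r, n in studies)
den = sum(n - 3 for _, n in studies)
mean_z = num / den

print(f"pooled mean correlation r_bar = {inverse_fisher_z(mean_z):+.2f}")
```

Weighting each transformed correlation by n - 3 approximates inverse-variance weighting, so larger studies contribute more to the pooled estimate; back-transforming the weighted mean z then yields the mean correlational effect size.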

 

【Authors】Peter A. Hancock, Deborah R. Billings, and Kristin E. Schaefer, University of Central Florida; Jessie Y. C. Chen, U.S. Army Research Laboratory; Ewart J. de Visser and Raja Parasuraman, George Mason University

【Source】Human Factors: The Journal of the Human Factors and Ergonomics Society

【Original link】http://hfs.sagepub.com/content/53/5/517

This article was published on 钛媒体 (TMTPost) with authorization from the author, 精选, and was edited by TMTPost. When reprinting, please credit the source and the author and include a link to this article.