{{Short description|Biased assessment of an algorithm}}
{{multiple issues|
{{essay|date=October 2023}}
{{Orphan|date=October 2023}}
}}
'''Algorithm aversion''' is "biased assessment of an algorithm which manifests in negative behaviours and attitudes towards the algorithm compared to a human agent."<ref name=":0">{{Cite journal|last1=Jussupow|first1=Ekaterina|last2=Benbasat|first2=Izak|last3=Heinzl|first3=Armin|date=2020|title=Why Are We Averse Towards Algorithms? A Comprehensive Literature Review on Algorithm Aversion|url=https://aisel.aisnet.org/ecis2020_rp/168/|journal=Twenty-Eighth European Conference on Information Systems (ECIS2020)|pages=1–16}}</ref> It describes a phenomenon where humans reject advice from an algorithm in a case where they would accept the same advice if they thought it was coming from another human.
[[Algorithm]]s, such as those employing [[machine learning]] methods or various forms of [[artificial intelligence]], are commonly used to provide recommendations or advice to human decision-makers.
This is an emerging topic and it is not completely clear why or under what circumstances people will display algorithm aversion. In some cases, people seem to be more likely to take recommendations from an algorithm than from a human, a phenomenon called ''algorithm appreciation''.<ref name=":1">{{Cite journal|date=2019-03-01|title=Algorithm appreciation: People prefer algorithmic to human judgment|url=https://www.sciencedirect.com/science/article/abs/pii/S0749597818303388|journal=Organizational Behavior and Human Decision Processes|language=en|volume=151|pages=90–103|doi=10.1016/j.obhdp.2018.12.005|issn=0749-5978|last1=Logg|first1=Jennifer M.|last2=Minson|first2=Julia A.|last3=Moore|first3=Don A.}}</ref>
=== Decision control ===
Algorithms may be used either in an ''advisory'' role (providing advice to a human who makes the final decision) or in a ''delegatory'' role (making a decision without human supervision). A movie recommendation system providing a list of suggestions would be in an ''advisory'' role, whereas a human driver ''delegates'' the task of steering the car to [[Tesla Autopilot|Tesla's Autopilot]]. Generally, a lack of decision control tends to increase algorithm aversion.{{citation needed|date=October 2023}}
=== Perceptions about algorithm capabilities and performance ===
==== Algorithm Process and the role of system transparency ====
One reason people display resistance to algorithms is a lack of understanding about how the algorithm is arriving at its recommendation.<ref name=":2" /> People also seem to have a better intuition for how another human would make recommendations. Whereas people assume that other humans will account for unique differences between situations, they sometimes perceive algorithms as incapable of considering individual differences and resist the algorithms accordingly.<ref>{{Cite journal|last1=Longoni|first1=Chiara|last2=Bonezzi|first2=Andrea|last3=Morewedge|first3=Carey K|date=2019-05-03|title=Resistance to Medical Artificial Intelligence|journal=Journal of Consumer Research|volume=46|issue=4|pages=629–650|doi=10.1093/jcr/ucz013|issn=0093-5301|doi-access=free}}</ref>
==== Decision ___domain ====
==== Culture ====
Different cultural norms and influences may cause people to respond to algorithmic recommendations differently. The way that recommendations are presented (e.g., language, tone, etc.) may cause people to respond differently.{{citation needed|date=October 2023}}
==== Age ====
Age is a commonly cited factor hypothesized to affect whether people accept algorithmic recommendations: [[Digital natives|digital natives]], who have grown up with digital technology, may respond differently from digital immigrants, who have not. For example, one study found that trust in an algorithmic financial advisor was lower among older people compared with younger study participants.<ref>{{Cite journal|date=2020-02-01|title=Whose Algorithm Says So: The Relationships Between Type of Firm, Perceptions of Trust and Expertise, and the Acceptance of Financial Robo-Advice|url=https://www.sciencedirect.com/science/article/pii/S1094996819301112|journal=Journal of Interactive Marketing|language=en|volume=49|pages=107–124|doi=10.1016/j.intmar.2019.10.003|issn=1094-9968|hdl=1765/123799|hdl-access=free|last1=Lourenço|first1=Carlos J.S.|last2=Dellaert|first2=Benedict G.C.|last3=Donkers|first3=Bas|s2cid=211029562}}</ref> However, other research has found that algorithm aversion does not vary with age.<ref name=":1" />
== Proposed methods to overcome algorithm aversion ==