"The future of artificial intelligence at work: A review on effects of decision automation and augmentation on workers targeted by algorithms and third-party observers"
by Langer, Markus; Landers, Richard N (2021)
Abstract
Advances in artificial intelligence are increasingly leading to the automation and augmentation of decision processes in work contexts. Although research originally focused on decision-makers, the perspectives of those targeted by automated or augmented decisions (whom we call “second parties”) and of parties who observe the effects of such decisions (whom we call “third parties”) are now gaining importance and attention. We review the expanding literature investigating reactions to automated and augmented decision-making by second and third parties. Specifically, we explore attitude (e.g., evaluations of trustworthiness), perception (e.g., fairness perceptions), and behavior (e.g., reverse engineering of automated decision processes) outcomes of second and third parties. Additionally, we explore how characteristics of the a) decision-making process, b) system, c) second and third party, d) task, and e) outputs and outcomes moderate these effects, and provide recommendations for future research. Our review summarizes the state of the literature in these domains, concluding a) that reactions to automated decisions differ across situations in which there is remaining human decision control (i.e., augmentation contexts), b) that system design choices (e.g., transparency) are important but underresearched, and c) that the generalizability of findings might suffer from excessive reliance on specific research methodologies (e.g., vignette studies).
Keywords
Automated And Augmented Decision-Making, Artificial Intelligence, Algorithmic Decision-Making, Perceptions, Attitudes, Review Paper
Themes
Automation
Links to Reference
- https://www.sciencedirect.com/science/article/pii/S0747563221002016
- http://dx.doi.org/10.1016/j.chb.2021.106878
- https://www.sciencedirect.com/science/article/abs/pii/S0747563221002016?via%3Dihub