Research Article

Justice, trust, and moral judgements when personnel selection is supported by algorithms

Pages 130-145 | Received 03 Jan 2022, Accepted 11 Jan 2023, Published online: 20 Feb 2023
ABSTRACT

Although algorithm-based systems are increasingly used as decision support for managers, research is still lacking on the effects of algorithm use, and more specifically of potential algorithmic bias, on decision-makers. To investigate how potential social bias in a recommendation outcome influences trust, fairness perceptions, and moral judgement, we used a moral dilemma scenario. Participants (N = 215) imagined being human resource managers responsible for personnel selection who received decision support from either human colleagues or an algorithm-based system. They received an applicant preselection that was either gender-balanced or predominantly male. Although participants perceived algorithm-based support as less biased, they also perceived it as generally less fair and trusted it less. This may be related to the finding that participants perceived algorithm-based systems as more consistent but also as less likely to uphold moral standards. Moreover, participants tended to reject algorithm-based preselections more often than human-based ones and were more likely to give utilitarian judgements when accepting them, which may indicate different underlying moral judgement processes.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1. Preregistered under. In the preregistration, we only assumed trust to be higher for the human decision-support agent. However, recent literature (e.g., Newman et al., Citation2020) implies possible effects for procedural justice and fairness as well. Additionally, this implies a higher acceptance rate for a preselection coming from a human. We also preregistered hypotheses for further variables; for readability, we decided not to report results for these concepts. The raw data will be made available upon publication, and results can be made available upon request.

2. In this study, we captured moral judgements (deontological vs. utilitarian) in two ways. First, we used a validated scale from Tanner et al. (Citation2008) in which participants rate the importance of different deontological and utilitarian reasons for a decision (here, acceptance or rejection of the preselection recommendation). However, since these are abstract reasons, participants were also asked to state the main reason for their decision, which was subsequently assigned to the category “deontological” or “utilitarian.”

3. The moral-judgement hypotheses were preregistered in general terms (i.e., for both types of measurement combined). To generate a better understanding, we specify them here for the different measurements. In addition, because we assume that rejection of the preselection is based on deontological reasons for both human and algorithm-based support, we added hypotheses 4b and 5b, which follow logically from the previous assumptions.

4. We chose this job because, in Germany, similar numbers of men and women apply for it (Statistisches Bundesamt, Citation2019).

5. In addition to those mentioned, we captured further measures in the study (perceived quality, efficiency, understandability, satisfaction, perceived prejudice, degree of perceived discrimination, general moral orientation, general affinity for technology). We decided not to report these measures either for readability or because we had included them only for exploratory purposes.

Additional information

Funding

No funds, grants, or other support was received.

