Having Your Day in Robot Court
Title | Having Your Day in Robot Court |
---|---|
Authors | Chen, MB; Stremitzer, A; Tobia, K |
Keywords | Artificial intelligence; Procedural justice; Courts; Judges; Algorithms |
Issue Date | 2022 |
Publisher | University of Hong Kong, Faculty of Law. |
Citation | University of Hong Kong Faculty of Law Research Paper, 2021/020 |
Abstract | Should machines be judges? Some say no, arguing that citizens would see robot-led legal proceedings as procedurally unfair because “having your day in court” is having another human adjudicate your claims. Prior research established that people obey the law in part because they see it as procedurally just. The introduction of artificially intelligent (AI) judges could therefore undermine sentiments of justice and legal compliance if citizens intuitively take machine-adjudicated proceedings to be less fair than the human-adjudicated status quo. Two original experiments show that ordinary people share this intuition. There is a perceived “human-AI fairness gap.” However, it is also possible to reduce — and perhaps even eliminate — the fairness gap through “algorithmic offsetting.” Affording a hearing before AI judges and enhancing the interpretability of AI-rendered decisions reduce the human-AI fairness gap. Moreover, the procedural justice advantage of a human over AI appears to be driven more by beliefs about the accuracy of the outcome and thoroughness of consideration, rather than doubts about whether a party felt it had a good opportunity to voice its opinions or whether the judge understood the perspective of the litigant. The results support a common and fundamental objection to robot judges: There is a concerning human-AI fairness gap. Yet, the results also indicate that the strongest version of this challenge — human judges have irreducible procedural fairness advantages — is not reflected in public views. In some circumstances, people see a day in robot court as no less fair than a day in human court. |
Persistent Identifier | http://hdl.handle.net/10722/323476 |
SSRN | 3841534 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, MB | - |
dc.contributor.author | Stremitzer, A | - |
dc.contributor.author | Tobia, K | - |
dc.date.accessioned | 2023-01-06T06:48:04Z | - |
dc.date.available | 2023-01-06T06:48:04Z | - |
dc.date.issued | 2022 | - |
dc.identifier.citation | University of Hong Kong Faculty of Law Research Paper, 2021/020 | - |
dc.identifier.uri | http://hdl.handle.net/10722/323476 | - |
dc.language | eng | - |
dc.publisher | University of Hong Kong, Faculty of Law. | - |
dc.relation.ispartof | University of Hong Kong Faculty of Law Research Paper | - |
dc.subject | Artificial intelligence | - |
dc.subject | Procedural justice | - |
dc.subject | Courts | - |
dc.subject | Judges | - |
dc.subject | Algorithms | - |
dc.title | Having Your Day in Robot Court | - |
dc.type | Others | - |
dc.identifier.email | Chen, MB: benched@hku.hk | - |
dc.identifier.email | Stremitzer, A: a_stremitzer@yahoo.com | - |
dc.identifier.authority | Chen, MB=rp02689 | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.hkuros | 700004140 | - |
dc.publisher.place | Hong Kong, China | - |
dc.identifier.ssrn | 3841534 | - |
dc.identifier.hkulrp | 2021/020 | - |