Human intervention in automated decision-making

Toward the construction of contestable systems

Marco Almada, BSCS, MCompEng

Researcher, Lawgorithm

LL.B. student, University of São Paulo Law School

me@marcoalmada.com


Overview

  • What is human intervention?
  • When is a decision subject to intervention?
  • What is the purpose of human intervention?
  • How may intervention fail?
  • Designing contestable systems

Automated decision-making and data protection laws

GDPR Article 22(1):

The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

  • Article 22(2) provides some exceptions.

The right to human intervention

GDPR Article 22(3):

In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

Decisions based solely on automated data processing

Clear-cut case: automated decision-making

  • No humans present in the decision loop
  • “Automated decision-making”, however, should be understood as shorthand rather than as an exhaustive description
  • Some decisions can be based solely on automated data processing even when humans are involved

Human decisions based solely on automated data processing

  • Example: rubber-stamping (see, e.g., Brkan 2017)
    • An algorithm might provide information and a set of choices to a human
    • The human decider simply picks the best-ranked option
  • Excluding this sort of decision from the scope of the right to human intervention would open the door to loopholes

A more complicated case

  • Instead of rubber-stamping, the human decider now makes a deliberate choice between scenarios, drawing on their own knowledge.
  • Is this still a decision based solely on automated data processing?
    • Yes, if the decider relies only on factual information supplied by the algorithm
    • The decider cannot alter the content of the decision, only pick among pre-set options: a choose-your-own-adventure scenario.

The purpose of human intervention

  • The interpretation above gives the right to intervention a broad reach.
  • But why would such a broad reach be desirable?
    • Intervention as quality control for decisions (e.g. the Petrov incident)
    • Intervention as a defence of human dignity (cf. Hildebrandt 2019)
  • In both cases, intervention is valuable to the extent that it protects rights, liberties, and interests.

Modes of failure for human intervention

  • Data subjects might not be able to request intervention
    • Lack of information (see Ohm 2018)
    • Lack of means
  • Intervention failures
    • Ineffective intervention
    • Harmful intervention

Requesting human intervention

To request an intervention, data subjects must:

  • Know that they are affected by an automated decision
  • Know how they are being affected
  • Have adequate means for requesting intervention

Design approaches can help meet these requirements; the sketch below illustrates one way.
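
As a purely illustrative sketch (in Python, with hypothetical names such as DecisionRecord and request_intervention, none of which come from the talk), a system could attach a record like the following to each automated decision, so that data subjects can learn that they were affected, how they were affected, and where to contest:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class DecisionRecord:
        """Hypothetical record attached to every automated decision.

        It stores the information a data subject needs in order to
        request human intervention: that a decision was made, how it
        affects them, and the means to contest it.
        """
        subject_id: str                               # whom the decision concerns
        outcome: str                                  # e.g. "credit application denied"
        explanation: str                              # how the subject is affected
        solely_automated: bool = True                 # no meaningful human involvement?
        contest_channel: str = "/decisions/contest"   # means for requesting intervention
        made_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
        reviewer: Optional[str] = None                # human intervenor, once assigned

    def request_intervention(record: DecisionRecord, reviewer: str) -> None:
        """Assign a human reviewer to a contested decision."""
        record.reviewer = reviewer

A record of this kind would support both notification (letting the subject know that and how they are affected) and the means for requesting intervention; the fields and names here are assumptions for illustration, not a prescribed schema.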

Replacing machines with humans

  • In many cases, a trustworthy, competent human would likely produce better results than an automated system.
  • How do we avoid a biased or incompetent intervenor?
    • Short run: individual liability
    • Long run: holding human intervenors to the same standards applied to automated decisions

Contestability by design

  • Building contestable systems can be difficult
  • Why bother, then?
    • Ethical requirement
    • Legal requirement: GDPR Articles 22(3) and 25(1)

GDPR Article 25(1)

[…] the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures […] which are designed to implement data-protection principles […] and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

Contestability by design (CbD) and privacy by design (PbD)

  • Designing contestable systems cannot be subsumed under privacy by design
    • PbD directly protects a value (privacy)
    • CbD establishes an instrument (the ability to contest decisions)
    • CbD may benefit from PbD…
    • …but the two may also clash (e.g., retaining decision records for contestation may sit in tension with data minimisation)

Closing remarks

  • Human intervention creates new informational and organizational requirements for automated decision-making systems.
  • The technical solutions mentioned here make no claim to exhaust the topic.
    • Rather, they show how those requirements might be tackled from the earliest stages of system design.
    • Doing so requires drawing on both new and established approaches.

Thank you!