Automated Decision Making and the Right to Explanation: The Right of Access as Ex Post Information

Emiliano Troisi

Abstract


The data sets, the processes that determine an algorithmic decision, and the rationale behind an automated decision affecting the legal sphere of a natural person should be traceable, transparent, and explained, not least so that the individual affected can challenge the contents of an unfair decision. In practice, they rarely are: either by choice, for reasons of competition and protection of know-how, or because of technological limitations. The latter is the case of the algorithms aptly referred to as 'black boxes': systems whose inferential mechanisms are not (completely) predictable ex ante or which, in any case, do not always make it possible to explain why an automated decision-making model has produced a particular outcome (and what combination of factors contributed to it). Having affirmed the existence of an ethical duty of algorithmic transparency and of explanation of the (individual) decision reached by automated means, this paper asks whether a corresponding right exists at the level of positive law, and what its limits are, both legal and technological in nature. Critically drawing on the leading scholarly opinions on the subject, the right to explanation of automated decision-making is identified, within the GDPR, with the right of access under Article 15 of the Regulation.

Keywords


Automated Decision Making; algorithm; opacity; explanation; right of access; GDPR.


