
Bordoni, F. (2024). Using GDPR to counter unfair hiring algorithms: strategies for jobseekers and trade unions. Presented at: 2024 Future of Work Conference, Brussels, Belgium.

Using GDPR to counter unfair hiring algorithms: strategies for jobseekers and trade unions

Bordoni, F.
2024

Abstract

In recent years, artificial intelligence systems have been increasingly deployed by private firms in their hiring processes. Whether these tools are used for parsing CVs, for conducting and analyzing interviews, or for assessing and scoring candidates, they all raise serious concerns about fairness and accountability. AI hiring systems may in fact be biased and, as a result, lead to discriminatory hiring practices against protected groups of jobseekers. At the same time, these tools may rely on arbitrary criteria to assess job applicants and, even without involving any protected ground of discrimination, unduly penalize jobseekers on account of their specific socio-economic condition. What is more, hiring processes based on AI can be more pervasive, opaque, and difficult for individual workers and trade unions to scrutinize, causing the few existing rules applicable to personnel selection (a pre-contractual relationship typically left to the discretion of the employer) to fall entirely short of their aim. Yet unfair hiring processes directly restrict the right to access employment and impact other social rights connected to the condition of employment. Is there any possible strategy for rejected jobseekers, or for the trade unions representing them, to challenge an unfair hiring algorithm? The question is largely neglected in the legal literature and deserves proper attention, especially since intermediation in today’s labour markets is increasingly being outsourced to automated decision-making systems. Building on existing legislation, case law, and collective agreements, the paper explores possible new paths for individual workers, focusing in particular on the partially untapped potential of the GDPR, since all of these problematic AI systems need personal data to work. Relevant data protection rights include the right to rectification (art. 16), the right to erasure (art. 17), and the right not to be subject to a decision based solely on automated processing (art. 22). In the end, it is possible to draw some tentative conclusions: first, the GDPR establishes core data protection rights that are crucial for jobseekers to ensure that hiring processes are fair and non-discriminatory; second, existing enforcement mechanisms can be used strategically to achieve greater equality and effective access to equal employment opportunities; third, trade unions should be involved in steering hiring processes and could use their information and consultation rights to assist jobseekers.
abstract + slide
AI, hiring, right to work, discrimination, bias, GDPR, data protection
English
2024 Future of Work Conference
2024
2024
reserved
Files in this record:
File: Bordoni-2024-2024 Future of Work Conference.pdf
Description: Conference presentation
Attachment type: Other attachments
Licence: All rights reserved
Size: 734.67 kB
Format: Adobe PDF
Access: archive managers only (copy available on request)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/462640
Citations
  • Scopus: ND
  • Web of Science: ND