Vicente, L., Matute, H., Fregosi, C., & Cabitza, F. (2025). Machine learning systems as mentors in human learning: A user study on machine bias transmission in medical training. International Journal of Human-Computer Studies, 198, 103474. https://doi.org/10.1016/j.ijhcs.2025.103474
Machine learning systems as mentors in human learning: A user study on machine bias transmission in medical training
Fregosi, C.; Cabitza, F.
2025
Abstract
While accurate AI systems can enhance human performance, exerting both an augmentation and a good mentoring effect, imperfect systems may act as poor mentors, transmitting biases and systematic errors to users. However, there is still limited research on the potential for AI to transmit biases to humans, an effect that could be even more pronounced for less experienced users, such as novices or trainees, making decisions supported by AI-based systems. To investigate the bias transmission effect and the potential of AI to serve as a mentor, we involved eighty-six medical students, dividing them into an AI-assisted group and a control group. We tasked them with classifying simulated tissue samples for a fictitious disease. In the first phase of the task, the AI group received diagnostic advice from a simulated AI system that made systematic errors for a specific type of case while being accurate for all other types. The control group did not receive any assistance. In the second phase, participants in both groups classified new tissue samples, including ambiguous cases, without any support, to test the residual impact of AI bias. The results showed that the AI-assisted group exhibited a higher error rate when classifying cases for which the AI had provided systematically erroneous advice, both in the AI-assisted phase and in the subsequent unassisted phase, suggesting the persistence of AI-induced bias. Our study emphasizes the need for careful implementation and continuous evaluation of AI systems in education and training to mitigate potential negative impacts on trainee learning outcomes.

| File | Size | Format |
|---|---|---|
| Vicente et al-2025-International Journal of Human-Computer Studies-VoR.pdf (open access; attachment type: Author's Accepted Manuscript, AAM (Post-print); license: Creative Commons) | 1.44 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


