Guideline-based evaluation of web readability

Simone Sulpizio (2019)

Abstract

Effortless reading remains an issue for many Web users, despite the large number of readability guidelines available to designers. This paper presents a study of the manual and automatic application of 39 readability guidelines in webpage evaluation. The study collected ground-truth readability for a set of 50 webpages using eye-tracking with average and dyslexic readers (n = 79), and then matched the ground truth against human-based (n = 35) and automatic evaluations. The results validated 22 guidelines as connected to readability. The comparison between human-based and automatic results also revealed a complex picture: algorithms were as good as or better than human experts at evaluating webpages on specific guidelines, particularly those concerning low-level features of webpage legibility and text formatting. However, multiple guidelines still required human judgment related to understanding and interpreting webpage content. These results contribute a guideline categorization that lays the groundwork for future design evaluation methods.
Type: Conference paper
Keywords: Accessibility; Design guidelines; User experience; WCAG 2.1; Web design
Language: English
Event: 2019 CHI Conference on Human Factors in Computing Systems (CHI 2019), May 4-9, 2019
Published in: Conference on Human Factors in Computing Systems - Proceedings
ISBN: 9781450359702
Year: 2019
Rights: All rights reserved
Citation: Miniukovich, A., Scaltritti, M., Sulpizio, S., De Angeli, A. (2019). Guideline-based evaluation of web readability. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery [10.1145/3290605.3300738].
Files in this record:
File: MiniukovichEtAl_CHI2019.pdf
Attachment type: Author's Accepted Manuscript, AAM (Post-print)
Format: Adobe PDF
Size: 881.77 kB
Access: Restricted to archive managers; a copy can be requested

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/299676
Citations
  • Scopus: 30
  • Web of Science (ISI): 16