Cremaschi, M., Barbato, J., Rula, A., Palmonari, M., Actis-Grosso, R. (2022). What Really Matters in a Table? Insights from a User Study. In Proceedings - 2022 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology, WI-IAT 2022 (pp.263-269). Institute of Electrical and Electronics Engineers Inc. [10.1109/WI-IAT55865.2022.00045].
What Really Matters in a Table? Insights from a User Study
Cremaschi, M.; Barbato, J.; Rula, A.; Palmonari, M.; Actis-Grosso, R.
2022
Abstract
A better understanding of human visual attention during reading can provide valuable insights for developing user-centred computational models. A considerable amount of data, presented in tabular form, is used in daily activities and is available on the Web nowadays. Several approaches have proposed automated table summarisation methods to improve the user experience and give users succinct summaries of tables. However, little attention has been paid to user behaviour in the design of automated table summarisation. In this paper, we present the findings of an empirical study in which we investigate, with the help of standard User Experience tools (eye-tracking technology and surveys), how users approach the reading of a table. We focus on evaluating how the domain knowledge and interest of the users influence their comprehension, eventually identifying four possible user profiles and their different information needs. To show the impact of our findings on the selection of the information to keep in a summary, we present and release a tool that, in addition to supporting the development of similar experiments, allows checking the information presented in a summary in the form of Resource Description Framework (RDF) triples, by exploiting the semantic annotation of the table.