Publication Details
Analyzing Comprehension of Grouped Bar and Line Charts Through Temporal and Physiological Sensing
Master's Thesis
Status | open
Student | N/A
Advisor | Kathrin Schnizer
Professor | Prof. Dr. Sven Mayer
Task
Description
In the field of visualization comprehension, existing work focuses on evaluating viewers' cued and uncued responses to data visualizations [1-6]. These studies explored the effects of visualization type [6-8], task type [7,8], and the amount of data shown in the visualization [9] on users' performance. However, they varied the visualizations to such an extent that the results cannot be reliably compared to isolate the precise factors responsible for changes in performance. Furthermore, existing research largely relies on response correctness to measure performance, which is prone to guessing and offers little insight into users' cognitive processes while they examine the visualizations.

Our research aims to systematically and quantitatively evaluate viewers' responses to controlled variations of data visualizations and to identify the key factors that influence comprehension. These insights will enable predictions about users' understanding of data visualizations and help optimize visualization design for better comprehension.

This thesis project explores and quantifies human comprehension of grouped bar charts and grouped line charts by leveraging temporal measures and physiological sensing. Specifically, the study measures presentation times, EEG signals, and gaze data for controlled variations of these visualization types to identify indicators linked to a viewer's comprehension performance. The research is structured around three stages: (1) expanding an existing automated data and chart generation system to include grouped bar and line charts, (2) designing and implementing an experiment to collect EEG, gaze, and presentation-time data, and (3) conducting a user study and analyzing the results.

The insights gained from this research could improve our understanding of how individuals process different variations of visualizations and contribute to the development of a prediction model for measuring visualization comprehension. Ultimately, the findings may guide the creation of more effective and user-friendly data visualizations.
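As an illustration of stage (1), the minimal sketch below generates a grouped bar chart and a grouped line chart from the same synthetic dataset with matplotlib. It is a hypothetical example only; the existing generation system, its parameters (number of groups and series, value ranges), and its output format are assumptions, not part of this description.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stimulus parameters; the real generation system may expose different controls.
rng = np.random.default_rng(seed=42)
n_groups, n_series = 4, 3                       # e.g. 4 categories, 3 data series
data = rng.uniform(10, 100, size=(n_series, n_groups))
group_labels = [f"Group {i + 1}" for i in range(n_groups)]
series_labels = [f"Series {chr(65 + i)}" for i in range(n_series)]

fig, (ax_bar, ax_line) = plt.subplots(1, 2, figsize=(10, 4))

# Grouped bar chart: offset each series' bars within a group.
bar_width = 0.8 / n_series
x = np.arange(n_groups)
for i in range(n_series):
    ax_bar.bar(x + i * bar_width, data[i], width=bar_width, label=series_labels[i])
ax_bar.set_xticks(x + bar_width * (n_series - 1) / 2)
ax_bar.set_xticklabels(group_labels)
ax_bar.set_title("Grouped bar chart")
ax_bar.legend()

# Grouped line chart: one line per series over the same categories.
for i in range(n_series):
    ax_line.plot(x, data[i], marker="o", label=series_labels[i])
ax_line.set_xticks(x)
ax_line.set_xticklabels(group_labels)
ax_line.set_title("Grouped line chart")
ax_line.legend()

fig.tight_layout()
fig.savefig("stimulus_example.png", dpi=150)    # stimuli could be exported as image files
```

Generating both chart types from the identical underlying data, as in this sketch, is what makes controlled comparisons between visualization types possible.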
You will
- Perform a literature review
- Expand an existing automated data and chart generation system to include grouped bar and line charts
- Design and implement the experiment, including the physiological data collection
- Conduct the described study
- Extract and analyze relevant data from brain activity (EEG), gaze, and presentation times using statistical methods (see the analysis sketch after this list)
- Summarize your findings in a thesis and present them to an audience
- (Optional) Co-write a research paper
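To illustrate the kind of temporal analysis referenced above, the sketch below compares per-participant presentation times between the two chart types with a paired Wilcoxon signed-rank test. The CSV layout, column names, and choice of test are assumptions for illustration only; the actual analysis will depend on the collected data and the final study design.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical per-trial log; column names are assumptions, not the study's real schema.
# Expected columns: participant, chart_type ("bar" | "line"), presentation_time_s, correct
trials = pd.read_csv("trials.csv")

# Aggregate to one mean presentation time per participant and chart type.
per_participant = (
    trials.groupby(["participant", "chart_type"])["presentation_time_s"]
    .mean()
    .unstack("chart_type")
    .dropna()
)

# Paired, non-parametric comparison of bar vs. line presentation times.
w_stat, p_value = stats.wilcoxon(per_participant["bar"], per_participant["line"])
print(f"Wilcoxon W = {w_stat:.2f}, p = {p_value:.4f}")

# Effect direction: median per-participant difference (bar minus line).
print("Median difference (s):", np.median(per_participant["bar"] - per_participant["line"]))
```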
You need
- Strong communication skills in English
- Good knowledge of Python for statistical analysis
References
- [1] S. Lee, S.-H. Kim, and B. C. Kwon, "VLAT: Development of a Visualization Literacy Assessment Test," IEEE Trans. Vis. Comput. Graph., vol. 23, no. 1, pp. 551–560, Jan. 2017, doi: 10.1109/TVCG.2016.2598920.
- [2] S. Pandey and A. Ottley, "Mini-VLAT: A Short and Effective Measure of Visualization Literacy," Comput. Graph. Forum, vol. 42, no. 3, pp. 1–11, 2023, doi: 10.1111/cgf.14809.
- [3] L. W. Ge, Y. Cui, and M. Kay, "CALVI: Critical Thinking Assessment for Literacy in Visualizations," in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI '23), New York, NY, USA: Association for Computing Machinery, Apr. 2023, pp. 1–18, doi: 10.1145/3544548.3581406.
- [4] Y. Cui, L. W. Ge, Y. Ding, F. Yang, L. Harrison, and M. Kay, "Adaptive Assessment of Visualization Literacy," Aug. 27, 2023, arXiv: arXiv:2308.14147. Accessed: Aug. 27, 2024. [Online]. Available: http://arxiv.org/abs/2308.14147
- [5] J. Boy, R. A. Rensink, E. Bertini, and J.-D. Fekete, "A Principled Way of Assessing Visualization Literacy," IEEE Trans. Vis. Comput. Graph., vol. 20, no. 12, pp. 1963–1972, Dec. 2014, doi: 10.1109/TVCG.2014.2346984.
- [6] G. J. Quadri, A. Z. Wang, Z. Wang, J. Adorno, P. Rosen, and D. A. Szafir, "Do You See What I See? A Qualitative Study Eliciting High-Level Visualization Comprehension," in Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '24), New York, NY, USA: Association for Computing Machinery, May 2024, pp. 1–26, doi: 10.1145/3613904.3642813.
- [7] S. Lee, B. C. Kwon, J. Yang, B. C. Lee, and S.-H. Kim, "The Correlation between Users' Cognitive Characteristics and Visualization Literacy," Appl. Sci., vol. 9, no. 3, Art. no. 3, Jan. 2019, doi: 10.3390/app9030488.
- [8] C. Nobre, K. Zhu, E. Mörth, H. Pfister, and J. Beyer, "Reading Between the Pixels: Investigating the Barriers to Visualization Literacy," in Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '24), New York, NY, USA: Association for Computing Machinery, May 2024, pp. 1–17, doi: 10.1145/3613904.3642760.
- [9] J. Talbot, V. Setlur, and A. Anand, "Four Experiments on the Perception of Bar Charts," IEEE Trans. Vis. Comput. Graph., vol. 20, no. 12, pp. 2152–2160, Dec. 2014, doi: 10.1109/TVCG.2014.2346320.