Creating an EEG and Eye Tracking Dataset for Viewing Different Charts Depicting Univariate Categorical Data
bachelor thesis
Status | in progress |
Student | Andra Lazar |
Advisor | Kathrin Schnizer |
Professor | Prof. Dr. Albrecht Schmidt |
Task
Description
Understanding how people interpret different chart types is crucial for designing effective data visualizations. While previous research has explored factors such as visualization type [6-8], task type [7,8], and data complexity [9], many studies varied visualization conditions in ways that make it difficult to pinpoint the precise factors influencing comprehension. Additionally, existing approaches often rely on correctness as the primary performance measure, which does not capture the cognitive processes involved in interpreting a visualization. This thesis aims to systematically investigate how users comprehend univariate categorical charts, including pie charts, bar charts, and stacked bar charts, by leveraging EEG and eye-tracking data. Specifically, the study will track gaze patterns, completion times, and brain activity to identify cognitive indicators linked to comprehension performance.
Research Phases
The research consists of three main phases:
- Extending the chart generation system: Enhance an existing automated data and chart generation system to support pie charts, bar charts, and stacked bar charts.
- Developing an interactive display system: Create a system that presents generated charts alongside a series of questions, records user responses via free-text entry, and captures confidence ratings through a slider.
- Conducting a user study and data analysis: Collect EEG, gaze, and response data, then analyze it to uncover differences in user performance and physiological responses across chart types.
By analyzing stimulus-response relationships and physiological signals, this research aims to enhance our understanding of how different visualization types influence user cognition and performance, contributing to improved visualization design.
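As a rough sketch of the chart generation phase, the three chart types could be produced with matplotlib from a simple category-to-count mapping (the data, function name, and file names here are illustrative assumptions, not part of the existing system):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, suitable for scripted batch generation
import matplotlib.pyplot as plt

# Hypothetical univariate categorical sample: category -> count
data = {"A": 30, "B": 45, "C": 25}

def save_charts(data, prefix="chart"):
    labels, values = list(data.keys()), list(data.values())

    # Pie chart
    fig, ax = plt.subplots()
    ax.pie(values, labels=labels, autopct="%1.0f%%")
    fig.savefig(f"{prefix}_pie.png")
    plt.close(fig)

    # Bar chart
    fig, ax = plt.subplots()
    ax.bar(labels, values)
    ax.set_ylabel("Count")
    fig.savefig(f"{prefix}_bar.png")
    plt.close(fig)

    # Stacked bar chart: a single bar with the categories stacked
    fig, ax = plt.subplots()
    bottom = 0
    for label, value in zip(labels, values):
        ax.bar(["Total"], [value], bottom=bottom, label=label)
        bottom += value
    ax.set_ylabel("Count")
    ax.legend()
    fig.savefig(f"{prefix}_stacked.png")
    plt.close(fig)

save_charts(data)
```

In the actual system, the same underlying data would be rendered as all three chart types so that differences in comprehension can be attributed to the visualization rather than the data.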
You Will
- Perform a literature review on visualization comprehension and physiological data collection.
- Extend an existing automated data and chart generation system to support pie charts, bar charts, and stacked bar charts.
- Develop an interactive display system that:
- Presents a series of questions, each followed by a generated chart.
- Records user input via free-text entry and confidence ratings through a slider.
- Tracks completion time, EEG data, and eye-gaze patterns during interactions.
- Design and conduct a user study to collect EEG, gaze, and response data.
- Transform collected data into a structured dataset containing stimulus-response relationships, presentation times, and physiological signals.
- Analyze the dataset to identify differences in user performance and physiological responses across different chart types.
- Summarize your findings in a thesis and present them to an audience.
- (Optional) Contribute to co-writing a research paper.
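To give a flavor of the analysis step, a minimal sketch of comparing completion times across the three chart types is shown below (the numbers are made up for illustration; the Kruskal-Wallis test from scipy is one possible non-parametric choice, not a prescribed method):

```python
from scipy import stats

# Hypothetical completion times (seconds) per chart type;
# in the study these would come from the recorded interaction logs.
times = {
    "pie":     [12.1, 14.3, 11.8, 15.0, 13.2],
    "bar":     [ 9.7, 10.4,  8.9, 11.1, 10.0],
    "stacked": [16.4, 18.2, 15.9, 17.5, 16.8],
}

# Kruskal-Wallis H-test: checks for differences in completion time
# across the three chart types without assuming normality.
h_stat, p_value = stats.kruskal(*times.values())
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```

Analogous comparisons could be run on accuracy, confidence ratings, fixation measures, and EEG-derived features.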
You Need
- Strong communication skills in English.
- Good knowledge of Python, both for implementing the chart generation and for the statistical analysis.
References
- [1] S. Lee, S.-H. Kim, and B. C. Kwon, "VLAT: Development of a Visualization Literacy Assessment Test," IEEE Trans. Vis. Comput. Graph., vol. 23, no. 1, pp. 551–560, Jan. 2017, doi: 10.1109/TVCG.2016.2598920.
- [2] S. Pandey and A. Ottley, "Mini-VLAT: A Short and Effective Measure of Visualization Literacy," Comput. Graph. Forum, vol. 42, no. 3, pp. 1–11, 2023, doi: 10.1111/cgf.14809.
- [3] L. W. Ge, Y. Cui, and M. Kay, "CALVI: Critical Thinking Assessment for Literacy in Visualizations," in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, in CHI '23. New York, NY, USA: Association for Computing Machinery, Apr. 2023, pp. 1–18. doi: 10.1145/3544548.3581406.
- [4] Y. Cui, L. W. Ge, Y. Ding, F. Yang, L. Harrison, and M. Kay, "Adaptive Assessment of Visualization Literacy," Aug. 27, 2023, arXiv: arXiv:2308.14147. Accessed: Aug. 27, 2024. [Online]. Available: http://arxiv.org/abs/2308.14147
- [5] J. Boy, R. A. Rensink, E. Bertini, and J.-D. Fekete, "A Principled Way of Assessing Visualization Literacy," IEEE Trans. Vis. Comput. Graph., vol. 20, no. 12, pp. 1963–1972, Dec. 2014, doi: 10.1109/TVCG.2014.2346984.
- [6] G. J. Quadri, A. Z. Wang, Z. Wang, J. Adorno, P. Rosen, and D. A. Szafir, "Do You See What I See? A Qualitative Study Eliciting High-Level Visualization Comprehension," in Proceedings of the CHI Conference on Human Factors in Computing Systems, in CHI '24. New York, NY, USA: Association for Computing Machinery, May 2024, pp. 1–26. doi: 10.1145/3613904.3642813.
- [7] S. Lee, B. C. Kwon, J. Yang, B. C. Lee, and S.-H. Kim, "The Correlation between Users' Cognitive Characteristics and Visualization Literacy," Appl. Sci., vol. 9, no. 3, Art. no. 3, Jan. 2019, doi: 10.3390/app9030488.
- [8] C. Nobre, K. Zhu, E. Mörth, H. Pfister, and J. Beyer, "Reading Between the Pixels: Investigating the Barriers to Visualization Literacy," in Proceedings of the CHI Conference on Human Factors in Computing Systems, in CHI '24. New York, NY, USA: Association for Computing Machinery, May 2024, pp. 1–17. doi: 10.1145/3613904.3642760.
- [9] J. Talbot, V. Setlur, and A. Anand, "Four Experiments on the Perception of Bar Charts," IEEE Trans. Vis. Comput. Graph., vol. 20, no. 12, pp. 2152–2160, Dec. 2014, doi: 10.1109/TVCG.2014.2346320.