Department for Informatics

Investigating Gaze Estimation Accuracy in Collaborative Virtual Environments (CVEs)

BT/MT (Bachelor's or Master's Thesis)

Status: open
Student: N/A
Advisor: Jesse Grootjen, Prof. Dr. Sven Mayer
Professor: Prof. Dr. Sven Mayer

Task

Description

Project Overview

This thesis project offers an opportunity for students to contribute to current research on gaze estimation in interactive systems. The focus is on improving the accuracy of gaze interpretation within Collaborative Virtual Environments (CVEs), where effective communication often depends on understanding where participants are looking. Gaze serves as a vital non-verbal communication cue, yet people frequently struggle to accurately determine another person's gaze direction (i.e., where someone is looking), especially over distances.

Project Motivation

In CVEs, precise gaze estimation is crucial for natural and effective interaction. While previous research has explored distant pointing as an interaction mechanism, this project shifts the focus to gaze estimation. By addressing common inaccuracies in gaze prediction, this research aims to significantly improve how users interpret each other's gaze during virtual interactions, ultimately enhancing the overall immersive experience.

Project Goals

This thesis will investigate how accurately gaze estimation can be performed in CVEs, focusing on two main aspects:
  1. Gaze Estimation Experiments: Participants will perform gaze tasks directed at targets on a screen from two different distances. The data collected will help evaluate the performance of current gaze estimation methods in these scenarios.
  2. Model Development: Using the insights from distant pointing research, the project aims to develop a mathematical model to correct (potential) systematic displacements in gaze estimation.
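As a rough illustration of what such a correction model could look like (this is a hypothetical sketch, not the method the thesis prescribes; the function names, polynomial degree, and synthetic data are all assumptions), one could fit per-axis polynomials mapping estimated gaze coordinates to the true target positions, analogous to the offset-correction idea in the distant-pointing literature referenced below:

```python
import numpy as np

def fit_offset_model(estimated, actual, degree=2):
    """Fit one polynomial per screen axis mapping estimated gaze
    coordinates to the known target coordinates (illustrative)."""
    coeffs_x = np.polyfit(estimated[:, 0], actual[:, 0], degree)
    coeffs_y = np.polyfit(estimated[:, 1], actual[:, 1], degree)
    return coeffs_x, coeffs_y

def correct(estimated, coeffs_x, coeffs_y):
    """Apply the fitted per-axis polynomials to new gaze estimates."""
    x = np.polyval(coeffs_x, estimated[:, 0])
    y = np.polyval(coeffs_y, estimated[:, 1])
    return np.column_stack([x, y])

# Synthetic demo: gaze estimates with an assumed systematic displacement
rng = np.random.default_rng(0)
targets = rng.uniform(-1, 1, size=(200, 2))  # true on-screen target positions
displaced = targets * 0.9 + 0.05 + rng.normal(0, 0.01, targets.shape)

cx, cy = fit_offset_model(displaced, targets)
corrected = correct(displaced, cx, cy)

err_before = np.linalg.norm(displaced - targets, axis=1).mean()
err_after = np.linalg.norm(corrected - targets, axis=1).mean()
```

In practice the model fitted in the thesis may well need to account for viewing distance and head pose rather than a simple per-axis polynomial; the sketch only shows the general fit-then-correct structure.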

You will

  • Perform a literature review
  • Modify an existing VR environment
  • Implement a preprocessing pipeline for eye-tracking data
  • Collect and analyze eye-tracking data, focusing on developing a model to correct potential systematic displacement in gaze estimation
  • Summarize your findings in a thesis and present them to an audience
  • (Optional) Co-write a research paper
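A preprocessing pipeline for eye-tracking data typically handles invalid samples (e.g., blinks) and noise before any analysis. A minimal sketch of such a step is given below; the function name, the interpolation of gaps, and the moving-median smoothing are illustrative assumptions, not the pipeline the project mandates:

```python
import numpy as np

def preprocess_gaze(t, x, y, validity):
    """Minimal gaze preprocessing sketch: mask invalid samples,
    linearly interpolate the resulting gaps, and smooth each axis
    with a 5-sample moving median to suppress spikes."""
    x = np.where(validity, x, np.nan).astype(float)
    y = np.where(validity, y, np.nan).astype(float)
    for arr in (x, y):
        nans = np.isnan(arr)
        if nans.any() and (~nans).any():
            arr[nans] = np.interp(t[nans], t[~nans], arr[~nans])
    pad = 2  # half-width of the 5-sample median window
    xs = np.array([np.median(x[max(0, i - pad):i + pad + 1]) for i in range(len(x))])
    ys = np.array([np.median(y[max(0, i - pad):i + pad + 1]) for i in range(len(y))])
    return xs, ys

# Demo on synthetic data with a short invalid (blink-like) segment
t = np.linspace(0.0, 1.0, 100)
x = np.sin(2 * np.pi * t)
y = np.cos(2 * np.pi * t)
validity = np.ones(100, dtype=bool)
validity[40:45] = False

xs, ys = preprocess_gaze(t, x, y, validity)
```

A real pipeline for this project would likely add fixation detection and mapping from the headset's gaze rays to screen coordinates, which depend on the specific eye tracker and VR setup used.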

You need

  • Strong communication skills in English
  • Good knowledge of Unity

References

  • [1] Schweigert, R., Schwind, V., & Mayer, S. (2019). EyePointing: A gaze-based selection technique. In Proceedings of Mensch und Computer 2019. ACM. https://doi.org/10.1145/3340764.3344897
  • [2] Mayer, S., Schwind, V., Schweigert, R., & Henze, N. (2018). The effect of offset correction and cursor on mid-air pointing in real and virtual environments. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 653:1–653:13). ACM. https://doi.org/10.1145/3173574.3174227
  • [3] Mayer, S., Wolf, K., Schneegass, S., & Henze, N. (2015). Modeling distant pointing for compensating systematic displacements. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (pp. 4165–4168). ACM. https://doi.org/10.1145/2702123.2702332

Keywords

VR, Gaze, OptiTrack
Last modified on 2020-04-11 by Changkun Ou (rev 35667)