• eyetracking@ethz.ch
  • +41 44 633 71 59

Research

We investigate visual attention during interaction with geographic information and in spatial decision-making situations. We use eye tracking technology to understand users, predict their behavior, and assist them in their spatial activities. We combine competencies and methods from Geographic Information Science, Computer Science, and Human-Computer Interaction. We currently focus on the following three application areas:

Gaze-Informed LBS

Gaze-Informed Location-Based Services (GAIN-LBS) are LBS that consider the user's gaze as one type of context information (a minimal code sketch of this idea follows the three application areas below).

Gaze-Based Geographic HCI

Gaze-based interaction with digital maps or other types of geographic information visualizations.

Spatial Awareness in Aviation

Improving a pilot’s spatial awareness by enhancing flight operations and pilot training with gaze-based interactions.
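
A rough illustration of the GAIN-LBS idea, for readers who like to see it in code: the Python sketch below ranks points of interest by how closely they lie along the user's gaze direction, treating gaze as one more piece of context alongside position. All names, coordinates, and the ranking heuristic are hypothetical choices for illustration, not the lab's implementation.

import math
from dataclasses import dataclass

@dataclass
class POI:
    name: str
    lat: float
    lon: float

def bearing_deg(lat1, lon1, lat2, lon2):
    # Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def rank_by_gaze(pois, user_lat, user_lon, gaze_azimuth_deg):
    # Sort POIs so that those closest to the gaze direction come first.
    def angular_offset(poi):
        diff = abs(bearing_deg(user_lat, user_lon, poi.lat, poi.lon) - gaze_azimuth_deg)
        return min(diff, 360.0 - diff)  # wrap around the compass
    return sorted(pois, key=angular_offset)

if __name__ == "__main__":
    pois = [POI("Main station", 47.3779, 8.5403), POI("ETH main building", 47.3763, 8.5477)]
    # A user standing between the two POIs and looking roughly east (azimuth ~90 deg)
    # should see the ETH main building ranked first.
    for poi in rank_by_gaze(pois, 47.3769, 8.5440, 90.0):
        print(poi.name)

In a real gaze-informed service, the gaze signal would come from an eye tracker and be combined with further context such as map scale, time, and user task; the point here is only that gaze slots into the same ranking pipeline as any other context variable.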

News

Winter School Updates

Keynote and travel grants


Tianyi Xiao joins the team

3D Sketch Maps project


Winter School 2023

Registration is now open!


Participation in COSIT 2022

We are excited to be part of the 15th International Conference on Spatial Information Theory (COSIT 2022), taking place on September 5-9, 2022, in Kobe, Japan. Two of our lab members, Kevin Kim and Adrian Sarbach, will attend the conference (in person!) and present our latest work. We are looking forward to meeting …


FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight

Our article “FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight” has been accepted for publication in the International Journal of Human–Computer Interaction (IJHCI).


Co-authored paper on PEGGASUS at SPIE Photonics West

A joint paper with our partners at CSEM on the eye-tracking hardware and software developed in PEGGASUS was published in the SPIE Digital Library. We encourage you to take a deep dive into the technological achievements of PEGGASUS: “The pipeline, which is a combination of data-driven and analytics approaches, runs in real time at …”
