• eyetracking@ethz.ch
  • +41 44 633 71 59

News

  • -

Book Chapter on Outdoor HCI accepted

Kiefer, P., Adams, B., Kwok, T., Raubal, M. (2020) Modeling Gaze-Guided Narratives for Outdoor Tourism. In: McCrickard, S., Jones, M., and Stelter, T. (eds.): HCI Outdoors: Theory, Design, Methods and Applications. Springer International Publishing (in press)


  • -

ET4S and ETVIS proceedings

We’ve been involved in the organization of two co-located events at this year’s ETRA conference: Eye Tracking for Spatial Research (ET4S) and Eye Tracking and Visualization (ETVIS). Even though ETRA and all co-located events had to be canceled, the review process was completed as planned, and accepted papers are now available in the ETRA Adjunct Proceedings in the ACM Digital Library.

Accepted ET4S papers are also linked from the ET4S website.


  • -

New article on iASSYST

The instructor assistant system (iASSYST) that we developed as part of our research collaboration with Swiss International Air Lines is featured in an article by innoFRAtor, the innovation portal of Fraport AG.

You may read more about the system in our related research article: Rudi, D., Kiefer, P., and Raubal, M. (2020). The Instructor Assistant System (iASSYST) – Utilizing Eye Tracking for Commercial Aviation Training Purposes. Ergonomics, 63(1), 61–79. London: Taylor & Francis. DOI: https://doi.org/10.1080/00140139.2019.1685132

Our project “Enhanced flight training program for monitoring aircraft automation” with Swiss International Air Lines, NASA, and the University of Oregon was officially concluded at the end of last year.


  • -

Workshop Paper published from ETAVI 2020

Our paper Towards Pilot-Aware Cockpits has been published in the proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI 2020):

Lutnyk L., Rudi D., and Raubal M. (2020). Towards Pilot-Aware Cockpits. In Proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI 2020), ETH Zurich. DOI: https://doi.org/10.3929/ethz-b-000407661

Abstract. Eye tracking has a longstanding history in aviation research. Among other purposes, it has been employed to bring pilots back “in the loop”, i.e., to create better awareness of the flight situation. Interestingly, little research in this context has evaluated the application of machine learning algorithms to model pilots’ understanding of the aircraft’s state and their situation awareness. Machine learning models could be trained to differentiate between normal and abnormal patterns with regard to pilots’ eye movements, control inputs, and data from other psychophysiological sensors, such as heart rate or blood pressure. Moreover, when the system recognizes an abnormal pattern, it could provide situation-specific assistance to bring pilots back into the loop. This paper discusses when pilots benefit from such a pilot-aware system, and explores the technical and user-oriented requirements for implementing it.
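To make the idea concrete, here is a minimal sketch (our illustration, not part of the paper) of how abnormal patterns could be flagged. The feature names (fixation_rate, heart_rate), the sample values, and the simple z-score baseline are illustrative assumptions standing in for a trained machine learning model:

```python
from statistics import mean, stdev

def fit_baseline(samples):
    """Compute per-feature mean and standard deviation from normal-flight samples.
    Each sample is a dict mapping a feature name (e.g. fixation rate, heart rate)
    to a measured value."""
    keys = samples[0].keys()
    return {k: (mean(s[k] for s in samples), stdev(s[k] for s in samples)) for k in keys}

def is_abnormal(baseline, sample, threshold=3.0):
    """Flag a sample as abnormal if any feature deviates from the baseline mean
    by more than `threshold` standard deviations."""
    for k, (mu, sigma) in baseline.items():
        if sigma > 0 and abs(sample[k] - mu) / sigma > threshold:
            return True
    return False

# Hypothetical measurements recorded during normal flight phases
normal = [{"fixation_rate": 3.0 + 0.1 * i, "heart_rate": 70 + i} for i in range(10)]
baseline = fit_baseline(normal)

print(is_abnormal(baseline, {"fixation_rate": 3.4, "heart_rate": 75}))   # within baseline
print(is_abnormal(baseline, {"fixation_rate": 0.5, "heart_rate": 130}))  # strong deviation
```

A real pilot-aware system would replace the threshold test with a classifier trained on labeled flight data, but the input/output structure would be similar.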

Edit. The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

GeoGazeLab involved in Future Resilient Systems II programme

The second phase of the FRS programme at the Singapore-ETH Centre officially started on April 1st with an online research kick-off meeting. It was launched in the midst of a global crisis, COVID-19, which highlights the need to better understand and foster resilience. Within FRS-II, there is a particular emphasis on social resilience, to enhance the understanding of how socio-technical systems perform before, during, and after disruptions.

GeoGazeLab researchers will contribute within a research cluster focusing on distributed cognition (led by Martin Raubal). More specifically, we will develop a visualization, interaction, and notification framework for communicating predicted disruptions to stakeholders. Empirical studies utilizing eye tracking and gaze-based interaction methods will be part of this project, which is led by Martin Raubal and Peter Kiefer.


  • -

Full Paper accepted at ETRA 2020

Our paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” has been accepted at ACM ETRA 2020 as a full paper:

Göbel, F., Kurzhals K., Schinazi V. R., Kiefer, P., and Raubal, M. (2020). Gaze-Adaptive Lenses for Feature-Rich Information Spaces. In Proceedings of the 12th ACM Symposium on Eye Tracking Research & Applications (ETRA ’20), ACM. DOI: https://doi.org/10.1145/3379155.3391323


  • -

Workshop Paper accepted at CHI 2020

Our workshop contribution “Gaze-Aware Mixed-Reality: Addressing Privacy Issues with Eye Tracking” has been accepted at the “Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI” at ACM CHI 2020:

Fabian Göbel, Kuno Kurzhals, Martin Raubal, and Victor R. Schinazi (2020). Gaze-Aware Mixed-Reality: Addressing Privacy Issues with Eye Tracking. In CHI 2020 Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI (CHI 2020), ACM.


  • -

ETAVI 2020 Proceedings Online

The proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI) 2020 have been published here.

We thank all program committee members for their efforts and great support; it is very much appreciated.

Furthermore, we thank all authors of the 15 excellent articles that were accepted for ETAVI 2020. We regret that the event had to be cancelled due to COVID-19.

Edit. Some of the organizers from ETH are part of the PEGGASUS project. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

Full Paper accepted at CHI 2020

Our paper “A View on the Viewer: Gaze-Adaptive Captions for Videos” has been accepted at ACM CHI 2020 as a full paper:

Kurzhals K., Göbel F., Angerbauer K., Sedlmair M., Raubal M. (2020) A View on the Viewer: Gaze-Adaptive Captions for Videos. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2020), ACM (accepted)

 

Abstract. Subtitles play a crucial role in the cross-lingual distribution of multimedia content and help communicate information where auditory content is not feasible (loud environments, hearing impairments, unknown languages). Established methods utilize text at the bottom of the screen, which may distract from the video. Alternative techniques place captions closer to related content (e.g., faces) but are not applicable to arbitrary videos such as documentaries. Hence, we propose to leverage live gaze as an indirect input method to adapt captions to individual viewing behavior. We implemented two gaze-adaptive methods and compared them in a user study (n=54) to traditional captions and audio-only videos. The results show that viewers with less experience with captions prefer our gaze-adaptive methods as they assist them in reading. Furthermore, gaze distributions resulting from our methods are closer to natural viewing behavior compared to the traditional approach. Based on these results, we provide design implications for gaze-adaptive captions.
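As an illustration of the general principle (a sketch under our own assumptions, not the authors’ implementation), a gaze-adaptive caption can follow the viewer’s gaze while using a distance threshold so that small reading movements do not cause jitter; the function name and the pixel threshold below are hypothetical:

```python
def update_caption_position(caption_pos, gaze_pos, min_move=150.0):
    """Move the caption to the viewer's gaze only when the gaze has drifted
    more than `min_move` pixels away, so the caption stays stable while the
    viewer is reading it."""
    dx = gaze_pos[0] - caption_pos[0]
    dy = gaze_pos[1] - caption_pos[1]
    if (dx * dx + dy * dy) ** 0.5 > min_move:
        return gaze_pos   # large shift: snap caption to the new region of interest
    return caption_pos    # small shift (e.g. reading): leave the caption in place

pos = (960, 1000)                                  # traditional bottom-of-screen placement
pos = update_caption_position(pos, (965, 990))     # tiny gaze shift: caption stays put
pos = update_caption_position(pos, (300, 200))     # large gaze shift: caption follows
print(pos)
```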


  • -

ET4S 2020 – 1st Call for Papers

The 5th edition of ET4S is taking place as a co-located event of ETRA 2020 in Stuttgart, Germany, between 2 and 5 June 2020.

The 1st Call for Papers is now available online.


  • -

Final project event – Enhanced flight training program for monitoring aircraft automation

On the 25th of November, we presented the final results of our project “Enhanced flight training program for monitoring aircraft automation”.

The project was partially funded by the Swiss Federal Office of Civil Aviation (BAZL), was led by Swiss International Air Lines Ltd., and was supported by Prof. Dr. Robert Mauro from the Department of Psychology (University of Oregon) and Dr. Immanuel Barshi from the Human Systems Integration Division at NASA Ames Research Center.

As part of this very successful project, we developed a system that provides instructors with more detailed insights into pilots’ attention during training flights, specifically to improve instructors’ assessment of pilot situation awareness.

Our project has recently been featured in various media. ETH News published a thorough article on our project, “Tracking the eye of the pilot”, which several other articles have referenced. Additionally, Prof. Dr. Martin Raubal gave two radio interviews, at Radio Zürisee and SRF4 (both in German).

Moreover, several publications appeared during the course of the project:

Rudi, D., Kiefer, P. & Raubal, M. (2019). The Instructor Assistant System (iASSYST) – Utilizing Eye Tracking for Aviation Training Purposes. Ergonomics. https://doi.org/10.1080/00140139.2019.1685132.

Rudi, D., Kiefer, P., Giannopoulos, I., & Raubal, M. (2019). Gaze-based interactions in the cockpit of the future – a survey. Journal on Multimodal User Interfaces. Springer. Retrieved from https://link.springer.com/article/10.1007/s12193-019-00309-8.

Rudi D., Kiefer P. & Raubal, M. (2018). Visualizing Pilot Eye Movements for Flight Instructors. In ETVIS ’18: 3rd Workshop on Eye Tracking and Visualization. ACM, New York, NY, USA, Article 7, 5 pages. http://doi.acm.org/10.1145/3205929.3205934. Best Paper Award Winner.


  • -

Full paper published – Ergonomics Journal

Our paper “The instructor assistant system (iASSYST) – utilizing eye tracking for commercial aviation training purposes” has been published in the Ergonomics Journal:

Rudi, D., Kiefer, P., & Raubal, M. (2019). The instructor assistant system (iASSYST) – utilizing eye tracking for commercial aviation training purposes. Ergonomics.

Abstract. This work investigates the potential of providing commercial aviation flight instructors with an eye tracking enhanced observation system to support the training process. During training, instructors must deal with many parallel tasks, such as operating the flight simulator, acting as air traffic controllers, observing the pilots, and taking notes. This can cause instructors to miss relevant information that is crucial for debriefing the pilots. To support instructors, the instructor ASsistant SYSTem (iASSYST) was developed. It includes video, audio, simulator, and eye tracking recordings. iASSYST was evaluated in a study involving seven instructors. The results show that with iASSYST, instructors were able to support their observations of errors, find new errors, determine that some previously identified errors were not errors, and reclassify the types of errors that they had originally identified. Instructors agreed that eye tracking can help identify causes of pilot error.


  • -

ETRA Call for Demos & Videos

We are again involved in the organization of ETRA 2020, the ACM Symposium on Eye Tracking Research & Applications, taking place in Stuttgart in June 2020.

As one of our activities at ETRA 2020, Peter is co-chairing the Demo & Video Track. The Call for Demos & Videos is now online and open for submissions.


  • -

Open PhD position in Singapore

As part of our involvement in the upcoming Future Resilient Systems II research programme, we are looking for a PhD candidate working on the development of a visualization, interaction and notification framework for communicating disruptions predicted from weak signals.

Employment will be at the Singapore-ETH Centre, workplace Singapore.

More details and application here.


  • -

Victor Schinazi joins the team

We’re very happy to welcome Victor Schinazi as a new team member! He’ll be working with us for 4 months before joining the faculty of Psychology at Bond University in Australia.

[Current team]


  • -

PhD graduation David Rudi

David Rudi successfully defended his doctoral thesis, “Enhancing Spatial Awareness of Pilots in Commercial Aviation”, on 16 September. We cordially congratulate him, and are happy that he’ll stay with us as a PostDoc starting in November!


  • -

Invited Talk by Sophie Stellmach on Mixed Reality on 10.10.2019

We are glad to announce an invited talk by Sophie Stellmach on Eye Tracking and Mixed Reality as part of the VERA Geomatics Seminar.

Dr. Sophie Stellmach is a Senior Scientist at Microsoft, where she explores entirely new ways to engage with and blend our virtual and physical realities in products such as Microsoft HoloLens. An avid eye tracking researcher for over a decade, she was heavily involved in the development of gaze-based interaction for HoloLens 2.

The talk will take place as part of the VERA Geomatics Seminar on Thursday, 10th October 2019, 5:00 p.m. at ETH Hönggerberg, HIL D 53.
Title: Multimodal Gaze-supported Input in Mixed Reality and its Promises for Spatial Research


  • -

Visit by the Vice President for Research and Corporate Relations

On 11 September, Prof. Dr. Detlef Günther, the Vice President for Research and Corporate Relations of ETH Zurich, visited the D-BAUG department and learned about the exciting research activities of its institutes.

Our institute was represented by Peter Kiefer, who summarized the research of the GeoGazeLab. The slides provide an overview of our research interests and current projects.

Edit. The presentation includes the PEGGASUS project. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

ETAVI – 2nd Call for Papers

A quick reminder for the “1st International Workshop on Eye-Tracking in Aviation (ETAVI)”, which is going to take place in March 2020 in Toulouse, France.

The submission deadlines are:

  • Abstracts: 9th September 2019
  • Papers: 30th September 2019

Feel free to also forward the Call for Papers to any interested colleagues.

We look forward to seeing you there!

Edit. Some of the organizers from ETH are part of the PEGGASUS project. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

Gaze-based interactions in the cockpit of the future: a survey

An article titled “Gaze-based interactions in the cockpit of the future: a survey” will appear in an upcoming issue of the Journal on Multimodal User Interfaces. It is now available online:

Abstract. Flying an aircraft is a mentally demanding task where pilots must process a vast amount of visual, auditory and vestibular information. They have to control the aircraft by pulling, pushing and turning different knobs and levers, while knowing that mistakes in doing so can have fatal outcomes. Therefore, attempts to improve and optimize these interactions should not increase pilots’ mental workload. By utilizing pilots’ visual attention, gaze-based interactions provide an unobtrusive solution to this. This research is the first to actively involve pilots in the exploration of gaze-based interactions in the cockpit. By distributing a survey among 20 active commercial aviation pilots working for an internationally operating airline, the paper investigates pilots’ perception and needs concerning gaze-based interactions. The results build the foundation for future research, because they not only reflect pilots’ attitudes towards this novel technology, but also provide an overview of situations in which pilots need gaze-based interactions.


  • -

PEGGASUS in the news

Our research project PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions) has attracted quite a bit of coverage in the media.

See for yourself in this little press review:

Edit. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

1st International Workshop on Eye Tracking in Aviation: Call for Papers

We’re glad to announce the 1st International Workshop on Eye Tracking in Aviation (ETAVI), which will take place March 17, 2020 in Toulouse, France.

The workshop aims to bring together researchers and practitioners who share an interest in using eye tracking in the aviation domain, including, but not limited to, the cockpit and air traffic control/management.

The keynote will be given by Leonardo Di Stasi, an assistant professor at the University of Granada (Spain) with an extensive research background in the fields of aviation and eye tracking.

The Call for Papers is now available (Paper submission deadline: September 30, 2019; Abstract submission deadline: September 9, 2019).


  • -

Invited Talk at the Royal Aeronautical Society

On June 12th, we and our colleagues from Swiss International Air Lines Ltd. had the honor of presenting our project “Enhanced flight training program for monitoring aircraft automation” at the Royal Aeronautical Society, and we received a lot of positive feedback.

This year, the Spring Conference of the Royal Aeronautical Society Flight Simulation Group was on “The Future Reality of Flight Simulation” and featured a number of very interesting talks (Programme).

We thank our hosts for having us and plan to visit next year’s event as well.


  • -

ET4S and ETRA 2019: Impressions

Our group has organized the “Eye Tracking for Spatial Research” event as a track at this year’s ETRA conference in Denver, Colorado. It featured four full paper presentations, one short paper presentation, as well as an invited talk (see program). A dominant topic at this year’s ET4S was augmented/mixed/virtual reality. As a particular highlight, our invited speaker Sophie Stellmach (Senior Scientist at Microsoft) highlighted the fascinating opportunities of HoloLens 2, an upcoming mixed reality device that will have eye tracking capabilities included.

The GeoGazeLab was further involved with Fabian’s talk on “POI-Track: Improving Map-Based Planning with Implicit POI Tracking” and Kuno presenting his work on “Space-Time Volume Visualization of Gaze and Stimulus” in the ETRA main program. A paper co-authored by Martin was presented by one of his co-authors (“Eye Tracking Support for Visual Analytics Systems: Foundations, Current Applications, and Research Challenges”).

 

[Photo: The invited talk by Sophie Stellmach (Microsoft) attracted quite an audience.]

[Photo: Testing HoloLens 2 after ET4S.]


  • -

ET4S 2019: Program and Invited Talk

The program of ET4S 2019, a track at ETRA, is now available on the website: http://et4s.ethz.ch/program/.

We’re excited to announce Sophie Stellmach (Senior Scientist @ Microsoft, HoloLens team) as this year’s invited speaker at ET4S. The title of her talk is “Eye Tracking in Mixed Reality and its Promises for Spatial Research”.

ET4S 2019 is going to take place 26 June 2019, 15:30 – 18:00, at the ETRA conference in Denver, Colorado, USA. Attendance of ET4S is included in the ETRA registration.


  • -

Meet us at CHI 2019

We’ll present one full paper and two workshop position papers at CHI in Glasgow this year:

Workshop: Designing for Outdoor Play (4th May, Saturday – 08:00 – 14:00, Room: Alsh 1)

Kiefer, P. (2019) Gaze-guided narratives for location-based games. In CHI 2019 Workshop on “Designing for Outdoor Play”, Glasgow, U.K., DOI: https://doi.org/10.3929/ethz-b-000337913

Workshop: Challenges Using Head-Mounted Displays in Shared and Social Spaces (5th May, Sunday – 08:00 – 14:00, Room: Alsh 2)

Göbel, F., Kwok, T.C.K., and Rudi, D. (2019) Look There! Be Social and Share. In CHI 2019 Workshop on “Challenges Using Head-Mounted Displays in Shared and Social Spaces”, Glasgow, U.K., DOI: https://doi.org/10.3929/ethz-b-000331280

Paper Session: Audio Experiences (8th May, Wednesday – 14:00 – 15:20, Room: Alsh 1)

Kwok, T.C.K., Kiefer, P., Schinazi, V.R., Adams, B., and Raubal, M. (2019) Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4-9, Glasgow, U.K. [PDF]

We are looking forward to seeing you in Glasgow!
These researches are part of the LAMETTA or IGAMaps projects.


  • -

ETIZ Meeting March 2019 – Impressions

More than 30 participants attended the meeting of the Eye Tracking Interest Group Zurich (ETIZ) hosted by us on 26 March 2019. Our invited speaker Andreas Bulling (University of Stuttgart) provided insights into his current and past research on pervasive eye tracking. Tiffany Kwok (GeoGazeLab, LAMETTA project) presented her PhD research on gaze-guided narratives. In an interactive mini-workshop, moderated by Arzu Çöltekin (FHNW), attendees brainstormed about challenges of eye tracking in VR and AR displays. Discussions were continued during an apéro, and many took the opportunity to try out a gaze-adaptive map demo (Fabian Göbel, GeoGazeLab, IGAMaps project).


  • -

FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps

An article titled “FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps” will appear in an upcoming issue of the GeoInformatica journal. It is now available online:

Abstract. Map reading is a visual task that can vary strongly between individuals and between maps of different characteristics. Aspects such as where, when, how long, and in which sequence information on a map is looked at can reveal valuable insights both for the map design process and for better understanding the cognitive processes of the map user. Contrary to static maps, for which many eye tracking studies are reported in the literature, established methods for tracking and analyzing visual attention on interactive maps are still missing. In this paper, we present a framework called FeaturEyeTrack that automatically logs the cartographic features that have been inspected, as well as the mouse input, during the interaction with digital interactive maps. In particular, the novelty of FeaturEyeTrack lies in the matching of gaze with the vector model of the current map visualization, thereby enabling a very detailed analysis without the need for manual annotation. Furthermore, we demonstrate the benefits of this approach in terms of manual work, level of detail, and validity compared to state-of-the-art methods through a case study on an interactive cartographic web map.
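The core matching step can be illustrated with a small sketch (our illustration, not the published implementation): each gaze sample is tested against the vector geometries of the currently rendered map features, for example with a ray-casting point-in-polygon test. The feature names and coordinates below are hypothetical:

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: is the screen-space point inside the polygon?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending to the right
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def match_gaze_to_features(gaze, features):
    """Return the names of all map features whose polygon contains the gaze point."""
    return [name for name, poly in features.items() if point_in_polygon(gaze, poly)]

# Hypothetical vector features of the current map view (screen coordinates)
features = {
    "lake": [(0, 0), (100, 0), (100, 100), (0, 100)],
    "park": [(200, 200), (300, 200), (300, 300), (200, 300)],
}
print(match_gaze_to_features((50, 50), features))
```

A full pipeline along the lines described in the abstract would additionally map screen coordinates through the current pan/zoom state to the vector model, and log the matched features together with timestamps and mouse input.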