• eyetracking@ethz.ch
  • +41 44 633 71 59

News

  • -

Welcome, Lin Che and Yiwei Wang!

We’re happy that two new doctoral students have joined our team, Lin Che and Yiwei Wang. They’re both part of the MSCA doctoral network “Eyes for Interaction, Communication, and Understanding (Eyes4ICU)”. Lin will be working on
Gaze-supported Trip Recommendation (DC6), Yiwei on Gaze-supported Travel Experience Logging (DC12). Welcome!

 


  • -

New article in Applied Ergonomics – The effect of flight phase on electrodermal activity and gaze behavior: A simulator study

Our article “The effect of flight phase on electrodermal activity and gaze behavior: A simulator study” has been accepted for publication in the journal Applied Ergonomics:

Luis Lutnyk, David Rudi, Victor R. Schinazi, Peter Kiefer, Martin Raubal (2023). The effect of flight phase on electrodermal activity and gaze behavior: A simulator study, Applied Ergonomics, Volume 109, DOI: 10.1016/j.apergo.2023.103989.

Highlights:

  • Unobtrusive technologies were used to record electrodermal activity and gaze behavior in an instrument failure scenario.
  • Participants’ electrodermal activity increased significantly during high workload phases of the failure scenario.
  • AOI-based & non-AOI eye tracking metrics show significant differences when a secondary task needs to be solved during flight.
  • The observed measures show great potential for future cockpits that can provide assistance based on the sensed pilot state.

Abstract. Current advances in airplane cockpit design and layout are often driven by a need to improve the pilot’s awareness of the aircraft’s state. This involves an improvement in the flow of information from aircraft to pilot. However, providing the aircraft with information on the pilot’s state remains an open challenge. This work takes a first step towards determining the pilot’s state based on biosensor data. We conducted a simulator study to record participants’ electrodermal activity and gaze behavior, indicating pilot state changes during three distinct flight phases in an instrument failure scenario. The results show a significant difference in these psychophysiological measures between a phase of regular flight, the incident phase, and a phase with an additional troubleshooting task after the failure. The differences in the observed measures suggest great potential for a pilot-aware cockpit that can provide assistance based on the sensed pilot state.

The article has been published as Open Access and you can get the PDF here:
https://www.sciencedirect.com/science/article/pii/S0003687023000273

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

New Article Published in Human-Computer Interaction

Our article “Unobtrusive interaction: a systematic literature review and expert survey” has been accepted for publication in the journal Human–Computer Interaction (HCI):

Tiffany C.K. Kwok, Peter Kiefer & Martin Raubal (2023). Unobtrusive interaction: a systematic literature review and expert survey, Human–Computer Interaction, DOI: 10.1080/07370024.2022.2162404

Abstract. Unobtrusiveness has been highlighted as an important design principle in Human-Computer Interaction (HCI). However, the understanding of unobtrusiveness in the literature varies. Researchers often claim unobtrusiveness for their interaction method based on their understanding of what unobtrusiveness means. This lack of a shared definition hinders effective communication in research and impedes comparability between approaches. In this article, we approach the question “What is unobtrusive interaction?” with a systematic and extensive literature review of 335 papers and an online survey with experts. We found that not a single definition of unobtrusiveness is universally agreed upon. Instead, we identify five working definitions from the literature and experts’ responses. We summarize the properties of unobtrusive interaction into a design framework with five dimensions and classify the reviewed papers with regard to these dimensions. The article aims to provide researchers with a more unified context to compare their work and identify opportunities for future research.

The article will appear in an upcoming issue of Human–Computer Interaction. It has been published as Open Access and you can get the article here:
https://doi.org/10.1080/07370024.2022.2162404


  • -

Winter School 2023: Summary

What an exciting way of starting into the new year!

Our Winter School on “Eye Tracking – Experimental Design, Implementation, and Analysis” took place in the second week of January on Monte Verità in Ascona, Switzerland. A total of 36 participants attended, coming from a wide variety of research backgrounds.

Enkelejda Kasneci (TU Munich, Germany) opened the Winter School with a virtual keynote, in which she presented her research “On opportunities and challenges of eye tracking and machine learning for adaptive educational interfaces and classroom research”. Over the five days of the Winter School, participants learned about the different steps involved in performing eye tracking experiments, from experimental design, through data collection and processing, to statistical analysis (speakers: Nina Gehrer, University of Tübingen, Germany; Andrew Duchowski, Clemson University, S.C., US; Izabela and Krzysztof Krejtz, SWPS University of Social Sciences and Humanities, Poland). In hands-on sessions, participants designed and performed their own small eye tracking experiments.

The Winter School also enabled exchange between participants, through group work, poster presentations, and an excursion. Monte Verità offered the perfect atmosphere and surroundings for this.

Thanks to all who have made this possible, especially our speakers and all sponsors!

 

 


  • -

Full Paper published at ICMI 2022

Our paper “Two-Step Gaze Guidance” has been published in the proceedings of the International Conference on Multimodal Interaction (ICMI ’22) as a full paper.

Tiffany C.K. Kwok, Peter Kiefer, Martin Raubal (2022). Two-Step Gaze Guidance, International Conference on Multimodal Interaction (ICMI ’22), DOI: 10.1145/3536221.3556612

Abstract. One challenge of providing guidance for search tasks consists in guiding the user’s visual attention to certain objects in a potentially large search space. Previous work has tried to guide the user’s attention by providing visual, audio, or haptic cues. The state-of-the-art methods either provide hints pointing towards the approximate direction of the target location for a fast but less accurate search or require the user to perform a fine-grained search from the beginning for a precise yet less efficient search. To combine the advantage of both methods, we propose an interaction concept called Two-Step Gaze Guidance. The first-step guidance focuses on quick guidance toward the approximate direction, and the second-step guidance focuses on fine-grained guidance toward the exact location of the target. A between-subject study (N = 69) with five conditions was carried out to compare the two-step gaze guidance method with the single-step gaze guidance method. Results revealed that the proposed method outperformed the single-step gaze guidance method. More precisely, the introduction of Two-Step Gaze Guidance slightly improves the searching accuracy, and the use of spatial audio as the first-step guidance significantly helps in enhancing the searching efficiency. Our results also indicated several design suggestions for designing gaze guidance methods.


  • -

Eyes4ICU: 2 open positions in MSCA doctoral network

Exciting news! The geoGAZElab will be participating in the MSCA Doctoral Network “Eyes for Interaction, Communication, and Understanding (Eyes4ICU)” as an Associated Partner, funded by the Swiss State Secretariat for Education, Research and Innovation.

Eyes4ICU explores novel forms of gaze-based interaction that rely on current psychological theories and findings, computational modelling, as well as expertise in highly promising application domains. Its approach to developing inclusive technology by tracing gaze interaction back to its cognitive and affective foundations results in better models to predict user behaviour. By integrating insights in application fields, gaze-based interaction can be employed in the wild.

In the scope of Eyes4ICU, 12 doctoral candidates (DCs) will be working at 7 different host institutions across Europe. Out of these, 2 DCs will be hosted at the geoGAZElab of ETH Zurich (PI: Peter Kiefer). They will be working on the topics of Gaze-supported Trip Recommendation (DC6) and Gaze-supported Travel Experience Logging (DC12), respectively.

We are looking for two highly motivated doctoral candidates, starting at the earliest possible date: Position announcement.

 


  • -

PhD graduation Tiffany C.K. Kwok

We congratulate Tiffany C.K. Kwok for successfully completing her doctoral thesis on “Designing Unobtrusive Gaze-Based Interactions: Applications to Audio-Guided Panorama Viewing”. The doctoral graduation has been approved by the Department conference at its last meeting. The research was performed in the scope of the LAMETTA project.

Tiffany is staying with us for a PostDoc, continuing her research in the geoGAZElab. It’s great having you in our team, Tiffany!


  • -

Winter School Updates

We’re looking forward to our Winter School, taking place in January 2023.

Exciting updates to the program are now included in the revised announcement:

We’re glad that Prof. Dr. Enkelejda Kasneci (Technical University Munich) will be opening the Winter School with a keynote titled “On opportunities and challenges of eye tracking and machine learning for adaptive educational interfaces and classroom research“.

We’d like to thank the following sponsors, whose generous support enables us to award travel grants to several young researchers:

Application for travel grants is open until 15 October 2022.

The Winter School is a great opportunity for getting trained on eye tracking methodology, experimental design, and analysis. At the same time, it will facilitate networking with speakers, sponsor representatives, as well as among participants.


  • -

Tianyi Xiao joins the team

Our team is growing further: we’re so happy to have Tianyi Xiao on board, who is joining the 3D Sketch Maps project as a doctoral student. Welcome!


  • -

Winter School 2023

We are co-organizing an ETH Winter School on “Eye Tracking – Experimental Design, Implementation, and Analysis”, which will take place at Monte Verità (Ticino), Switzerland, from 8 to 13 January 2023. Download the first announcement as PDF.

The Winter School targets PhD students and early PostDocs (from any research field) who are using, or planning to use, eye tracking in their research. Internationally recognized experts will provide lectures and hands-on sessions on eye tracking methodology, experimental design, and analysis.

The registration for the Winter School is now open.

Building on the successful 2016 Winter School, the 2023 School will be updated to focus on state-of-the-art software (licensed and open-source, e.g., PsychoPy and Pupil Labs) and hardware. Hands-on exercises will focus on table-mounted eye trackers.


  • -

Participation in COSIT 2022

We are excited to be part of the 15th International Conference on Spatial Information Theory (COSIT 2022) that is taking place on September 5-9, 2022, in Kobe, Japan.

Two of our lab members, Kevin Kim and Adrian Sarbach, will attend the conference (in person!) and present our latest work. We are looking forward to meeting other researchers and discussing exciting research!

More information: http://cosit2022.iniad.org

 


  • -

FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight

Our article “FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight” has been accepted for publication by the International Journal of Human–Computer Interaction (IJHCI):

Luis Lutnyk, David Rudi, Emanuel Meier, Peter Kiefer, Martin Raubal (2022). FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight, International Journal of Human–Computer Interaction, DOI: 10.1080/10447318.2022.2075627.

Abstract. Contemporary aircraft cockpits rely mostly on audiovisual information propagation which can overwhelm particularly novice pilots. The introduction of tactile feedback, as a less taxed modality, can improve the usability in this case. As part of a within-subject simulator study, 22 participants are asked to fly a visual-flight-rule scenario along a predefined route and identify objects in the outside world that serve as waypoints. Participants fly two similar scenarios with and without a tactile belt that indicates the route. Results show that with the belt, participants perform better in identifying objects, have higher usability and user experience ratings, and a lower perceived cognitive workload, while showing no improvement in spatial awareness. Moreover, 86% of the participants state that they prefer flying with the tactile belt. These results suggest that a tactile belt provides pilots with an unobtrusive mode of assistance for tasks that require orientation using cues from the outside world.

The article will appear in one of the next issues of the International Journal of Human–Computer Interaction.

It has been published as Open Access and you can get the PDF here:
https://www.tandfonline.com/doi/full/10.1080/10447318.2022.2075627


  • -

Co-authored paper on PEGGASUS at SPIE Photonics West

A joint paper with our partners at CSEM on the eye tracking hard- and software developed in PEGGASUS was published on the SPIE digital library. We encourage you to take a deep dive into the technological achievements of PEGGASUS:

“The pipeline, which is a combination of data-driven and analytics approaches, runs in real time at 60 fps with a latency of about 32ms. The eye gaze estimation error was evaluated in terms of the point of regard distance error with respect to the 3D point location. An average error of less than 1.1cm was achieved over 28 gaze points representing the cockpit instruments placed at about 80-110cm from the participants’ eyes.”

Engin Türetkin, Sareh Saeedi, Siavash Bigdeli, Patrick Stadelmann, Nicolas Cantale, Luis Lutnyk, Martin Raubal, Andrea L. Dunbar (2022). Real time eye gaze tracking for human machine interaction in the cockpit. In AI and Optical Data Sciences III (Vol. 12019, pp. 24-33). SPIE.

The paper was presented at SPIE’s Photonics West conference at San Francisco’s Moscone Center.

Find the full text and presentation video here: https://doi.org/10.1117/12.2607434
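The point-of-regard distance error quoted above is, in essence, the Euclidean distance between each estimated 3D gaze point and the true 3D target location, averaged over all gaze points. A minimal toy sketch of that metric (the function name and sample coordinates are invented for illustration, not taken from the paper):

```python
import numpy as np

def mean_por_error(estimated: np.ndarray, truth: np.ndarray) -> float:
    """Mean Euclidean distance between estimated and true 3D points
    (result is in the same unit as the inputs, e.g. cm)."""
    return float(np.linalg.norm(estimated - truth, axis=1).mean())

# Two hypothetical gaze points, coordinates in cm (eye at the origin)
truth = np.array([[0.0, 0.0, 100.0], [10.0, 5.0, 90.0]])
estimated = np.array([[0.6, 0.8, 100.0], [10.0, 5.0, 91.0]])

print(mean_por_error(estimated, truth))  # 1.0 for this toy data
```

In the study itself, this average was computed over 28 gaze points representing cockpit instruments at roughly 80-110 cm from the participants' eyes.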

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

Control Room Decision-Making Lab at Singapore-ETH Centre

We are happy to announce the opening of our new Control Room Decision-Making Lab at the Singapore-ETH Centre (SEC) for our project on Communicating Predicted Disruptions in Future Resilient Systems (FRS 2). The lab is equipped with sensors for measuring decision makers’ psycho-physiological state, including remote and head-mounted eye trackers, galvanic skin response sensors, and accelerometers. The new lab infrastructure will be used to study how different communication techniques can support decision-makers in control rooms. This research will improve the next generation of control rooms, thus enhancing the resilience of the monitored infrastructure in case of disruptions.

We want to thank all our collaborators and the SEC management for their help setting up the laboratory.

Control Room Decision-Making Lab


  • -

Successful finish of PEGGASUS and summary

We are happy to report that our aviation project PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions) has been successfully completed.

We would like to thank all partners involved for their extensive efforts to complete the project successfully and deliver the results despite all the Covid-related restrictions and hurdles.

You can find a summary of the project outcomes at the EU Cordis Portal:

“The PEGGASUS consortium has developed a multi-camera vision system for tracking eye-gaze and gestures of the pilots in real time in the cockpit environment. This system allows a leap towards a more comprehensive human-machine interface in the cockpit to reduce the stress and cognitive load of the pilots, while bringing forward future pilot training techniques. Better awareness of the instruments will help the flight management to optimize trajectories and better manage fuel use, in line with the overall objectives of the Clean Sky JU.”

Two images showing the results of the algorithms including pupil detection and eye gaze estimation

Ultimate Prototype Hardware setup installed in the cockpit simulator
Excerpt and images taken from: https://cordis.europa.eu/project/id/821461/reporting

This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

Kevin Gonyop Kim joins the team

We’re excited to welcome Kevin Gonyop Kim, who has now started as a postdoctoral researcher in the 3D sketch maps project. Welcome to the team, Kevin!


  • -

Joint Talk at the Royal Aeronautical Society Flight Simulation Conference

We had the honor of giving a presentation at the Royal Aeronautical Society’s Flight Simulation Conference, which took place from October 26th to 27th in London.

In the joint talk with Gilad Scherpf of Lufthansa Group, we presented results from the PEGGASUS project and showcased how eye and gesture tracking can support the assessment of Evidence-Based Training (EBT) competencies.

We want to thank the hosts at the RAeS for the invitation and the attendees for the very positive feedback and interesting questions during the panel discussion. A special thank you also goes out to our partners at SWISS and CSEM.

The full programme of the conference can be found here.

 

The talk was given as part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

PhD graduation Fabian Göbel

Fabian Göbel has successfully completed his doctoral thesis on “Visual Attentive User Interfaces for Feature-Rich Environments”. The doctoral graduation has been approved by the Department conference at its last meeting. Congratulations, Fabian!

After his thesis defense, Fabian has started a research internship at Microsoft on the topic of interaction with HoloLens 2. We wish him all the best and thank him for all the contributions he has made to our research!


  • -

Workshop Paper presented at the INTERACT 2021

Our paper “Improving resilience by communicating predicted disruptions in control rooms” has been presented at the INTERACT 2021 Workshop on Control Rooms in Safety Critical Contexts (CRiSCC): Design, Engineering and Evaluation Issues. The full-day workshop was held in a hybrid manner in Bari, Italy, with 13 interdisciplinary researchers. The vision paper outlines some of the ideas and challenges we are addressing in the FRS 2 project on “Optimizing strategies for communicating predicted disruptions to stakeholders”:

Chakraborty, S., Kiefer, P., & Raubal, M. (2021). Improving resilience by communicating predicted disruptions in control rooms, INTERACT 2021.

Abstract: Even though the importance of resilience for control rooms is generally acknowledged, cognitive resilience is often not taken into account properly during control room design. This vision paper aims at improving the cognitive resilience in control rooms through advancements in three key research areas: 1) automated detection of upcoming disruptions, 2) visualization of spatio-temporal uncertainty, 3) cognition-aware interaction design.


  • -

New project and open position: 3D Sketch Maps

We’re very much looking forward to the start of the 3D Sketch Maps project, for which we have now announced an open PhD position: Interaction with 3D Sketch Maps in Extended Reality.

The 3D Sketch Maps project, funded by the Swiss National Science Foundation in the scope of the Sinergia funding program, investigates 3D sketch maps from a theoretical, empirical, cognitive, as well as tool-related perspective, with a particular focus on Extended Reality (XR) technologies. Sketch mapping is an established research method in fields that study human spatial decision-making and information processing, such as navigation and wayfinding. Although space is naturally three-dimensional (3D), contemporary research has focused on assessing individuals’ spatial knowledge with two-dimensional (2D) sketches. For many domains though, such as aviation or the cognition of complex multilevel buildings, it is essential to study people’s 3D understanding of space, which is not possible with the current 2D methods. Eye tracking will be used for the analysis of people’s eye movements while using the sketch mapping tools.

The 4-year project will be carried out jointly by the Chair of Geoinformation Engineering, the Chair of Cognitive Science at ETH Zurich (Prof. Dr. Christoph Hölscher), and the Spatial Intelligence Lab at the University of Münster (Prof. Dr. Angela Schwering).

Interested? Please check out the open PhD position on the ETH job board!


  • -

Adrian Sarbach joins the team

We’re glad that Adrian Sarbach has joined the geoGAZElab as a doctoral student on the project “The Expanded Flight Deck – Improving the Weather Situation Awareness of Pilots (EFDISA)“.

Adrian studied at EPFL (Bachelor) and at ETH Zurich (Master), obtaining his degrees in electrical engineering. He wrote his MSc thesis in collaboration with Swiss International Air Lines, on the topic of tail assignment optimization.


  • -

Full Paper presentation at ETRA 2021

Our accepted paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” will be presented at ACM ETRA 2021:

May 25, 2021, 11:00–12:00 and 18:00–19:00, in “Posters & Demos & Videos”
May 26, 2021, 14:45–16:15, in Track 1: “Full Papers V”

Join the virtual conference for a chat!
https://etra.acm.org/2021/schedule.html


  • -

Kuno Kurzhals: Junior Research Group Lead

Kuno Kurzhals has left us for his new position as Junior Research Group Lead in the Cluster of Excellence Integrative Computational Design and Construction for Architecture (IntCDC) at the University of Stuttgart, Germany.

We wish him all the best and thank him for the contributions he has made to the geoGAZElab during his PostDoc time!


  • -

New project and open position: EFDISA

We are excited to have received funding from the Swiss Federal Office of Civil Aviation (BAZL) for a new project, starting in July 2021: “The Expanded Flight Deck – Improving the Weather Situation Awareness of Pilots (EFDISA)”. The project aims at improving contemporary pre-flight and in-flight representations of weather data for pilots. This will allow pilots to better perceive, understand, and anticipate meteorological hazards. The project will be carried out in close collaboration with industry partners and professional pilots (Swiss International Air Lines & Lufthansa Systems).

We are looking for a highly motivated doctoral student for this project. Applications are now open.


  • -

Results of the Interdisciplinary Project 2020

In their interdisciplinary project, the three Geomatics Master students Laura Schalbetter, Tianyu Wu, and Xavier Brunner developed an indoor navigation system for Microsoft HoloLens 2. The system was implemented using ESRI CityEngine, Unity, and Microsoft Visual Studio.

Check out their video:


  • -

Workshop Paper published from ICCAS 2020

Our paper “Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning” has been published in the proceedings of the 1st International Conference on Cognitive Aircraft Systems:

Lutnyk, L., Rudi, D., Kiefer, P., & Raubal, M. (2020). Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning, ICCAS 2020.

Abstract. Moving towards the highly controversial single pilot cockpit, more and more automation capabilities are added to today’s airliners. However, to operate safely without a pilot monitoring, avionics systems in future cockpits will have to be able to intelligently assist the remaining pilot. One critical enabler for proper assistance is a reliable classification of the pilot’s state, both in normal conditions and, more critically, in abnormal situations like an equipment failure. Only with a good assessment of the pilot’s state can the cockpit adapt to the pilot’s current needs, i.e. alert, adapt displays, take over tasks, monitor procedures, etc.

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

Suvodip Chakraborty starting in January

After a Covid-related delay in the hiring process, we’re excited to announce that Suvodip Chakraborty will join us as a PhD student in our Singapore-based project on Communicating Predicted Disruptions, in the scope of the Future Resilient Systems 2 research program. He will start in January 2021.

Suvodip holds a Master of Science from the Indian Institute of Technology Kharagpur. His Master thesis was titled “Design of Electro-oculography based wearable systems for eye movement analysis”.


  • -

ETRA 2021 Call for Demos&Videos

As in the previous year, we are involved in the organization of ETRA 2021, the ACM Symposium on Eye Tracking Research & Applications. Peter is again co-chairing the Demo & Video Track, for which the Call is now available online. The submission deadline is 2 February 2021.