
Category Archives: Aviation


D-CEET project launched

We are excited to announce the start of D-CEET, a new project with Lufthansa Aviation Training (LAT), funded by the German Federal Ministry for Digital and Transport (BMDV) in the scope of the mFund innovation initiative.

Today, training for airline crew members takes place almost exclusively in elaborately reproduced cabin dummies and simulators, so-called CEETs (Cabin Emergency Evacuation Trainers). The aim of the D-CEET project is to fully replicate an Airbus A320 CEET as a “digital twin”. The resulting data model is intended to enable fully immersive training of all relevant training content in virtual reality (VR) and additionally as a tablet-based application.

In our sub-project, we will validate the effectiveness of the new training concept using eye tracking and other physiological sensors in systematic studies with crew members. We aim to validate the achievement of competence objectives by observing behavior and by measuring situational awareness and cognitive workload.

Read more about the D-CEET project in a recent press release by LAT, on our website, or on the BMDV website (German only).


Image: Lufthansa Aviation Training GmbH



Shupeng Wang joins the team

We cordially welcome Shupeng Wang, who has started as a doctoral student in the geoGAZElab. Shupeng holds a Master’s degree in Geomatics from ETH Zürich and a Bachelor’s degree in Geography with an emphasis on Geographic Information Science from the University of California, Santa Barbara, USA. He will be part of the D-CEET project, focusing on the evaluation of new training concepts for airline crew members in a Digital Cabin Emergency Evacuation Trainer.



Presentation & Publication on Aeronautical Charts at AGILE 2023

It was a great pleasure to attend this year’s AGILE conference on Geographic Information Science in Delft (The Netherlands), where Adrian Sarbach presented the results of our research on visualisation and perception of airspace structures on aeronautical charts.

Our paper contains a theoretical cartographic analysis of aeronautical charts used for flights under visual flight rules, and the results of a user study that confirmed the findings of the theoretical analysis.

This project was conducted together with Thierry Weber, Katharina Henggeler, Luis Lutnyk, and Martin Raubal.

If you are interested in reading the full open access paper, titled “Evaluating and Comparing Airspace Structure Visualisation and Perception on Digital Aeronautical Charts”, you can find it here: https://doi.org/10.5194/agile-giss-4-12-2023.

Adrian Sarbach presenting his work on aeronautical charts at AGILE 2023



PhD graduation Luis Lutnyk

We’re glad to announce that our colleague Luis Lutnyk has successfully defended his dissertation on “Pilot Decision-Making Support through Intelligent Cockpit Technologies”. Congratulations, Luis!



New article in Applied Ergonomics – The effect of flight phase on electrodermal activity and gaze behavior: A simulator study

Our article “The effect of flight phase on electrodermal activity and gaze behavior: A simulator study” has been accepted for publication in the journal Applied Ergonomics:

Luis Lutnyk, David Rudi, Victor R. Schinazi, Peter Kiefer, Martin Raubal (2023). The effect of flight phase on electrodermal activity and gaze behavior: A simulator study, Applied Ergonomics, Volume 109, DOI: 10.1016/j.apergo.2023.103989.

Highlights:

  • Unobtrusive technologies were used to record electrodermal activity and gaze behavior in an instrument failure scenario.
  • Participants’ electrodermal activity increased significantly during high workload phases of the failure scenario.
  • AOI-based & non-AOI eye tracking metrics show significant differences when a secondary task needs to be solved during flight.
  • The observed measures show great potential for future cockpits that can provide assistance based on the sensed pilot state.

Abstract. Current advances in airplane cockpit design and layout are often driven by a need to improve the pilot’s awareness of the aircraft’s state. This involves an improvement in the flow of information from aircraft to pilot. However, providing the aircraft with information on the pilot’s state remains an open challenge. This work takes a first step towards determining the pilot’s state based on biosensor data. We conducted a simulator study to record participants’ electrodermal activity and gaze behavior, indicating pilot state changes during three distinct flight phases in an instrument failure scenario. The results show a significant difference in these psychophysiological measures between a phase of regular flight, the incident phase, and a phase with an additional troubleshooting task after the failure. The differences in the observed measures suggest great potential for a pilot-aware cockpit that can provide assistance based on the sensed pilot state.

The article has been published as Open Access and you can get the PDF here:
https://www.sciencedirect.com/science/article/pii/S0003687023000273

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



Participation in COSIT 2022

We are excited to be part of the 15th International Conference on Spatial Information Theory (COSIT 2022) that is taking place on September 5-9, 2022, in Kobe, Japan.

Two of our lab members, Kevin Kim and Adrian Sarbach, will attend the conference (in person!) and present our latest work. We are looking forward to meeting other researchers and discussing exciting research!

More information: http://cosit2022.iniad.org

 



FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight

Our article “FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight” has been accepted for publication by the International Journal of Human–Computer Interaction (IJHCI):

Luis Lutnyk, David Rudi, Emanuel Meier, Peter Kiefer, Martin Raubal (2022). FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight , International Journal of Human–Computer Interaction, DOI: 10.1080/10447318.2022.2075627.

Abstract. Contemporary aircraft cockpits rely mostly on audiovisual information propagation which can overwhelm particularly novice pilots. The introduction of tactile feedback, as a less taxed modality, can improve the usability in this case. As part of a within-subject simulator study, 22 participants are asked to fly a visual-flight-rule scenario along a predefined route and identify objects in the outside world that serve as waypoints. Participants fly two similar scenarios with and without a tactile belt that indicates the route. Results show that with the belt, participants perform better in identifying objects, have higher usability and user experience ratings, and a lower perceived cognitive workload, while showing no improvement in spatial awareness. Moreover, 86% of the participants state that they prefer flying with the tactile belt. These results suggest that a tactile belt provides pilots with an unobtrusive mode of assistance for tasks that require orientation using cues from the outside world.

The article will appear in one of the next issues of the International Journal of Human–Computer Interaction.

It has been published as Open Access and you can get the PDF here:
https://www.tandfonline.com/doi/full/10.1080/10447318.2022.2075627



Co-authored paper on PEGGASUS at SPIE Photonics West

A joint paper with our partners at CSEM on the eye tracking hardware and software developed in PEGGASUS has been published in the SPIE digital library. We encourage you to take a deep dive into the technological achievements of PEGGASUS:

“The pipeline, which is a combination of data-driven and analytics approaches, runs in real time at 60 fps with a latency of about 32ms. The eye gaze estimation error was evaluated in terms of the point of regard distance error with respect to the 3D point location. An average error of less than 1.1cm was achieved over 28 gaze points representing the cockpit instruments placed at about 80-110cm from the participants’ eyes.”
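The distance error quoted above is an average over calibration points. As an illustrative sketch only (the function and variable names below are hypothetical and not taken from the PEGGASUS codebase), such a point-of-regard error can be computed as the mean Euclidean distance between estimated and ground-truth 3D gaze points:

```python
# Illustrative sketch of a point-of-regard distance error metric:
# mean Euclidean distance between estimated and ground-truth 3D points.
import math

def porg_distance_error(estimated, ground_truth):
    """Mean Euclidean distance; units follow the input (e.g. cm)."""
    distances = [math.dist(e, g) for e, g in zip(estimated, ground_truth)]
    return sum(distances) / len(distances)

# Example: two gaze points on instruments roughly 100 cm away (cm units).
truth = [(10.0, 5.0, 100.0), (-8.0, 2.0, 95.0)]
est   = [(10.5, 5.0, 100.0), (-8.0, 2.6, 95.8)]
print(round(porg_distance_error(est, truth), 3))  # prints 0.75
```

In the paper's evaluation, such a mean would be taken over 28 gaze points representing cockpit instruments placed about 80-110 cm from the participants' eyes.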

Engin Türetkin, Sareh Saeedi, Siavash Bigdeli, Patrick Stadelmann, Nicolas Cantale, Luis Lutnyk, Martin Raubal, Andrea L. Dunbar (2022). Real time eye gaze tracking for human machine interaction in the cockpit. In AI and Optical Data Sciences III (Vol. 12019, pp. 24-33). SPIE.

The paper was presented at SPIE’s Photonics West conference at San Francisco’s Moscone Center.

Find the full text and presentation video here: https://doi.org/10.1117/12.2607434

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



Successful finish of PEGGASUS and summary

We are happy to report that our aviation project PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions) has been successfully completed.

We would like to thank all partners involved in the project for their extensive efforts to finish the project successfully and deliver the results despite all the Covid-related restrictions and hurdles.

You can find a summary of the project outcomes at the EU Cordis Portal:

“The PEGGASUS consortium has developed a multi-camera vision system for tracking eye-gaze and gestures of the pilots in real time in the cockpit environment. This system allows a leap towards a more comprehensive human-machine interface in the cockpit to reduce the stress and cognitive load of the pilots, while bringing forward future pilot training techniques. Better awareness of the instruments will help the flight management to optimize trajectories and better manage fuel use, in line with the overall objectives of the Clean Sky JU.”

Two images showing the results of the algorithms including pupil detection and eye gaze estimation

Ultimate Prototype Hardware setup installed in the cockpit simulator


Excerpt and images taken from: https://cordis.europa.eu/project/id/821461/reporting

This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



Joint Talk at the Royal Aeronautical Society Flight Simulation Conference

We had the honor of giving a presentation at the Royal Aeronautical Society's Flight Simulation Conference, which took place from October 26th to 27th in London.

In the joint talk with Gilad Scherpf of Lufthansa Group, we presented results from the PEGGASUS project and showcased how eye and gesture tracking can support the assessment of EBT (Evidence-Based Training) competencies.

We want to thank the hosts at the RAeS for the invitation and the attendees for the very positive feedback and interesting questions during the panel discussion. A special thank you also goes out to our partners at SWISS and CSEM.

The full programme of the conference can be found here.

 

The talk was given as part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



Adrian Sarbach joins the team

We’re glad that Adrian Sarbach has joined the geoGAZElab as a doctoral student on the project “The Expanded Flight Deck – Improving the Weather Situation Awareness of Pilots (EFDISA)”.

Adrian studied at EPFL (Bachelor) and at ETH Zurich (Master), obtaining his degrees in electrical engineering. He wrote his MSc thesis in collaboration with Swiss International Air Lines, on the topic of tail assignment optimization.



New project and open position: EFDISA

We are excited to have received funding from the Swiss Federal Office of Civil Aviation (BAZL) for a new project starting in July 2021: “The Expanded Flight Deck – Improving the Weather Situation Awareness of Pilots (EFDISA)”. The project aims at improving contemporary pre-flight and in-flight representations of weather data for pilots. This will allow pilots to better perceive, understand, and anticipate meteorological hazards. The project will be carried out in close collaboration with industry partners and professional pilots (Swiss International Air Lines & Lufthansa Systems).

We are looking for a highly motivated doctoral student for this project. Applications are now open.



Workshop Paper published from ICCAS 2020

Our paper “Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning” has been published in the proceedings of the 1st International Conference on Cognitive Aircraft Systems:

Lutnyk, L., Rudi, D., Kiefer, P., & Raubal, M. (2020). Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning. ICCAS 2020.

Abstract. Moving towards the highly controversial single pilot cockpit, more and more automation capabilities are added to today’s airliners. However, to operate safely without a pilot monitoring, avionics systems in future cockpits will have to be able to intelligently assist the remaining pilot. One critical enabler for proper assistance is a reliable classification of the pilot’s state, both in normal conditions and more critically in abnormal situations like an equipment failure. Only with a good assessment of the pilot’s state can the cockpit adapt to the pilot’s current needs, i.e. alert, adapt displays, take over tasks, monitor procedures, etc.

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



Workshop talk: Flight safety – Line of sight

Pilots not only have to make the right decisions, but they have to do it quickly and process a lot of information – especially visual information. In a unique project, ETH Zurich and Swiss International Air Lines have investigated what the eyes of pilots do in this process.

Martin Raubal, Professor of Geoinformation Engineering at ETH Zurich, appreciates the practical relevance of this research collaboration, which could contribute to increasing flight safety. Anyone who wants to develop it further should take off their blinders and think outside the box, says Christoph Ammann, captain and instructor at Swiss. And ETH Zurich is an ideal partner for this.

Watch the video on Vimeo!



New article on iAssyst

The instructor assistant system (iAssyst) that we developed as part of our research collaboration with Swiss International Air Lines is being featured in an article by innoFRAtor, the innovation portal of the Fraport AG.

You may read more about the system in our related research article: Rudi D., Kiefer P., and Raubal M. (2020). The Instructor Assistant System (iASSYST) – Utilizing Eye Tracking for Commercial Aviation Training Purposes. Ergonomics, vol. 63, no. 1, pp. 61-79, London: Taylor & Francis. DOI: https://doi.org/10.1080/00140139.2019.1685132

Our project on “Enhanced flight training program for monitoring aircraft automation” with Swiss International Air Lines, NASA, and the University of Oregon was officially concluded at the end of last year.



Workshop Paper published from ETAVI 2020

Our paper Towards Pilot-Aware Cockpits has been published in the proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI 2020):

Lutnyk L., Rudi D., and Raubal M. (2020). Towards Pilot-Aware Cockpits. In Proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI 2020), ETH Zurich. DOI: https://doi.org/10.3929/ethz-b-000407661

Abstract. Eye tracking has a longstanding history in aviation research. Amongst others it has been employed to bring pilots back “in the loop”, i.e., create a better awareness of the flight situation. Interestingly, there exists only little research in this context that evaluates the application of machine learning algorithms to model pilots’ understanding of the aircraft’s state and their situation awareness. Machine learning models could be trained to differentiate between normal and abnormal patterns with regard to pilots’ eye movements, control inputs, and data from other psychophysiological sensors, such as heart rate or blood pressure. Moreover, when the system recognizes an abnormal pattern, it could provide situation specific assistance to bring pilots back in the loop. This paper discusses when pilots benefit from such a pilot-aware system, and explores the technical and user oriented requirements for implementing this system.

Edit. The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



ETAVI 2020 Proceedings Online

The proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI) 2020 have been published here.

We thank all program committee members for their efforts and great support, it is very much appreciated.

Furthermore, we thank all authors of the 15 excellent articles that were accepted for ETAVI 2020. We regret that the event had to be cancelled due to COVID-19.

Edit. Some of the organizers from ETH are part of the PEGGASUS project. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



Final project event – Enhanced flight training program for monitoring aircraft automation

On the 25th of November, we presented the final results of our project entitled “Enhanced flight training program for monitoring aircraft automation”.

The project was partially funded by the Swiss Federal Office of Civil Aviation (BAZL), was led by SWISS International Airlines Ltd., and was supported by Prof. Dr. Robert Mauro from the Department of Psychology (University of Oregon) and Dr. Immanuel Barshi from the Human Systems Integration Division at NASA Ames Research Center.

As part of this very successful project, we developed a system that provides instructors with more detailed insights into pilots’ attention during training flights, specifically to improve instructors’ assessment of pilot situation awareness.

Our project has recently been featured in various media. ETH News has published a thorough article on our project, “Tracking the eye of the pilot”, which several other articles have referenced. Additionally, Prof. Dr. Martin Raubal gave two radio interviews, at Radio Zürisee and SRF4 (both in German).

Moreover, several publications appeared during the course of the project:

Rudi, D., Kiefer, P. & Raubal, M. (2019). The Instructor Assistant System (iASSYST) – Utilizing Eye Tracking for Aviation Training Purposes. Ergonomics. https://doi.org/10.1080/00140139.2019.1685132.

Rudi, D., Kiefer, P., Giannopoulos, I., & Raubal, M. (2019). Gaze-based interactions in the cockpit of the future – a survey. Journal on Multimodal User Interfaces. Springer. Retrieved from https://link.springer.com/article/10.1007/s12193-019-00309-8.

Rudi D., Kiefer P. & Raubal, M. (2018). Visualizing Pilot Eye Movements for Flight Instructors. In ETVIS ’18: 3rd Workshop on Eye Tracking and Visualization. ACM, New York, NY, USA, Article 7, 5 pages. http://doi.acm.org/10.1145/3205929.3205934. Best Paper Award Winner.



Full paper published – Ergonomics Journal

Our paper “The instructor assistant system (iASSYST) – utilizing eye tracking for commercial aviation training purposes” has been published in the Ergonomics Journal:

Rudi, D., Kiefer, P., & Raubal, M. (2019). The instructor assistant system (iASSYST) – utilizing eye tracking for commercial aviation training purposes. Ergonomics.

Abstract. This work investigates the potential of providing commercial aviation flight instructors with an eye tracking enhanced observation system to support the training process. During training, instructors must deal with many parallel tasks, such as operating the flight simulator, acting as air traffic controllers, observing the pilots and taking notes. This can cause instructors to miss relevant information that is crucial for debriefing the pilots. To support instructors, the instructor ASsistant SYSTem (iASSYST) was developed. It includes video, audio, simulator and eye tracking recordings. iASSYST was evaluated in a study involving 7 instructors. The results show that with iASSYST, instructors were able to support their observations of errors, find new errors, determine that some previously identified errors were not errors, and to reclassify the types of errors that they had originally identified. Instructors agreed that eye tracking can help identify causes of pilot error.



PhD graduation David Rudi

David Rudi has successfully defended his doctoral thesis on 16 September (“Enhancing Spatial Awareness of Pilots in Commercial Aviation”). We cordially congratulate him, and are happy that he’ll stay with us as a PostDoc starting in November!



Visit by the Vice President for Research and Corporate Relations

On 11 September, Prof. Dr. Detlef Günther, the Vice President for Research and Corporate Relations of ETH Zurich, visited the D-BAUG department and learned about the exciting research activities of its institutes.

Our institute was represented by Peter Kiefer, who summarized the research of the GeoGazeLab. The slides provide an overview of our research interests and current projects.

Edit. The presentation includes the PEGGASUS project. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



ETAVI – 2nd Call for Papers

A quick reminder about the “1st International Workshop on Eye-Tracking in Aviation (ETAVI)”, which will take place in March 2020 in Toulouse, France.

The submission deadlines are:

  • Abstracts: 9th September 2019
  • Paper: 30th September 2019

Feel free to also forward the Call for Papers to any interested colleagues.

We look forward to seeing you there!

Edit. Some of the organizers from ETH are part of the PEGGASUS project. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



Gaze-based interactions in the cockpit of the future: a survey

An article titled “Gaze-based interactions in the cockpit of the future: a survey” will appear in one of the next issues of the Journal on Multimodal User Interfaces. It is now available online:

Abstract. Flying an aircraft is a mentally demanding task where pilots must process a vast amount of visual, auditory and vestibular information. They have to control the aircraft by pulling, pushing and turning different knobs and levers, while knowing that mistakes in doing so can have fatal outcomes. Therefore, attempts to improve and optimize these interactions should not increase pilots’ mental workload. By utilizing pilots’ visual attention, gaze-based interactions provide an unobtrusive solution to this. This research is the first to actively involve pilots in the exploration of gaze-based interactions in the cockpit. By distributing a survey among 20 active commercial aviation pilots working for an internationally operating airline, the paper investigates pilots’ perception and needs concerning gaze-based interactions. The results build the foundation for future research, because they not only reflect pilots’ attitudes towards this novel technology, but also provide an overview of situations in which pilots need gaze-based interactions.



PEGGASUS in the news

Our research project PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions) has attracted quite a bit of coverage in the media.

See for yourself in this little press review:

Edit. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



1st International Workshop on Eye Tracking in Aviation: Call for Papers

We’re glad to announce the 1st International Workshop on Eye Tracking in Aviation (ETAVI), which will take place March 17, 2020 in Toulouse, France.

The workshop aims to bring together researchers and practitioners who share an interest in using eye tracking in the aviation domain, including, but not limited to, the cockpit and air traffic control / management.

The keynote will be given by Leonardo Di Stasi, an assistant professor at the University of Granada (Spain) with an extensive research background in aviation and eye tracking.

The Call for Papers is now available (Paper submission deadline: September 30, 2019; Abstract submission deadline: September 9, 2019).



Invited Talk at the Royal Aeronautical Society

On June 12th, together with our colleagues from Swiss International Air Lines Ltd., we had the honor of presenting our project on “Enhanced flight training program for monitoring aircraft automation” at the Royal Aeronautical Society, where we received a lot of positive feedback.

This year, the Spring Conference of the Royal Aeronautical Society Flight Simulation Group was on “The Future Reality of Flight Simulation” and featured a number of very interesting talks (Programme).

We thank our hosts for having us and plan to visit next year’s event as well.



New aviation project: PEGGASUS

PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions)

We’re glad to announce the start of a new aviation project at the GeoGazeLab.

Check out our vision for pilot interactions in the cockpit of the future at the project page.

Edit. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



Luis Lutnyk joins the team

We are happy to welcome Luis Lutnyk as a new PhD student in the GeoGazeLab! His research will be about eye tracking in aviation.

[Current Team]

Edit. Luis is part of the PEGGASUS project. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461



News Articles on “Awareness in Aviation” Project

Two news articles have been published about the “Awareness in Aviation” project (both in German).

One article, written by Dominik Haug for the Aeropers Rundschau, can be found here: Eye-Tracking – das Auge im Blick.

The other, written by Benjamin Weinmann for the Aargauer Zeitung, can be found here.



Best Paper at ETVIS 2018

We are happy to announce that our paper received the Best Paper Award at ETVIS!

David Rudi, Peter Kiefer, and Martin Raubal (2018). Visualizing Pilot Eye Movements for Flight Instructors. In ETVIS ’18: 3rd Workshop on Eye Tracking and Visualization.

The paper is part of the Awareness in Aviation project.