• eyetracking@ethz.ch
  • +41 44 633 71 59

News

  • -

Eyes4ICU: 2 open positions in MSCA doctoral network

Exciting news! The geoGAZElab will be participating in the MSCA Doctoral Network “Eyes for Interaction, Communication, and Understanding (Eyes4ICU)” as an Associated Partner, funded by the Swiss State Secretariat for Education, Research and Innovation.

Eyes4ICU explores novel forms of gaze-based interaction that rely on current psychological theories and findings, computational modelling, as well as expertise in highly promising application domains. Its approach to developing inclusive technology by tracing gaze interaction back to its cognitive and affective foundations results in better models to predict user behaviour. By integrating insights in application fields, gaze-based interaction can be employed in the wild.

In the scope of Eyes4ICU, 12 doctoral candidates (DCs) will be working at 7 different host institutions across Europe. Two of these DCs will be hosted at the geoGAZElab of ETH Zurich (PI: Peter Kiefer). They will be working on the topics of Gaze-supported Trip Recommendation (DC6) and Gaze-supported Travel Experience Logging (DC12), respectively.

We are looking for two highly motivated doctoral candidates, starting at the earliest possible date: Position announcement.


  • -

PhD graduation Tiffany C.K. Kwok

We congratulate Tiffany C.K. Kwok on successfully completing her doctoral thesis, “Designing Unobtrusive Gaze-Based Interactions: Applications to Audio-Guided Panorama Viewing”. The doctoral graduation was approved by the Department conference at its last meeting. The research was performed in the scope of the LAMETTA project.

Tiffany is staying with us as a PostDoc, continuing her research in the geoGAZElab. It’s great to have you on the team, Tiffany!


  • -

Winter School Updates

We’re looking forward to our Winter School, taking place in January 2023.

Exciting updates to the program are now included in the updated announcement:

We’re glad that Prof. Dr. Enkelejda Kasneci (Technical University of Munich) will be opening the Winter School with a keynote titled “On opportunities and challenges of eye tracking and machine learning for adaptive educational interfaces and classroom research”.

We’d like to thank the following sponsors, whose generous support will enable us to fund travel grants for several young researchers:

Application for travel grants is open until 15 October 2022.

The Winter School is a great opportunity to get trained in eye tracking methodology, experimental design, and analysis. At the same time, it will facilitate networking with speakers and sponsor representatives, as well as among participants.


  • -

Tianyi Xiao joins the team

Our team is growing further: we’re so happy to have Tianyi Xiao on board, who is joining the 3D Sketch Maps project as a doctoral student. Welcome!


  • -

Winter School 2023

We are co-organizing an ETH Winter School on “Eye Tracking – Experimental Design, Implementation, and Analysis” which is going to take place in Monte Verità (Ticino), Switzerland, from 8 to 13 January 2023. Download the first announcement as PDF.

The Winter School targets PhD students and early PostDocs (from any research field) who are using, or planning to use, eye tracking in their research. Internationally recognized experts will provide lectures and hands-on sessions on eye tracking methodology, experimental design, and analysis.

The registration for the Winter School is now open.

Building on the successful 2016 Winter School, the 2023 School will be updated to focus on state-of-the-art software (licensed and open-source, e.g., PsychoPy and Pupil Labs) and hardware. Hands-on exercises will focus on table-mounted eye trackers.


  • -

Participation in COSIT 2022

We are excited to be part of the 15th International Conference on Spatial Information Theory (COSIT 2022) that is taking place on September 5-9, 2022, in Kobe, Japan.

Two of our lab members, Kevin Kim and Adrian Sarbach, will attend the conference (in person!) and present our latest work. We are looking forward to meeting other researchers and discussing exciting research!

More information: http://cosit2022.iniad.org


  • -

FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight

Our article “FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight” has been accepted for publication by the International Journal of Human–Computer Interaction (IJHCI):

Luis Lutnyk, David Rudi, Emanuel Meier, Peter Kiefer, Martin Raubal (2022). FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight, International Journal of Human–Computer Interaction, DOI: 10.1080/10447318.2022.2075627.

Abstract. Contemporary aircraft cockpits rely mostly on audiovisual information propagation which can overwhelm particularly novice pilots. The introduction of tactile feedback, as a less taxed modality, can improve the usability in this case. As part of a within-subject simulator study, 22 participants are asked to fly a visual-flight-rule scenario along a predefined route and identify objects in the outside world that serve as waypoints. Participants fly two similar scenarios with and without a tactile belt that indicates the route. Results show that with the belt, participants perform better in identifying objects, have higher usability and user experience ratings, and a lower perceived cognitive workload, while showing no improvement in spatial awareness. Moreover, 86% of the participants state that they prefer flying with the tactile belt. These results suggest that a tactile belt provides pilots with an unobtrusive mode of assistance for tasks that require orientation using cues from the outside world.

The article will appear in an upcoming issue of the International Journal of Human–Computer Interaction. It has been published Open Access, and you can get the PDF here:
https://www.tandfonline.com/doi/full/10.1080/10447318.2022.2075627


  • -

Co-authored paper on PEGGASUS at SPIE Photonics West

A joint paper with our partners at CSEM on the eye tracking hardware and software developed in PEGGASUS has been published in the SPIE Digital Library. We encourage you to take a deep dive into the technological achievements of PEGGASUS:

“The pipeline, which is a combination of data-driven and analytics approaches, runs in real time at 60 fps with a latency of about 32ms. The eye gaze estimation error was evaluated in terms of the point of regard distance error with respect to the 3D point location. An average error of less than 1.1cm was achieved over 28 gaze points representing the cockpit instruments placed at about 80-110cm from the participants’ eyes.”

Engin Türetkin, Sareh Saeedi, Siavash Bigdeli, Patrick Stadelmann, Nicolas Cantale, Luis Lutnyk, Martin Raubal, Andrea L. Dunbar (2022). Real time eye gaze tracking for human machine interaction in the cockpit. In AI and Optical Data Sciences III (Vol. 12019, pp. 24-33). SPIE.

The paper was presented at SPIE’s Photonics West conference at San Francisco’s Moscone Center.

Find the full text and presentation video here: https://doi.org/10.1117/12.2607434

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

Control Room Decision-Making Lab at Singapore-ETH Centre

We are happy to announce the opening of our new Control Room Decision-Making Lab at the Singapore-ETH Centre (SEC) for our project on Communicating Predicted Disruptions in Future Resilient Systems (FRS 2). The lab is equipped with sensors for measuring decision makers’ psycho-physiological state, including remote and head-mounted eye trackers, galvanic skin response sensors, and accelerometers. The new lab infrastructure will be used to study how different communication techniques can support decision-makers in control rooms. This research will improve the next generation of control rooms, thus enhancing the resilience of the monitored infrastructure in case of disruptions.

We want to thank all our collaborators and the SEC management for their help in setting up the laboratory.

Control Room Decision-Making Lab


  • -

Successful finish of PEGGASUS and summary

We are happy to report that our aviation project PEGGASUS (Pilot Eye Gaze and Gesture tracking for Avionics Systems using Unobtrusive Solutions) has finished successfully.

We would like to thank all partners involved for their extensive efforts to complete the project successfully and deliver its results despite all the Covid-related restrictions and hurdles.

You can find a summary of the project outcomes at the EU Cordis Portal:

“The PEGGASUS consortium has developed a multi-camera vision system for tracking eye-gaze and gestures of the pilots in real time in the cockpit environment. This system allows a leap towards a more comprehensive human-machine interface in the cockpit to reduce the stress and cognitive load of the pilots, while bringing forward future pilot training techniques. Better awareness of the instruments will help the flight management to optimize trajectories and better manage fuel use, in line with the overall objectives of the Clean Sky JU.”

Two images showing the results of the algorithms including pupil detection and eye gaze estimation

Ultimate Prototype Hardware setup installed in the cockpit simulator
Excerpt and images taken from: https://cordis.europa.eu/project/id/821461/reporting

This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

Kevin Gonyop Kim joins the team

We’re excited to welcome Kevin Gonyop Kim, who has now started as a postdoctoral researcher in the 3D sketch maps project. Welcome to the team, Kevin!


  • -

Joint Talk at the Royal Aeronautical Society Flight Simulation Conference

We had the honor of giving a presentation at the Royal Aeronautical Society’s Flight Simulation Conference, which took place from October 26th to 27th in London.

In the joint talk with Gilad Scherpf of Lufthansa Group, we presented results from the PEGGASUS project and showcased how eye and gesture tracking can support the assessment of Evidence-Based Training (EBT) competencies.

We want to thank the hosts at the RAeS for the invitation and the attendees for the very positive feedback and interesting questions during the panel discussion. A special thank you also goes out to our partners at SWISS and CSEM.

The full programme of the conference can be found here.

The talk was given as part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

PhD graduation Fabian Göbel

Fabian Göbel has successfully completed his doctoral thesis on “Visual Attentive User Interfaces for Feature-Rich Environments”. The doctoral graduation was approved by the Department conference at its last meeting. Congratulations, Fabian!

After his thesis defense, Fabian started a research internship at Microsoft on the topic of interaction with HoloLens 2. We wish him all the best and thank him for all the contributions he has made to our research!


  • -

Workshop Paper presented at the INTERACT 2021

Our paper “Improving resilience by communicating predicted disruptions in control rooms” has been presented at the INTERACT 2021 Workshop on Control Rooms in Safety Critical Contexts (CRiSCC): Design, Engineering and Evaluation Issues. The full-day workshop was held in hybrid form in Bari, Italy, with 13 interdisciplinary researchers. The vision paper outlines some of the ideas and challenges we are addressing in the FRS 2 project on “Optimizing strategies for communicating predicted disruptions to stakeholders”:

Chakraborty, S., Kiefer, P., & Raubal, M. (2021). Improving resilience by communicating predicted disruptions in control rooms, INTERACT 2021.

Abstract: Even though the importance of resilience for control rooms is generally acknowledged, cognitive resilience is often not taken into account properly during control room design. This vision paper aims at improving cognitive resilience in control rooms through advancements in three key research areas: 1) automated detection of upcoming disruptions, 2) visualization of spatio-temporal uncertainty, and 3) cognition-aware interaction design.


  • -

New project and open position: 3D Sketch Maps

We’re very much looking forward to the start of the 3D Sketch Maps project, for which we have now announced an open PhD position: Interaction with 3D Sketch Maps in Extended Reality.

The 3D Sketch Maps project, funded by the Swiss National Science Foundation in the scope of the Sinergia funding program, investigates 3D sketch maps from a theoretical, empirical, cognitive, as well as tool-related perspective, with a particular focus on Extended Reality (XR) technologies. Sketch mapping is an established research method in fields that study human spatial decision-making and information processing, such as navigation and wayfinding. Although space is naturally three-dimensional (3D), contemporary research has focused on assessing individuals’ spatial knowledge with two-dimensional (2D) sketches. For many domains though, such as aviation or the cognition of complex multilevel buildings, it is essential to study people’s 3D understanding of space, which is not possible with the current 2D methods. Eye tracking will be used for the analysis of people’s eye movements while using the sketch mapping tools.

The 4-year project will be carried out jointly by the Chair of Geoinformation Engineering, the Chair of Cognitive Science at ETH Zurich (Prof. Dr. Christoph Hölscher), and the Spatial Intelligence Lab at the University of Münster (Prof. Dr. Angela Schwering).

Interested? Please check out the open PhD position on the ETH job board!


  • -

Adrian Sarbach joins the team

We’re glad that Adrian Sarbach has joined the geoGAZElab as a doctoral student on the project “The Expanded Flight Deck – Improving the Weather Situation Awareness of Pilots (EFDISA)“.

Adrian studied electrical engineering at EPFL (Bachelor) and ETH Zurich (Master). He wrote his MSc thesis in collaboration with Swiss International Air Lines on the topic of tail assignment optimization.


  • -

Full Paper presentation at ETRA 2021

Our accepted paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” will be presented at ACM ETRA 2021:

May 25, 2021, 11:00–12:00 and 18:00–19:00 in “Posters & Demos & Videos”
May 26, 2021, 14:45–16:15 in Track 1: “Full Papers V”

Join the virtual conference for a chat!
https://etra.acm.org/2021/schedule.html


  • -

Kuno Kurzhals: Junior Research Group Lead

Kuno Kurzhals has left us for his new position as Junior Research Group Lead in the Cluster of Excellence Integrative Computational Design and Construction for Architecture (IntCDC) at the University of Stuttgart, Germany.

We wish him all the best and thank him for the contributions he has made to the geoGAZElab during his PostDoc time!


  • -

New project and open position: EFDISA

We are excited to have received funding from the Swiss Federal Office of Civil Aviation (BAZL) for a new project, starting in July 2021: “The Expanded Flight Deck – Improving the Weather Situation Awareness of Pilots (EFDISA)“. The project aims at improving contemporary pre-flight and in-flight representations of weather data for pilots. This will allow pilots to better perceive, understand, and anticipate meteorological hazards. The project will be carried out in close collaboration with industry partners and professional pilots (Swiss International Air Lines & Lufthansa Systems).

We are looking for a highly motivated doctoral student for this project. Applications are now open.


  • -

Results of the Interdisciplinary Project 2020

As an interdisciplinary project, the three Geomatics Master students Laura Schalbetter, Tianyu Wu and Xavier Brunner have developed an indoor navigation system for Microsoft HoloLens 2. The system was implemented using ESRI CityEngine, Unity, and Microsoft Visual Studio.

Check out their video:


  • -

Workshop Paper published from ICCAS 2020

Our paper “Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning” has been published in the proceedings of the 1st International Conference on Cognitive Aircraft Systems:

Lutnyk, L., Rudi, D., Kiefer, P., & Raubal, M. (2020). Recognizing Pilot State: Enabling Tailored In-Flight Assistance Through Machine Learning. ICCAS 2020.

Abstract. Moving towards the highly controversial single pilot cockpit, more and more automation capabilities are added to today’s airliners. However, to operate safely without a pilot monitoring, avionics systems in future cockpits will have to be able to intelligently assist the remaining pilot. One critical enabler for proper assistance is a reliable classification of the pilot’s state, both in normal conditions and more critically in abnormal situations like an equipment failure. Only with a good assessment of the pilot’s state can the cockpit adapt to the pilot’s current needs, i.e. alert, adapt displays, take over tasks, monitor procedures, etc.

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461.


  • -

Suvodip Chakraborty starting in January

After a Covid-related delay in the hiring process, we’re excited to announce that Suvodip Chakraborty will start as a PhD student in our Singapore-based project on Communicating Predicted Disruptions in the scope of the Future Resilient Systems 2 research program. Suvodip will start in January 2021.

Suvodip holds a Master of Science from the Indian Institute of Technology Kharagpur. His Master thesis was titled “Design of Electro-oculography based wearable systems for eye movement analysis”.


  • -

ETRA 2021 Call for Demos&Videos

As in the previous year, we are involved in the organization of ETRA 2021, the ACM Symposium on Eye Tracking Research & Applications. Peter is again co-chairing the Demo&Video Track, for which the Call is now available online. The submission deadline is 2 February 2021.


  • -

Workshop talk: Flight safety – Line of sight

Pilots not only have to make the right decisions, but they have to do it quickly and process a lot of information – especially visual information. In a unique project, ETH Zurich and Swiss International Air Lines have investigated what the eyes of pilots do in this process.

Martin Raubal, Professor of Geoinformation Engineering at ETH Zurich, appreciates the practical relevance of this research collaboration, which could contribute to increasing flight safety. Anyone who wants to develop it further should take off their blinders and think outside the box, says Christoph Ammann, captain and instructor at Swiss. And ETH Zurich is an ideal partner for this.

Watch the video on Vimeo!


  • -

Book Chapter on Outdoor HCI accepted

Kiefer, P., Adams, B., Kwok, T., Raubal, M. (2020) Modeling Gaze-Guided Narratives for Outdoor Tourism. In: McCrickard, S., Jones, M., and Stelter, T. (eds.): HCI Outdoors: Theory, Design, Methods and Applications. Springer International Publishing (in print)


  • -

ET4S and ETVIS proceedings

We’ve been involved in the organization of two co-located events at this year’s ETRA conference: Eye Tracking for Spatial Research (ET4S) and Eye Tracking and Visualization (ETVIS). Even though ETRA and all co-located events had to be canceled, the review process was finished regularly, and accepted papers are now available in the ETRA Adjunct Proceedings in the ACM Digital Library.

Accepted ET4S papers are also linked from the ET4S website.


  • -

New article on iAssyst

The instructor assistant system (iAssyst) that we developed as part of our research collaboration with Swiss International Air Lines is featured in an article by innoFRAtor, the innovation portal of Fraport AG.

You may read more about the system in our related research article: Rudi D., Kiefer P., and Raubal M. (2020). The Instructor Assistant System (iASSYST) – Utilizing Eye Tracking for Commercial Aviation Training Purposes. Ergonomics, vol. 63, no. 1, pp. 61-79, London: Taylor & Francis, 2020. DOI: https://doi.org/10.1080/00140139.2019.1685132

Our project on Enhanced flight training program for monitoring aircraft automation, with Swiss International Air Lines, NASA, and the University of Oregon, was officially concluded at the end of last year.