• eyetracking@ethz.ch
  • +41 44 633 71 59

Category Archives: Publication

  • -

Article in Scientific Reports: Mind Wandering

Do you enjoy touristic audio guides? You don’t? One reason could be that classic audio guides do not adapt to your cognitive state in real time.

In our most recent article in Scientific Reports, we investigate the detection of mind wandering from eye movements for combined audio-visual stimuli, such as touristic audio guides. We trained and validated classifiers that can detect mind wandering within a 1-s time window. Future audio guides could build on our results to provide more intelligent assistance.

Kwok, T. C. K., Kiefer, P., Schinazi, V. R., Hoelscher, C., & Raubal, M. (2024). Gaze-based detection of mind wandering during audio-guided panorama viewing. Scientific Reports, 14(1), 27955. https://doi.org/10.1038/s41598-024-79172-x

We wish you a fully focused (= non-mind wandering) reading experience!

This research is part of Tiffany Kwok’s (see former group members) doctoral thesis and an outcome of the LAMETTA project, which has been funded through an ETH Zurich Research Grant.


  • -

New Publication at COSIT 2024: Exploring Perceived Feasibility and Use Cases of 3D Sketch Mapping

Can you sketch in 3D?

Within the 3D Sketch Maps project, our former colleague, Prof. Kevin Gonyop Kim, and the geoGAZElab team investigated the perceived feasibility of 3D sketching in VR, with a particular focus on sketch mapping. We interviewed 27 people from 3 domains and discovered that 3D sketching can help represent 3D information more effectively and complement the 2D approach. The paper was presented by Prof. Martin Raubal at the COSIT 2024 conference in September in Quebec City, Canada, where it attracted considerable attention and questions. The full paper can be found here: https://doi.org/10.4230/LIPIcs.COSIT.2024.3.


  • -

Estimating Perceived Mental Workload From Eye-Tracking Data Based on Benign Anisocoria published in IEEE Transactions on Human-Machine Systems

We have developed two novel eye-tracking metrics to estimate perceived cognitive load based on benign anisocoria (the difference between left and right pupil diameter). Our metrics outperform traditional pupil-based methods, providing a more accurate and reliable way to measure mental workload. Check out the early-access version in IEEE Transactions on Human-Machine Systems.
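For intuition only, the raw signal these metrics build on can be sketched as below; the sample values and sampling setup are hypothetical, and the article’s actual metrics go well beyond a plain left-right difference:

```python
import statistics

def anisocoria_signal(left_mm, right_mm):
    """Per-sample difference between left and right pupil diameter (mm).

    Illustrates only the underlying signal (benign anisocoria); the
    article's actual workload metrics are more sophisticated than this.
    """
    return [left - right for left, right in zip(left_mm, right_mm)]

# Hypothetical pupil diameters (mm) recorded from both eyes
left = [3.10, 3.12, 3.15, 3.20]
right = [3.00, 3.01, 3.02, 3.05]
diff = anisocoria_signal(left, right)
print(round(statistics.mean(diff), 4))  # average left-right difference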


  • -

ETRA 2025: Call for Papers

With Peter Kiefer serving as one of the Full Paper Chairs, we’re again involved in the organization of the next ETRA conference, taking place in Tokyo, Japan, 26-29 May 2025.

Please check out the Call for Papers!


  • -

New publication in the International Journal of Human-Computer Studies

The 3D Sketch Maps project team recently published a paper entitled: VResin: Externalizing spatial memory into 3D sketch map. In this paper, we refined the concept of 3D sketch maps, proposed a technological framework using 3D Cartesian coordinate axes, and provided design criteria for 3D sketch mapping interfaces. We implemented a VR-based 3D sketch mapping tool called VResin, inspired by traditional resin painting, with a layer-by-layer sketching interface that considers both researcher and user needs. We then conducted a comparative user study with 48 participants, comparing mid-air sketching and layer-by-layer sketching interfaces for memorising multi-layered buildings. We found that VResin helps users create less distorted sketches while maintaining a level of completeness and generalisation comparable to mid-air sketching in VR. Finally, we presented application scenarios demonstrating how 3D sketch maps can support people’s externalisation of their 3D spatial understanding. This study is part of a Sinergia project called “3D Sketch Maps”, funded by the Swiss National Science Foundation (SNSF) [grant number 202284].

The interface design of VResin


  • -

ETRA 2024

With Peter acting as one of the Full Paper Chairs, we have once more contributed to a successful ACM Symposium on Eye Tracking Research & Applications (ETRA 2024), which took place in Glasgow (U.K.) June 4-7. A total of 25 full papers were accepted, selected from 68 submissions after a rigorous reviewing process (37% acceptance rate).

The proceedings have been published as issues in PACM HCI and PACM CGIT.

Looking forward to ETRA 2025!


  • -

New publication in the International Journal of Geographical Information Science

In our recently published study in Taylor & Francis Online, we investigated the influence of uncertainty visualization on cognitive load in safety-critical and time-critical decision-making tasks. This research tackles a persistent challenge faced by professionals in many fields: making critical choices under pressure, often with inherent spatial uncertainties. We focused on identifying the most effective visualization techniques to support these decision-makers, and specifically examined different uncertainty visualization techniques in traffic management tasks through eye tracking. Furthermore, using gaze transition entropy, we explored differences between the strategies employed by the decision-makers. This research opens avenues for further work on visualization design for decision support systems.
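For readers unfamiliar with the measure: gaze transition entropy is, roughly, the conditional entropy of transitions between areas of interest (AOIs) in a scanpath. A minimal sketch of the standard formulation follows; the AOI labels and scanpaths are invented, and the study’s actual analysis pipeline is not shown here:

```python
import math
from collections import Counter

def transition_entropy(aoi_sequence):
    """Gaze transition entropy (bits) of an AOI fixation sequence.

    Standard conditional-entropy form: H = -sum_i p(i) sum_j p(j|i) log2 p(j|i),
    with p(i) approximated here by the share of transitions leaving AOI i.
    Requires at least two fixations.
    """
    transitions = Counter(zip(aoi_sequence, aoi_sequence[1:]))
    source_counts = Counter(aoi_sequence[:-1])
    total = len(aoi_sequence) - 1
    h = 0.0
    for (src, _dst), n in transitions.items():
        p_i = source_counts[src] / total   # probability of being in AOI src
        p_ij = n / source_counts[src]      # conditional transition probability
        h -= p_i * p_ij * math.log2(p_ij)
    return h

print(transition_entropy(list("ABABABAB")))  # 0.0: a perfectly regular scanpath
print(transition_entropy(list("AABBA")))     # 1.0: transitions are less predictable
```

Lower values indicate more predictable, routine-like scanning; higher values suggest more erratic switching between AOIs, which is why the measure is used to compare decision-making strategies.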


  • -

Eyes4ICU at LBS 2023

In the scope of the MSCA Doctoral Network Eyes4ICU, our doctoral students Lin Che and Yiwei Wang are investigating novel ways of using eye tracking for the improvement of location-based services. They presented and discussed their research at the 18th Conference on Location Based Services in Ghent, Belgium, last week.

Congrats to Lin for receiving the best short paper award!

Work-in-progress papers (DOI assignment pending):

  • Che, L., Raubal, M., and Kiefer, P. (2023) Towards Personalized Pedestrian Route Recommendation Based on Implicit Visual Preference. In: Huang, H., Van de Weghe, N., and Gartner, G. (editors), Proceedings of the 18th International Conference on Location Based Services, Ghent, Belgium (to appear) [PDF]
  • Wang, Y., Raubal, M., and Kiefer, P. (2023) Towards gaze-supported emotion-enhanced travel experience logging. In: Huang, H., Van de Weghe, N., and Gartner, G. (editors), Proceedings of the 18th International Conference on Location Based Services, Ghent, Belgium (to appear) [PDF]



  • -

“Do You Need Instructions Again? Predicting Wayfinding Instruction Demand”

In collaboration with colleagues from TU Vienna, we have published a full paper in the proceedings of this year’s GIScience conference, taking place next week in Leeds, U.K.:

The demand for instructions during wayfinding is an important indicator of the internal cognitive processes taking place during wayfinding. In the paper, we predict instruction demand in a real-world wayfinding experiment with 45 participants, using environmental, user, instructional, and gaze-related features. Being able to predict instruction demand can, for instance, benefit navigation systems that adapt instructions in real time based on their users’ behavior.

Alinaghi, N., Kwok, T. C., Kiefer, P., & Giannopoulos, I. (2023). Do You Need Instructions Again? Predicting Wayfinding Instruction Demand. In 12th International Conference on Geographic Information Science (GIScience 2023). Schloss Dagstuhl-Leibniz-Zentrum für Informatik.

Download PDF


  • -

ETRA 2024: Call for Papers

Peter Kiefer will be serving as one of the Full Paper Chairs for the ETRA 2024 conference, taking place in Glasgow, U.K., 4-7 June 2024.

Please check out the Call for Papers!

The geoGAZElab has been part of the ETRA community for many years. We’re happy to continue contributing to this vibrant community, pushing forward excellent research in this exciting field.


  • -

New abstract at Resilience 2023 – Using eye tracking for enhancing resilience in control rooms

We are pleased to announce that our abstract titled “Using eye tracking for enhancing resilience in control rooms” has been accepted for presentation at the Resilience 2023 conference.

In this abstract, we highlight three principal ways in which eye tracking can support decision makers in control rooms: 1) evaluation of information visualization for decision support systems, 2) unobtrusive assessment of decision-makers’ cognitive state, and 3) support of distributed cognition through information sharing using eye gaze.

In our talk, we’ll be presenting our current research progress in each of these directions. We look forward to seeing you at the Resilience 2023 conference in Mexico.

This project has received funding from the Future Resilience Systems program at the Singapore-ETH Center.


  • -

New article in Applied Ergonomics – The effect of flight phase on electrodermal activity and gaze behavior: A simulator study

Our article “The effect of flight phase on electrodermal activity and gaze behavior: A simulator study” has been accepted for publication in the journal Applied Ergonomics:

Luis Lutnyk, David Rudi, Victor R. Schinazi, Peter Kiefer, Martin Raubal (2022). The effect of flight phase on electrodermal activity and gaze behavior: A simulator study, Applied Ergonomics, Volume 109, DOI: 10.1016/j.apergo.2023.103989.

Highlights:

  • Unobtrusive technologies were used to record electrodermal activity and gaze behavior in an instrument failure scenario.
  • Participants’ electrodermal activity increased significantly during high workload phases of the failure scenario.
  • AOI-based & non-AOI eye tracking metrics show significant differences when a secondary task needs to be solved during flight.
  • The observed measures show great potential for future cockpits that can provide assistance based on the sensed pilot state.

Abstract. Current advances in airplane cockpit design and layout are often driven by a need to improve the pilot’s awareness of the aircraft’s state. This involves an improvement in the flow of information from aircraft to pilot. However, providing the aircraft with information on the pilot’s state remains an open challenge. This work takes a first step towards determining the pilot’s state based on biosensor data. We conducted a simulator study to record participants’ electrodermal activity and gaze behavior, indicating pilot state changes during three distinct flight phases in an instrument failure scenario. The results show a significant difference in these psychophysiological measures between a phase of regular flight, the incident phase, and a phase with an additional troubleshooting task after the failure. The differences in the observed measures suggest great potential for a pilot-aware cockpit that can provide assistance based on the sensed pilot state.

The article has been published as Open Access and you can get the PDF here:
https://www.sciencedirect.com/science/article/pii/S0003687023000273

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

New Article Published in Human-Computer Interaction

Our article “Unobtrusive interaction: a systematic literature review and expert survey” has been accepted for publication by the journal Human–Computer Interaction:

Tiffany C.K. Kwok, Peter Kiefer & Martin Raubal (2023). Unobtrusive interaction: a systematic literature review and expert survey, Human–Computer Interaction, DOI: 10.1080/07370024.2022.2162404

Abstract. Unobtrusiveness has been highlighted as an important design principle in Human-Computer Interaction (HCI). However, the understanding of unobtrusiveness in the literature varies. Researchers often claim unobtrusiveness for their interaction method based on their understanding of what unobtrusiveness means. This lack of a shared definition hinders effective communication in research and impedes comparability between approaches. In this article, we approach the question “What is unobtrusive interaction?” with a systematic and extensive literature review of 335 papers and an online survey with experts. We found that not a single definition of unobtrusiveness is universally agreed upon. Instead, we identify five working definitions from the literature and experts’ responses. We summarize the properties of unobtrusive interaction into a design framework with five dimensions and classify the reviewed papers with regard to these dimensions. The article aims to provide researchers with a more unified context to compare their work and identify opportunities for future research.

The article will appear in one of the next issues of Human–Computer Interaction. It has been published as Open Access and you can get the article here:
https://doi.org/10.1080/07370024.2022.2162404


  • -

Full Paper published at ICMI 2022

Our paper “Two-Step Gaze Guidance” has been published in the proceedings of the International Conference on Multimodal Interaction (ICMI ’22) as a full paper.

Tiffany C.K. Kwok, Peter Kiefer, Martin Raubal (2022). Two-Step Gaze Guidance, International Conference on Multimodal Interaction (ICMI ’22), DOI: 10.1145/3536221.3556612

Abstract. One challenge of providing guidance for search tasks consists in guiding the user’s visual attention to certain objects in a potentially large search space. Previous work has tried to guide the user’s attention by providing visual, audio, or haptic cues. The state-of-the-art methods either provide hints pointing towards the approximate direction of the target location for a fast but less accurate search or require the user to perform a fine-grained search from the beginning for a precise yet less efficient search. To combine the advantage of both methods, we propose an interaction concept called Two-Step Gaze Guidance. The first-step guidance focuses on quick guidance toward the approximate direction, and the second-step guidance focuses on fine-grained guidance toward the exact location of the target. A between-subject study (N = 69) with five conditions was carried out to compare the two-step gaze guidance method with the single-step gaze guidance method. Results revealed that the proposed method outperformed the single-step gaze guidance method. More precisely, the introduction of Two-Step Gaze Guidance slightly improves the searching accuracy, and the use of spatial audio as the first-step guidance significantly helps in enhancing the searching efficiency. Our results also indicated several design suggestions for designing gaze guidance methods.


  • -

FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight

Our article “FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight” has been accepted for publication by the International Journal of Human–Computer Interaction (IJHCI):

Luis Lutnyk, David Rudi, Emanuel Meier, Peter Kiefer, Martin Raubal (2022). FlyBrate: Evaluating Vibrotactile Cues for Simulated Flight , International Journal of Human–Computer Interaction, DOI: 10.1080/10447318.2022.2075627.

Abstract. Contemporary aircraft cockpits rely mostly on audiovisual information propagation which can overwhelm particularly novice pilots. The introduction of tactile feedback, as a less taxed modality, can improve the usability in this case. As part of a within-subject simulator study, 22 participants are asked to fly a visual-flight-rule scenario along a predefined route and identify objects in the outside world that serve as waypoints. Participants fly two similar scenarios with and without a tactile belt that indicates the route. Results show that with the belt, participants perform better in identifying objects, have higher usability and user experience ratings, and a lower perceived cognitive workload, while showing no improvement in spatial awareness. Moreover, 86% of the participants state that they prefer flying with the tactile belt. These results suggest that a tactile belt provides pilots with an unobtrusive mode of assistance for tasks that require orientation using cues from the outside world.

The article will appear in one of the next issues of the International Journal of Human–Computer Interaction.

It has been published as Open Access and you can get the PDF here:
https://www.tandfonline.com/doi/full/10.1080/10447318.2022.2075627


  • -

Co-authored paper on PEGGASUS at SPIE Photonics West

A joint paper with our partners at CSEM on the eye-tracking hardware and software developed in PEGGASUS has been published in the SPIE Digital Library. We encourage you to take a deep dive into the technological achievements of PEGGASUS:

“The pipeline, which is a combination of data-driven and analytics approaches, runs in real time at 60 fps with a latency of about 32ms. The eye gaze estimation error was evaluated in terms of the point of regard distance error with respect to the 3D point location. An average error of less than 1.1cm was achieved over 28 gaze points representing the cockpit instruments placed at about 80-110cm from the participants’ eyes.”

Engin Türetkin, Sareh Saeedi, Siavash Bigdeli, Patrick Stadelmann, Nicolas Cantale, Luis Lutnyk, Martin Raubal, Andrea L. Dunbar (2022). Real time eye gaze tracking for human machine interaction in the cockpit. In AI and Optical Data Sciences III (Vol. 12019, pp. 24-33). SPIE.

The paper was presented at SPIE’s Photonics West conference at San Francisco’s Moscone Center.

Find the full text and presentation video here: https://doi.org/10.1117/12.2607434

The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

Full Paper presentation at ETRA 2021

Our accepted paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” will be presented at ACM ETRA 2021:

May 25, 2021, 11:00 – 12:00 and 18:00 – 19:00, in “Posters & Demos & Videos”
May 26, 2021, 14:45 – 16:15, in Track 1: “Full Papers V”

Join the virtual conference for a chat!
https://etra.acm.org/2021/schedule.html


  • -

Book Chapter on Outdoor HCI accepted

Kiefer, P., Adams, B., Kwok, T., Raubal, M. (2020) Modeling Gaze-Guided Narratives for Outdoor Tourism. In: McCrickard, S., Jones, M., and Stelter, T. (eds.): HCI Outdoors: Theory, Design, Methods and Applications. Springer International Publishing (in print)


  • -

New article on iAssyst

The instructor assistant system (iAssyst), which we developed as part of our research collaboration with Swiss International Air Lines, is being featured in an article by innoFRAtor, the innovation portal of Fraport AG.

You may read more about the system in our related research article: Rudi D., Kiefer P., and Raubal M. (2020). The Instructor Assistant System (iASSYST) – Utilizing Eye Tracking for Commercial Aviation Training Purposes. Ergonomics, vol. 63, no. 1, pp. 61-79, London: Taylor & Francis, 2020. DOI: https://doi.org/10.1080/00140139.2019.1685132

Our project on Enhanced flight training program for monitoring aircraft automation with Swiss International Air Lines, NASA, and the University of Oregon was officially concluded at the end of last year.


  • -

Workshop Paper published from ETAVI 2020

Our paper Towards Pilot-Aware Cockpits has been published in the proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI 2020):

Lutnyk L., Rudi D., and Raubal M. (2020). Towards Pilot-Aware Cockpits. In Proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI 2020), ETH Zurich. DOI: https://doi.org/10.3929/ethz-b-000407661

Abstract. Eye tracking has a longstanding history in aviation research. Amongst others it has been employed to bring pilots back “in the loop”, i.e., create a better awareness of the flight situation. Interestingly, there exists only little research in this context that evaluates the application of machine learning algorithms to model pilots’ understanding of the aircraft’s state and their situation awareness. Machine learning models could be trained to differentiate between normal and abnormal patterns with regard to pilots’ eye movements, control inputs, and data from other psychophysiological sensors, such as heart rate or blood pressure. Moreover, when the system recognizes an abnormal pattern, it could provide situation specific assistance to bring pilots back in the loop. This paper discusses when pilots benefit from such a pilot-aware system, and explores the technical and user oriented requirements for implementing this system.

Edit. The publication is part of PEGGASUS. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

Full Paper accepted at ETRA 2020

Our paper “Gaze-Adaptive Lenses for Feature-Rich Information Spaces” has been accepted at ACM ETRA 2020 as a full paper:

Göbel, F., Kurzhals K., Schinazi V. R., Kiefer, P., and Raubal, M. (2020). Gaze-Adaptive Lenses for Feature-Rich Information Spaces. In Proceedings of the 12th ACM Symposium on Eye Tracking Research & Applications (ETRA ’20), ACM. DOI: https://doi.org/10.1145/3379155.3391323


  • -

Workshop Paper accepted at CHI 2020

Our workshop contribution “Gaze-Aware Mixed-Reality: Addressing Privacy Issues with Eye Tracking” has been accepted at the “Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI” at ACM CHI 2020:

Fabian Göbel, Kuno Kurzhals, Martin Raubal and Victor R. Schinazi (2020). Gaze-Aware Mixed-Reality: Addressing Privacy Issues with Eye Tracking. In CHI 2020 Workshop on Exploring Potentially Abusive Ethical, Social and Political Implications of Mixed Reality in HCI (CHI 2020), ACM.


  • -

ETAVI 2020 Proceedings Online

The proceedings of the 1st International Workshop on Eye-Tracking in Aviation (ETAVI) 2020 have been published here.

We thank all program committee members for their efforts and great support; it is very much appreciated.

Furthermore, we thank all authors of the 15 excellent articles that were accepted for ETAVI 2020. We regret that the event had to be cancelled due to COVID-19.

Edit. Some of the organizers from ETH are part of the PEGGASUS project. This project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program under grant agreement No. 821461


  • -

Full Paper accepted at CHI 2020

Our paper “A View on the Viewer: Gaze-Adaptive Captions for Videos” has been accepted at ACM CHI 2020 as a full paper:

Kurzhals K., Göbel F., Angerbauer K., Sedlmair M., Raubal M. (2020) A View on the Viewer: Gaze-Adaptive Captions for Videos. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2020), ACM (accepted)

 

Abstract. Subtitles play a crucial role in cross-lingual distribution of multimedia content and help communicate information where auditory content is not feasible (loud environments, hearing impairments, unknown languages). Established methods utilize text at the bottom of the screen, which may distract from the video. Alternative techniques place captions closer to related content (e.g., faces) but are not applicable to arbitrary videos such as documentations. Hence, we propose to leverage live gaze as an indirect input method to adapt captions to individual viewing behavior. We implemented two gaze-adaptive methods and compared them in a user study (n=54) to traditional captions and audio-only videos. The results show that viewers with less experience with captions prefer our gaze-adaptive methods as they assist them in reading. Furthermore, gaze distributions resulting from our methods are closer to natural viewing behavior compared to the traditional approach. Based on these results, we provide design implications for gaze-adaptive captions.


  • -

Full paper published – Ergonomics Journal

Our paper “The instructor assistant system (iASSYST) – utilizing eye tracking for commercial aviation training purposes” has been published in the Ergonomics Journal:

Rudi, D., Kiefer, P., & Raubal, M. (2019). The instructor assistant system (iASSYST)-utilizing eye tracking for commercial aviation training purposes. Ergonomics.

Abstract. This work investigates the potential of providing commercial aviation flight instructors with an eye tracking enhanced observation system to support the training process. During training, instructors must deal with many parallel tasks, such as operating the flight simulator, acting as air traffic controllers, observing the pilots and taking notes. This can cause instructors to miss relevant information that is crucial for debriefing the pilots. To support instructors, the instructor ASsistant SYSTem (iASSYST) was developed. It includes video, audio, simulator and eye tracking recordings. iASSYST was evaluated in a study involving 7 instructors. The results show that with iASSYST, instructors were able to support their observations of errors, find new errors, determine that some previously identified errors were not errors, and to reclassify the types of errors that they had originally identified. Instructors agreed that eye tracking can help identify causes of pilot error.


  • -

Gaze-based interactions in the cockpit of the future: a survey

An article titled “Gaze-based interactions in the cockpit of the future: a survey” will appear in one of the next issues of the Journal on Multimodal User Interfaces. It is now available online:

Abstract. Flying an aircraft is a mentally demanding task where pilots must process a vast amount of visual, auditory and vestibular information. They have to control the aircraft by pulling, pushing and turning different knobs and levers, while knowing that mistakes in doing so can have fatal outcomes. Therefore, attempts to improve and optimize these interactions should not increase pilots’ mental workload. By utilizing pilots’ visual attention, gaze-based interactions provide an unobtrusive solution to this. This research is the first to actively involve pilots in the exploration of gaze-based interactions in the cockpit. By distributing a survey among 20 active commercial aviation pilots working for an internationally operating airline, the paper investigates pilots’ perception and needs concerning gaze-based interactions. The results build the foundation for future research, because they not only reflect pilots’ attitudes towards this novel technology, but also provide an overview of situations in which pilots need gaze-based interactions.


  • -

Meet us at CHI 2019

We’ll present one full paper and two workshop position papers at CHI in Glasgow this year:

Workshop: Designing for Outdoor Play (4th May, Saturday – 08:00 – 14:00, Room: Alsh 1)

Kiefer, P. (2019) Gaze-guided narratives for location-based games. In CHI 2019 Workshop on “Designing for Outdoor Play”, Glasgow, U.K., DOI: https://doi.org/10.3929/ethz-b-000337913

Workshop: Challenges Using Head-Mounted Displays in Shared and Social Spaces (5th May, Sunday – 08:00 – 14:00, Room: Alsh 2)

Göbel, F., Kwok, T.C.K., and Rudi, D. (2019) Look There! Be Social and Share. In CHI 2019 Workshop on “Challenges Using Head-Mounted Displays in Shared and Social Spaces”, Glasgow, U.K., DOI: https://doi.org/10.3929/ethz-b-000331280

Paper Session: Audio Experiences (8th May, Wednesday – 14:00 – 15:20, Room: Alsh 1)

Kwok, T.C.K., Kiefer, P., Schinazi, V.R., Adams, B., and Raubal, M. (2019) Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4-9, Glasgow, U.K. [PDF]

We are looking forward to seeing you in Glasgow!
This research is part of the LAMETTA and IGAMaps projects.


  • -

FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps

An article titled “FeaturEyeTrack: automatic matching of eye tracking data with map features on interactive maps” will appear in one of the next issues of the Geoinformatica journal. It is now available online:

Abstract. Map reading is a visual task that can strongly vary between individuals and maps of different characteristics. Aspects such as where, when, how long, and in which sequence information on a map is looked at can reveal valuable insights for both the map design process and for better understanding the cognitive processes of the map user. Contrary to static maps, for which many eye tracking studies are reported in the literature, established methods for tracking and analyzing visual attention on interactive maps are still missing. In this paper, we present a framework called FeaturEyeTrack that automatically logs the cartographic features that have been inspected, as well as the mouse input, during the interaction with digital interactive maps. In particular, the novelty of FeaturEyeTrack lies in matching gaze with the vector model of the current map visualization, thereby enabling a very detailed analysis without the need for manual annotation. Furthermore, we demonstrate the benefits of this approach in terms of manual work, level of detail and validity compared to state-of-the-art methods through a case study on an interactive cartographic web map.
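The core of such gaze-to-feature matching can be illustrated with a point-in-polygon test of gaze samples against feature geometries. This is only a rough sketch with invented feature names and coordinates, not FeaturEyeTrack’s actual implementation, which matches gaze against the live map’s vector model and also logs interaction:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal line at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def match_gaze_to_features(gaze_points, features):
    """Label each gaze sample with the features whose polygon contains it."""
    return [
        [name for name, poly in features.items() if point_in_polygon(x, y, poly)]
        for x, y in gaze_points
    ]

# Invented example: one rectangular "lake" feature in map coordinates
features = {"lake": [(0, 0), (4, 0), (4, 3), (0, 3)]}
print(match_gaze_to_features([(2, 1), (5, 5)], features))  # [['lake'], []]
```

In practice, a production system would spatially index the feature geometries and re-query them whenever the map view changes, which is what makes matching on interactive maps harder than on static stimuli.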


  • -

Full Paper accepted at CHI 2019

Our paper “Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments” has been accepted by ACM CHI 2019 as a full paper:

Kwok, T.C.K., Kiefer, P., Schinazi, V.R., Adams, B., and Raubal, M. (2019) Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), ACM (accepted)

This paper proposes Gaze-Guided Narratives as an implicit gaze-based interaction concept for guiding tourists through the hidden stories of a city panorama. It reports on the implementation and evaluation of this concept which has been developed as part of the LAMETTA project. This research has been performed in collaboration between the GeoGazeLab, Victor R. Schinazi (Chair of Cognitive Science, ETH Zurich) and Benjamin Adams (Department of Geography, University of Canterbury).

Abstract. Exploring a city panorama from a vantage point is a popular tourist activity. Typical audio guides that support this activity are limited by their lack of responsiveness to user behavior and by the difficulty of matching audio descriptions to the panorama. These limitations can inhibit the acquisition of information and negatively affect user experience. This paper proposes Gaze-Guided Narratives as a novel interaction concept that helps tourists find specific features in the panorama (gaze guidance) while adapting the audio content to what has been previously looked at (content adaptation). Results from a controlled study in a virtual environment (n=60) revealed that a system featuring both gaze guidance and content adaptation obtained better user experience, lower cognitive load, and led to better performance in a mapping task compared to a classic audio guide. A second study with tourists situated at a vantage point (n=16) further demonstrated the feasibility of this approach in the real world.


  • -

Short Paper and Workshop Paper accepted at GIScience 2018

We are happy to announce that two of our papers have been accepted at GIScience 2018 and at the Workshop on Spatial big data and machine learning in GIScience:

Fabian Göbel, Peter Kiefer, Ioannis Giannopoulos and Martin Raubal. 2018. Gaze Sequences and Map Task Complexity. GIScience 2018, Melbourne, Australia.

Fabian Göbel, Henry Martin. 2018. Unsupervised Clustering of Eye Tracking Data. Spatial big data and machine learning in GIScience, Workshop at GIScience 2018, Melbourne, Australia.

Both works are part of the IGAMaps project.