• eyetracking@ethz.ch
  • +41 44 633 71 59

Category Archives: LAMETTA


Article in Scientific Reports: Mind Wandering

Do you enjoy touristic audio guides? You don’t? One reason could be that classic audio guides do not adapt to your cognitive state in real time.

In our most recent article in Scientific Reports, we investigate the detection of mind wandering from eye movements for combined audio-visual stimuli, such as touristic audio guides. We trained and validated classifiers that successfully detect mind wandering within a 1-second time window. Future audio guides could build on our results to provide more intelligent assistance.
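For illustration only (this is not the paper's actual feature set or model), a gaze-based detector of this kind might compute simple statistics over a sliding 1-second window of gaze samples and apply a decision rule to them:

```python
from statistics import pstdev
from typing import List, Optional, Tuple

# A gaze sample: (timestamp in seconds, normalized x, normalized y).
Sample = Tuple[float, float, float]

def window_features(samples: List[Sample], t0: float,
                    dur: float = 1.0) -> Optional[dict]:
    """Extract toy gaze features from a 1-second window starting at t0."""
    win = [(x, y) for t, x, y in samples if t0 <= t < t0 + dur]
    if len(win) < 2:
        return None  # not enough samples to compute features
    xs, ys = zip(*win)
    return {
        # Spatial spread of gaze within the window.
        "dispersion": (max(xs) - min(xs)) + (max(ys) - min(ys)),
        # Sample-to-sample variability.
        "jitter": pstdev(xs) + pstdev(ys),
    }

def is_mind_wandering(feats: Optional[dict],
                      disp_thresh: float = 0.05) -> bool:
    """Toy decision rule: a near-motionless stare is flagged as mind
    wandering. A real detector would use a classifier trained on many
    such windowed features."""
    return feats is not None and feats["dispersion"] < disp_thresh
```

The threshold and features here are invented for the sketch; the published classifiers were trained and validated on real gaze data.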

Kwok, T. C. K., Kiefer, P., Schinazi, V. R., Hoelscher, C., & Raubal, M. (2024). Gaze-based detection of mind wandering during audio-guided panorama viewing. Scientific Reports, 14(1), 27955. https://doi.org/10.1038/s41598-024-79172-x

We wish you a fully focused (= non-mind wandering) reading experience!

This research is part of Tiffany Kwok’s (see former group members) doctoral thesis and an outcome of the LAMETTA project, which has been funded through an ETH Zurich Research Grant.



New Article Published in Human-Computer Interaction

Our article “Unobtrusive interaction: a systematic literature review and expert survey” has been accepted for publication in the journal Human–Computer Interaction:

Tiffany C.K. Kwok, Peter Kiefer & Martin Raubal (2023). Unobtrusive interaction: a systematic literature review and expert survey, Human–Computer Interaction, DOI: 10.1080/07370024.2022.2162404

Abstract. Unobtrusiveness has been highlighted as an important design principle in Human-Computer Interaction (HCI). However, the understanding of unobtrusiveness in the literature varies. Researchers often claim unobtrusiveness for their interaction method based on their understanding of what unobtrusiveness means. This lack of a shared definition hinders effective communication in research and impedes comparability between approaches. In this article, we approach the question “What is unobtrusive interaction?” with a systematic and extensive literature review of 335 papers and an online survey with experts. We found that not a single definition of unobtrusiveness is universally agreed upon. Instead, we identify five working definitions from the literature and experts’ responses. We summarize the properties of unobtrusive interaction into a design framework with five dimensions and classify the reviewed papers with regard to these dimensions. The article aims to provide researchers with a more unified context to compare their work and identify opportunities for future research.

The article will appear in an upcoming issue of Human–Computer Interaction. It has been published Open Access, and you can access it here:
https://doi.org/10.1080/07370024.2022.2162404



Full Paper published at ICMI 2022

Our paper “Two-Step Gaze Guidance” has been published in the proceedings of the International Conference on Multimodal Interaction (ICMI ’22) as a full paper.

Tiffany C.K. Kwok, Peter Kiefer, Martin Raubal (2022). Two-Step Gaze Guidance, International Conference on Multimodal Interaction (ICMI ’22), DOI: 10.1145/3536221.3556612

Abstract. One challenge of providing guidance for search tasks consists in guiding the user’s visual attention to certain objects in a potentially large search space. Previous work has tried to guide the user’s attention by providing visual, audio, or haptic cues. The state-of-the-art methods either provide hints pointing towards the approximate direction of the target location for a fast but less accurate search or require the user to perform a fine-grained search from the beginning for a precise yet less efficient search. To combine the advantages of both methods, we propose an interaction concept called Two-Step Gaze Guidance. The first-step guidance focuses on quick guidance toward the approximate direction, and the second-step guidance focuses on fine-grained guidance toward the exact location of the target. A between-subject study (N = 69) with five conditions was carried out to compare the two-step gaze guidance method with the single-step gaze guidance method. Results revealed that the proposed method outperformed the single-step gaze guidance method. More precisely, the introduction of Two-Step Gaze Guidance slightly improved search accuracy, and the use of spatial audio as the first-step guidance significantly enhanced search efficiency. Our results also yielded several design suggestions for gaze guidance methods.
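As a rough sketch of the two-step idea (the function name and the 10-degree switching threshold are illustrative, not taken from the paper): while the target is far from the current gaze position, a coarse directional cue is given; once gaze is close, a fine-grained cue takes over:

```python
import math
from typing import Tuple

def guidance_cue(gaze: Tuple[float, float], target: Tuple[float, float],
                 switch_deg: float = 10.0) -> Tuple[str, float]:
    """Return which cue to present, given gaze and target positions in
    degrees of visual angle.

    Step 1 ("coarse"): point the user toward the target's direction,
    e.g. with spatial audio.
    Step 2 ("fine"): guide precisely once the target is within
    switch_deg, e.g. with a subtle visual highlight.
    """
    dx, dy = target[0] - gaze[0], target[1] - gaze[1]
    dist = math.hypot(dx, dy)
    if dist > switch_deg:
        # Coarse cue carries the bearing toward the target.
        return ("coarse", math.degrees(math.atan2(dy, dx)))
    # Fine cue carries the remaining angular distance.
    return ("fine", dist)
```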



PhD graduation Tiffany C.K. Kwok

We congratulate Tiffany C.K. Kwok on successfully completing her doctoral thesis, “Designing Unobtrusive Gaze-Based Interactions: Applications to Audio-Guided Panorama Viewing”. The doctoral graduation was approved by the Department Conference at its most recent meeting. The research was performed in the scope of the LAMETTA project.

Tiffany is staying with us as a postdoc, continuing her research in the GeoGazeLab. It’s great having you on our team, Tiffany!



Book Chapter on Outdoor HCI accepted

Kiefer, P., Adams, B., Kwok, T., Raubal, M. (2020) Modeling Gaze-Guided Narratives for Outdoor Tourism. In: McCrickard, S., Jones, M., and Stelter, T. (eds.): HCI Outdoors: Theory, Design, Methods and Applications. Springer International Publishing (in press)



Visit by the Vice President for Research and Corporate Relations

On 11 September, Prof. Dr. Detlef Günther, Vice President for Research and Corporate Relations of ETH Zurich, visited the D-BAUG department and learned about the exciting research activities of its institutes.

Our institute was represented by Peter Kiefer, who summarized the research of the GeoGazeLab. The slides provide an overview of our research interests and current projects.

Edit: The presentation includes the PEGGASUS project, which has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation program, grant agreement No. 821461.



Meet us at CHI 2019

We’ll present one full paper and two workshop position papers at CHI in Glasgow this year:

Workshop: Designing for Outdoor Play (4th May, Saturday – 08:00 – 14:00, Room: Alsh 1)

Kiefer, P. (2019) Gaze-guided narratives for location-based games. In CHI 2019 Workshop on “Designing for Outdoor Play”, Glasgow, U.K., DOI: https://doi.org/10.3929/ethz-b-000337913

Workshop: Challenges Using Head-Mounted Displays in Shared and Social Spaces (5th May, Sunday – 08:00 – 14:00, Room: Alsh 2)

Göbel, F., Kwok, T.C.K., and Rudi, D. (2019) Look There! Be Social and Share. In CHI 2019 Workshop on “Challenges Using Head-Mounted Displays in Shared and Social Spaces”, Glasgow, U.K., DOI: https://doi.org/10.3929/ethz-b-000331280

Paper Session: Audio Experiences (8th May, Wednesday – 14:00 – 15:20, Room: Alsh 1)

Kwok, T.C.K., Kiefer, P., Schinazi, V.R., Adams, B., and Raubal, M. (2019) Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4-9, Glasgow, U.K. [PDF]

We are looking forward to seeing you in Glasgow!
This research is part of the LAMETTA and IGAMaps projects.



ETIZ Meeting March 2019 – Impressions

More than 30 participants attended the meeting of the Eye Tracking Interest Group Zurich (ETIZ) hosted by us on 26 March 2019. Our invited speaker Andreas Bulling (University of Stuttgart) provided insights into his current and past research on pervasive eye tracking. Tiffany Kwok (GeoGazeLab, LAMETTA project) presented her PhD research on gaze-guided narratives. In an interactive mini-workshop, moderated by Arzu Çöltekin (FHNW), attendees brainstormed about challenges of eye tracking in VR and AR displays. Discussions were continued during an apéro, and many took the opportunity to try out a gaze-adaptive map demo (Fabian Göbel, GeoGazeLab, IGAMaps project).



ETIZ Meeting March 2019

We are going to host the next meeting of the Eye Tracking Interest Group Zurich (ETIZ). Everyone using, or planning to use, eye tracking in their research is cordially welcome!

Date, time: 26th March 2019, 17:30
Place: ETH Zurich Hönggerberg, HIL D 53


17:30 – 17:35
Welcome

17:35 – 18:15
“Recent Advances Towards Pervasive Eye Tracking”
Prof. Dr. Andreas Bulling, Professor for Human-Computer Interaction and Cognitive Systems
University of Stuttgart, Germany

18:15 – 18:35
“Gaze-Guided Narratives”
Tiffany C.K. Kwok, Doctoral Student
Geoinformation Engineering, ETH Zurich

18:35 – 18:55
“Eye tracking in VR and AR displays: A mini-workshop”
Dr. Arzu Çöltekin, Assoc. Prof., Principal Investigator
Institute for Interactive Technologies IIT, University of Applied Sciences and Arts Northwestern Switzerland FHNW

18:55 – 19:00
Closing

19:00
Apéro, with demo of a gaze-adaptive interactive map by Fabian Göbel, Geoinformation Engineering



Full Paper accepted at CHI 2019

Our paper “Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments” has been accepted by ACM CHI 2019 as a full paper:

Kwok, T.C.K., Kiefer, P., Schinazi, V.R., Adams, B., and Raubal, M. (2019) Gaze-Guided Narratives: Adapting Audio Guide Content to Gaze in Virtual and Real Environments. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), ACM (accepted)

This paper proposes Gaze-Guided Narratives as an implicit gaze-based interaction concept for guiding tourists through the hidden stories of a city panorama. It reports on the implementation and evaluation of this concept which has been developed as part of the LAMETTA project. This research has been performed in collaboration between the GeoGazeLab, Victor R. Schinazi (Chair of Cognitive Science, ETH Zurich) and Benjamin Adams (Department of Geography, University of Canterbury).

Abstract. Exploring a city panorama from a vantage point is a popular tourist activity. Typical audio guides that support this activity are limited by their lack of responsiveness to user behavior and by the difficulty of matching audio descriptions to the panorama. These limitations can inhibit the acquisition of information and negatively affect user experience. This paper proposes Gaze-Guided Narratives as a novel interaction concept that helps tourists find specific features in the panorama (gaze guidance) while adapting the audio content to what has been previously looked at (content adaptation). Results from a controlled study in a virtual environment (n=60) revealed that a system featuring both gaze guidance and content adaptation obtained better user experience, lower cognitive load, and led to better performance in a mapping task compared to a classic audio guide. A second study with tourists situated at a vantage point (n=16) further demonstrated the feasibility of this approach in the real world.
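A minimal sketch of the content-adaptation half of this concept (the data layout and snippet names are invented for illustration): the guide keeps track of which panorama objects the tourist has already looked at and only plays narrative snippets whose prerequisites have been seen:

```python
from typing import List, Optional, Set

def next_snippet(snippets: List[dict], looked_at: Set[str],
                 played: Set[str]) -> Optional[str]:
    """Pick the next unplayed audio snippet whose prerequisite objects
    have all been looked at, or None if nothing is eligible yet."""
    for s in snippets:
        if s["id"] not in played and set(s["requires"]) <= looked_at:
            return s["id"]
    return None

# Hypothetical narrative content for a lakeside panorama.
SNIPPETS = [
    {"id": "church_history", "requires": {"church"}},
    {"id": "lake_legend", "requires": {"lake"}},
    {"id": "church_vs_lake", "requires": {"church", "lake"}},
]
```

In the actual system, gaze guidance (the other half of the concept) would steer the tourist toward the objects a snippet still requires.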



LAMETTA at GeoSummit: Visit by Federal Councillor Guy Parmelin

We have presented the LAMETTA project at the GeoSummit in Bern (6-7 June 2018), the largest congress for geoinformatics, surveying and planning in Switzerland.

Federal Councillor Guy Parmelin was one of the first visitors to our exhibit and showed great interest in the innovative system. Because of his subsequent opening speech, there was no time for him to try out the gaze-based tourist guide to Lake Lucerne himself, but he appeared impressed even by the short visit.

A large number of visitors from both industry and academia visited our exhibit and tried out the system. In addition, our exhibit was part of the GeoSchoolDay, an event held in conjunction with GeoSummit that introduces high-school students to applications and opportunities of geoinformation technologies. Approximately 500 pupils visited LAMETTA and learned about eye tracking and its application in interactive systems.



Science City March 2018 – Impressions

The LAMETTA project was demoed at this year’s “Treffpunkt Science City” event, an educational program of ETH Zurich for the general public that attracted more than 3,000 visitors.

Our panorama wall installation and the LAMETTA software let visitors experience the view as if they were standing at a vantage point. Simply by looking at an area of interest (such as a lake, mountain, or village), visitors received related information from the system.



Meeting point Science City – March 2018

We’re excited to demonstrate the LAMETTA project at ETH Treffpunkt Science City, ETH Zurich’s educational program for the general public. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it using only your gaze (more details)!

You can find us Sunday, 25 March in ETH Hönggerberg HCI, Room E2.



Position Paper at CHI Workshop on Outdoor HCI

We’ll present our ideas on how to enrich a tourist’s experience with gaze-guided narratives at a CHI workshop in Montreal this year:

Kiefer, P., Adams, B., and Raubal, M. (2018) Gaze-Guided Narratives for Outdoor Tourism. HCI Outdoors: A CHI 2018 Workshop on Understanding Human-Computer Interaction in the Outdoors

This research is part of the LAMETTA project.



Tiffany Kwok joins the team

We welcome Tiffany as a new PhD student in the LAMETTA project!

[Current Team]



Scientifica 2017 – Impressions

We have demoed the LAMETTA project at this year’s Scientifica, the Zurich science exhibition of ETH Zurich and University of Zurich with more than 30,000 visitors.

Our panorama wall installation enabled visitors to query information about lakes, mountains, and villages just by looking at them. Many visitors found the hidden treasure in the panorama and were rewarded with a piece of Swiss chocolate.

Visitors of all ages tried out and learned more about gaze-based interaction and mobile eye tracking technology. We are happy that so many people were interested and eager to discuss our research and potential applications to tourist guides of the future.



GeoGazeLab at Scientifica 2017

We’re excited to present the LAMETTA project at Scientifica, the science fair of ETH Zurich and University of Zurich. Come and try out an interactive mobile eye tracking system! Explore a mountain panorama and interact with it only by using your gaze (details in German)!

You can find us Friday, 1 September to Sunday, 3 September at University of Zurich main building (West Foyer).

Check out our Scientifica video!



Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.



Gaze-Informed Location Based Services

Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):

Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science, 2017 (accepted), PDF

The article introduces the concept of location based services which take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location-based service, which has been developed as part of the LAMETTA project. This research has been performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich) and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).

Abstract. Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head with respect to the surrounding scene, using computer vision methods that allow one to estimate the relative transformation between the camera and a known view of the scene in real time and without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system, for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama, where the approach achieved a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
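The mapping step can be sketched as follows (a simplified illustration, not the system’s actual implementation; in practice the homography H would come from real-time feature matching between the scene camera and the reference panorama): the gaze point is projected into panorama coordinates and looked up among the known objects of interest:

```python
from typing import Dict, List, Optional, Tuple

def map_gaze_to_panorama(gaze_xy: Tuple[float, float],
                         H: List[List[float]]) -> Tuple[float, float]:
    """Project a gaze point from scene-camera pixel coordinates into the
    reference panorama via a 3x3 homography H (row-major nested lists)."""
    x, y = gaze_xy
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (u / w, v / w)  # perspective division

def object_of_regard(point: Tuple[float, float],
                     regions: Dict[str, Tuple[float, float, float, float]]
                     ) -> Optional[str]:
    """Look up which annotated panorama region (x0, y0, x1, y1) contains
    the projected Point of Regard, if any."""
    px, py = point
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= px <= x1 and y0 <= py <= y1:
            return name
    return None
```

The region names and coordinates are placeholders; in GAIN-LBS the objects of interest are annotated on the reference panorama in advance.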



Special Issue Appearing: Spatial Cognition & Computation 17 (1-2)

A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].

Nineteen manuscripts were submitted to an open Call for Submissions, out of which seven were finally accepted after a rigorous review process.

The Special Issue commences with an overview article, authored by the Guest Editors: “Eye tracking for spatial research: Cognition, computation, challenges” [URL, PDF].



Paper accepted at PETMEI Workshop

Vasileios-Athanasios Anagnostopoulos and Peter Kiefer (2016). Towards gaze-based interaction with urban outdoor spaces. In 6th International Workshop on Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2016), UbiComp’16 Adjunct, New York, NY, USA, ACM (accepted)



ETIZ Meeting March 2016 – Impressions

The two talks on cognitive load, given by Christoph Hölscher and Andrew Duchowski, provided great insights and stimulated discussion at the Eye Tracking Interest Group Zurich (ETIZ), in one of its most highly attended meetings to date.



ETIZ Meeting March 2016

We are going to host the next meeting of the Eye Tracking Interest Group Zurich (ETIZ). Everyone using, or planning to use, eye tracking in their research is cordially welcome!

Date, time: Wednesday, 23rd March 2016, 17:00-19:00
Place: ETH Zurich Hönggerberg, HIL G 22
Topic: Measuring Cognitive Load with Eye Tracking

Please sign up in the Doodle to allow us to plan the coffee break: http://ethz.doodle.com/poll/6ti5qbqx23wvf53g (before 16 March)


17:00 – 17:05
Welcome

17:05 – 17:20
Cognitive Load: Introduction
Christoph Hölscher, Chair of Cognitive Science, ETH Zürich

17:20 – 17:45
Cognitive Load and Eye Tracking: Overview on Methods
Andrew Duchowski, School of Computing, Clemson University, S.C., USA

17:45 – 18:15
Break
with possibility to try out a mobile gaze-based interaction system
Vasilis Anagnostopoulos, LAMETTA project, Geoinformation Engineering ETH Zürich

18:15 – 18:45
Discussion: Cognitive Load

18:45 – 18:55
Discussion: Format of ETIZ meeting



Vasilis Anagnostopoulos joins the team

Vasilis Anagnostopoulos has started as a PhD student in the LAMETTA project (Location-Aware Mobile Eye Tracking for Tourist Assistance).

Welcome to our team, Vasilis!

[Current Team]



Open PhD position

We are looking for a PhD candidate (LAMETTA project).

More details and application on the ETH website.



ETH Zurich Research Grant

Great news right before the holiday season!

ETH Zurich will support our research on “Location-Aware Mobile Eye Tracking for Tourist Assistance” (LAMETTA) with an ETH Zurich Research Grant for a 3-year project, starting in 2015 (PI: Peter Kiefer).

The project will pioneer gaze-based interaction techniques for tourists in outdoor environments. The project envisions mobile assistance systems that trigger information services based on the user’s gaze on touristic areas of interest. For instance, a gaze-based recommender system could notify the observer of a city panorama about buildings that match her interest, given the objects she has looked at before. The main objective of this project is to investigate novel gaze-based interaction methods for tourists exploring a city panorama.

Stay tuned for updates on LAMETTA!
