• eyetracking@ethz.ch
  • +41 44 633 71 59

Category Archives: Publication


Best Paper at ETVIS 2018

We are happy to announce that our paper received the Best Paper Award at ETVIS!

David Rudi, Peter Kiefer, and Martin Raubal. 2018. Visualizing Pilot Eye Movements for Flight Instructors. In ETVIS ’18: 3rd Workshop on Eye Tracking and Visualization

The paper is part of the Awareness in Aviation project.



Papers accepted at ETRA and ETVIS

We are happy to announce that two of our papers have been accepted at ETRA and ETVIS.

Fabian Göbel, Peter Kiefer, Ioannis Giannopoulos, Andrew T. Duchowski, and Martin Raubal. 2018. Improving Map Reading with Gaze-Adaptive Legends. In ETRA ’18: 2018 Symposium on Eye Tracking Research & Applications

David Rudi, Peter Kiefer, and Martin Raubal. 2018. Visualizing Pilot Eye Movements for Flight Instructors. In ETVIS ’18: 3rd Workshop on Eye Tracking and Visualization

These papers are part of the IGAMaps and Awareness in Aviation projects.

Peter Kiefer has further been involved in ETRA as an Area Chair.



Position Paper at CHI Workshop on Outdoor HCI

We’ll present our ideas on how to enrich a tourist’s experience with gaze-guided narratives at a CHI workshop in Montreal this year:

Kiefer, P., Adams, B., and Raubal, M. (2018) Gaze-Guided Narratives for Outdoor Tourism. HCI Outdoors: A CHI 2018 Workshop on Understanding Human-Computer Interaction in the Outdoors

This research is part of the LAMETTA project.



Full Paper accepted at CHI 2018

Andrew T. Duchowski, Krzysztof Krejtz, Izabela Krejtz, Cezary Biele, Anna Niedzielska, Peter Kiefer, Ioannis Giannopoulos, and Martin Raubal (2018). The Index of Pupillary Activity: Measuring Cognitive Load vis-à-vis Task Difficulty with Pupil Oscillation. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems (CHI 2018), ACM (accepted)



An inverse-linear logistic model of the main sequence

Peter Kiefer and Ioannis Giannopoulos have contributed to an article titled “An inverse-linear logistic model of the main sequence” (Journal of Eye Movement Research, JEMR). It is now available online:

http://dx.doi.org/10.16910/jemr.10.3.4

Abstract. A model of the main sequence is proposed based on the logistic function. The model’s fit to the peak velocity-amplitude relation resembles an S curve, simultaneously allowing control of the curve’s asymptotes at very small and very large amplitudes, as well as its slope over the mid amplitude range. The proposed inverse-linear logistic model is also able to express the linear relation of duration and amplitude. We demonstrate the utility and robustness of the model when fit to aggregate data at the small- and mid-amplitude ranges, namely when fitting microsaccades, saccades, and superposition of both. We are confident the model will suitably extend to the large-amplitude range of eye movements.
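As background for readers, the generalized logistic (S-curve) that such a model builds on can be written as follows. The symbols below are generic textbook notation; the paper’s exact inverse-linear parameterization and fitted coefficients are given in the article itself:

```latex
% Generalized logistic form for the peak velocity-amplitude relation:
%   V_p(A)        : saccadic peak velocity as a function of amplitude A
%   V_min, V_max  : lower and upper asymptotes (very small / very large amplitudes)
%   A_0           : curve midpoint
%   k             : slope over the mid-amplitude range
V_p(A) = V_{\min} + \frac{V_{\max} - V_{\min}}{1 + e^{-k\,(A - A_0)}}
```

The two asymptotes and the slope parameter correspond to the three controls mentioned in the abstract.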



Gaze-Informed Location Based Services

Our article “Gaze-Informed Location Based Services” has been accepted for publication by the International Journal of Geographical Information Science (IJGIS):

Anagnostopoulos, V.-A., Havlena, M., Kiefer, P., Giannopoulos, I., Schindler, K., and Raubal, M. (2017). Gaze-informed location based services. International Journal of Geographical Information Science, 2017. (accepted) [PDF]

The article introduces the concept of location based services which take the user’s viewing direction into account. It reports on the implementation and evaluation of such a gaze-informed location based service, which has been developed as part of the LAMETTA project. This research has been performed in collaboration between the GeoGazeLab, Michal Havlena (Computer Vision Laboratory, ETH Zurich), and Konrad Schindler (Institute of Geodesy and Photogrammetry, ETH Zurich).

Abstract
Location-Based Services (LBS) provide more useful, intelligent assistance to users by adapting to their geographic context. For some services that context goes beyond a location and includes further spatial parameters, such as the user’s orientation or field of view. Here, we introduce Gaze-Informed LBS (GAIN-LBS), a novel type of LBS that takes into account the user’s viewing direction. Such a system could, for instance, provide audio information about the specific building a tourist is looking at from a vantage point. To determine the viewing direction relative to the environment, we record the gaze direction relative to the user’s head with a mobile eye tracker. Image data from the tracker’s forward-looking camera serve as input to determine the orientation of the head w.r.t. the surrounding scene, using computer vision methods that allow one to estimate the relative transformation between the camera and a known view of the scene in real time and without the need for artificial markers or additional sensors. We focus on how to map the Point of Regard of a user to a reference system for which the objects of interest are known in advance. In an experimental validation on three real city panoramas, we confirm that the approach can cope with head movements of varying speed, including fast rotations up to 63 deg/s. We further demonstrate the feasibility of GAIN-LBS for tourist assistance with a proof-of-concept experiment in which a tourist explores a city panorama and the approach achieved a recall of over 99%. Finally, a GAIN-LBS can provide objective and qualitative ways of examining the gaze of a user based on what the user is currently looking at.
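As a rough illustration of the mapping step described in the abstract (rotating the eye tracker’s gaze vector into the scene frame and matching it against known object directions), here is a minimal sketch. All function names, landmark bearings, and numbers are hypothetical and not taken from the paper:

```python
# Illustrative sketch only, not the authors' implementation: given a head
# orientation (estimated elsewhere from the scene camera) and the gaze
# direction in the head frame, compute the world-frame gaze direction and
# find the known object whose bearing is angularly closest to it.
import numpy as np

def world_gaze_direction(head_rotation: np.ndarray, gaze_in_head: np.ndarray) -> np.ndarray:
    """Rotate the eye tracker's gaze vector (head frame) into the world frame."""
    v = head_rotation @ gaze_in_head
    return v / np.linalg.norm(v)

def match_object(gaze_world: np.ndarray, object_bearings: dict) -> str:
    """Return the name of the known object with the smallest angular distance to the gaze."""
    def angle(name):
        cos_a = np.clip(np.dot(gaze_world, object_bearings[name]), -1.0, 1.0)
        return np.arccos(cos_a)
    return min(object_bearings, key=angle)

# Hypothetical example: identity head pose, two landmark bearings on a panorama.
objects = {
    "church": np.array([1.0, 0.0, 0.0]),
    "tower":  np.array([0.0, 1.0, 0.0]),
}
looking_at = match_object(world_gaze_direction(np.eye(3), np.array([0.9, 0.1, 0.0])), objects)
```

In the actual system the head rotation would come from the real-time camera-to-scene pose estimation described above, and the bearings from the pre-registered objects of interest.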



Short Paper accepted at AGILE conference

Fabian Göbel, Peter Kiefer and Martin Raubal (2017). FeaturEyeTrack: A Vector Tile-Based Eye Tracking Framework for Interactive Maps. In Proceedings of the 20th International Conference on Geographic Information Science (AGILE 2017), Wageningen, The Netherlands. (accepted)



Special Issue Appearing: Spatial Cognition & Computation 17 (1-2)

A double Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition & Computation, guest-edited by Peter, Ioannis, Martin, and Andrew Duchowski, has appeared [URL].

Nineteen manuscripts were submitted to an open Call for Submissions, out of which seven were finally accepted after a rigorous review process.

The Special Issue commences with an overview article, authored by the Guest Editors: “Eye tracking for spatial research: Cognition, computation, challenges” [URL, PDF].



Controllability matters: The user experience of adaptive maps

An article titled “Controllability matters: The user experience of adaptive maps” will appear in an upcoming issue of the GeoInformatica journal. It is now available online:

Controllability matters: The user experience of adaptive maps

Abstract. Adaptive map interfaces have the potential to increase usability by providing more task-dependent and personalized support. It is unclear, however, how map adaptation must be designed to avoid a loss of control, transparency, and predictability. This article investigates the user experience of adaptive map interfaces in the context of gaze-based activity recognition. In a Wizard of Oz experiment we study two adaptive map interfaces differing in the degree of controllability and compare them to a non-adaptive map interface. The adaptive interfaces were found to yield a higher user experience and a lower perceived cognitive workload than the non-adaptive interface. Among the adaptive interfaces, users clearly preferred the condition with higher controllability. Results from structured interviews reveal that participants dislike being interrupted in their spatial cognitive processes by a sudden adaptation of the map content. Our results suggest that adaptive map interfaces should give their users control over when an adaptation is performed.



Full paper accepted at SUI 2016

David Rudi, Ioannis Giannopoulos, Peter Kiefer, Christian Peier, Martin Raubal (2016) Interacting with Maps on Optical Head-Mounted Displays. In Proceedings of the 4th ACM Symposium on Spatial User Interaction (SUI 2016). ACM, 2016



Paper accepted at PETMEI Workshop

Vasileios-Athanasios Anagnostopoulos and Peter Kiefer (2016). Towards gaze-based interaction with urban outdoor spaces. In 6th International Workshop on Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2016), UbiComp ’16 Adjunct, New York, NY, USA, ACM. (accepted)



Full paper accepted at GIScience 2016

Peter Kiefer, Ioannis Giannopoulos, Andrew Duchowski, Martin Raubal (2016) Measuring cognitive load for map tasks through pupil diameter. In Proceedings of the Ninth International Conference on Geographic Information Science (GIScience 2016). Springer



Full paper accepted at ETRA 2016

Andrew T. Duchowski, Sophie Jörg, Tyler N. Allen, Ioannis Giannopoulos, Krzysztof Krejtz (2016). Eye Movement Synthesis. In Proceedings of the 2016 symposium on Eye tracking research & applications (ETRA ’16). ACM



Full paper accepted at ETRA 2016

Verena Schnitzler, Ioannis Giannopoulos, Christoph Hölscher and Iva Barisic (2016). The Interplay of Pedestrian Navigation, Wayfinding Devices, and Environmental Features in Indoor Settings. In Proceedings of the 2016 symposium on Eye tracking research & applications (ETRA ’16). ACM



PETMEI 2015: Summary

The PETMEI 2015 workshop at UbiComp, which Peter Kiefer co-organized, took place on September 7 in Osaka, Japan. There were six presentations, a keynote by Ali Borji, a demo, and a group work session, all with very active participation and interesting discussions. The workshop ended with a workshop dinner in a restaurant serving food from the Okinawa region, and some participants continued on to a Japanese karaoke bar.

All in all, it has been a stimulating, fascinating and enjoyable event. Thanks to all participants, co-organizers, and sponsors!




Short Paper accepted at Smarttention Workshop

Peter Kiefer and Ioannis Giannopoulos (2015). A Framework for Attention-Based Implicit Interaction on Mobile Screens. In Proceedings of the Workshop Smarttention, Intelligent Attention Management on Mobile Devices, in conjunction with MobileHCI 2015. ACM, New York, NY, USA (accepted)



Full Paper accepted at MobileHCI 2015

Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal (2015). GazeNav: Gaze-Based Pedestrian Navigation.  In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices & Services. ACM, New York, NY, USA.

GazeNav Talk

Leading mobile HCI researchers from all over the world meet in Copenhagen to present innovative research and gadgets. Our research group is present with four contributions.

 



Full paper accepted at COSIT 2015

Kiefer, P., Scheider, S., Giannopoulos, I., and Weiser, P. (2015). A wayfinding grammar based on reference system transformations. In S.I. Fabrikant, M. Raubal, M. Bertolotto, C. Davies, S. Freundschuh, and S. Bell (Eds.), Spatial Information Theory (COSIT 2015), volume 9368 of Lecture Notes in Computer Science, pages 447-467. Springer International Publishing

[Download as PDF]



Department Annual Report 2014

A summary of our research on “Gaze-Based Geographic Human Computer Interaction” (PDF) is included as a research highlight in the annual report 2014 of our department (D-BAUG, Department of Civil, Environmental and Geomatic Engineering).




Book chapter on gaze-based interaction for GIS (German)

Kiefer, P. (2015). Blickbasierte Mensch-Computer-Interaktion mit Geoinformationssystemen. In Thomas H. Kolbe, Ralf Bill, and Andreas Donaubauer, editors, Geoinformationssysteme 2015. Wichmann, Heidelberg.

[PDF]



Paper accepted at CHI Workshop 2015

Ioannis Giannopoulos, Peter Kiefer, and Martin Raubal. Watch What I Am Looking At! Eye Gaze and Head-Mounted Displays. In Mobile Collocated Interactions: From Smartphones to Wearables, Workshop at CHI 2015, Seoul, Korea, 2015.

[PDF]

 




Special Issue: Call for Submissions

We are planning a Special Issue on “Eye Tracking for Spatial Research” in Spatial Cognition and Computation: Call for Submissions (PDF).

Submission Deadline is May 27, 2015.
