• eyetracking@ethz.ch
  • +41 44 633 71 59

Author Archives: Tianyi Xiao


geoGAZElab receives Best Paper Honorable Mention award at CHI 2026

Our paper has earned a Best Paper Honorable Mention designation at the 2026 ACM CHI Conference on Human Factors in Computing Systems. ACM CHI is the premier international conference on Human-Computer Interaction (HCI). CHI 2026 received 6,730 complete submissions, the largest number in the history of CHI, and accepted 1,703 papers (25.3% of complete submissions). Honorable mentions are awarded to the top 5% of submitted papers.

๐—–๐—ผ๐— ๐—ฎ๐—ฝ: ๐—” ๐—–๐—ผ๐—น๐—น๐—ฎ๐—ฏ๐—ผ๐—ฟ๐—ฎ๐˜๐—ถ๐˜ƒ๐—ฒ ๐Ÿฏ๐—— ๐—ฆ๐—ธ๐—ฒ๐˜๐—ฐ๐—ต ๐— ๐—ฎ๐—ฝ๐—ฝ๐—ถ๐—ป๐—ด ๐—š๐—ฎ๐—บ๐—ฒ ๐—ณ๐—ผ๐—ฟ ๐—ฆ๐—ฒ๐—ฎ๐—ฟ๐—ฐ๐—ต ๐—ฎ๐—ป๐—ฑ ๐—ฅ๐—ฒ๐˜€๐—ฐ๐˜‚๐—ฒ ๐Ÿ“Ž

Authors: Tianyi Xiao, Sailin Zhong, Peter Kiefer, Miki Mizuki, Phoebe O. Toups Dugas, and Martin Raubal.

Paper maps support (a) field responders' route planning and (b) commanders' tactical coordination. Photos from the 2023 Turkey earthquake, shared with consent by a commander during expert interviews.

Abstract

Search and rescue operations rely on fast and accurate spatial communication between commanders and field teams, often under severe time pressure and with asymmetric information. While maps are central to this process, traditional paper-based sketching struggles with 3D environments and remote collaboration. We present CoMap, a collaborative 3D sketch mapping system, validated through a virtual reality fire-rescue game. In a controlled study with 13 commander–field team pairs, CoMap enabled more accurate and efficient spatial communication than conventional 2D sketch mapping and fostered more proactive communication strategies. The paper also distills three design implications for future mapping tools to support SAR training and real-world operations.

Teaser

VR search and rescue training game in a fire scenario with a collaborative 3D sketch mapping interface, CoMap. This figure illustrates a virtual reality (VR) search and rescue training game set in a fire scenario. It shows two types of participants: a Commander, who has an allocentric view through drone and base maps, and a Field Team, who navigate the scene from an egocentric perspective, identifying hazards and reporting findings. The image also highlights CoMap, a shared 3D sketch mapping interface that enables collaborative map co-maintenance, supporting collective spatial cognition for distributed teams.

Interface Design

Core interfaces (a-c) and functions (d-f) of CoMap.
(a) Interface of the management tool used for UOS server control, data logging, and global data synchronization.
(b) The perspective from a Field Responder navigating the virtual scene using arc raycasting.
(c) Key components of the Commander’s workspace, including a layered system used for 3D map representation of a multi-level building, and a video player.
(d) Multimodal communication through photo and video capture from the Field Responder's perspective.
(e) Surface sketch mapping, where sketched curves are projected onto layers.
(f) Interaction process for relocating a pin by dragging it from a corner to the desired position.
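The surface sketch mapping in (e), where sketched curves are projected onto layers, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the actual CoMap implementation: it assumes horizontal layer planes at fixed floor heights and snaps each 3D stroke point to the nearest plane.

```python
# Hypothetical illustration of surface sketch mapping: projecting a
# freehand 3D stroke onto horizontal layer planes, as a layered 3D map
# of a multi-level building might do. Layer heights and the nearest-plane
# projection rule are assumptions for illustration only.

def project_onto_layer(stroke, layer_heights):
    """Snap each 3D stroke point (x, y, z) to the nearest layer plane."""
    projected = []
    for x, y, z in stroke:
        nearest = min(layer_heights, key=lambda h: abs(h - z))
        projected.append((x, y, nearest))
    return projected

# A stroke drifting between floors of a three-storey building.
stroke = [(0.0, 0.0, 0.2), (1.0, 0.5, 2.9), (2.0, 1.0, 3.4)]
layers = [0.0, 3.0, 6.0]  # assumed floor heights in metres
print(project_onto_layer(stroke, layers))
# → [(0.0, 0.0, 0.0), (1.0, 0.5, 3.0), (2.0, 1.0, 3.0)]
```

Snapping to the nearest plane keeps each curve attached to exactly one layer, which is what lets the commander read the sketch floor by floor.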

Architecture

CoMap operates on a client-server architecture for data synchronization and voice communication. A management tool monitors the server and global data. Externalization tools include sketch mapping, pin placement, notes, photos, and videos. After each SAR mission, the externalized data are uploaded to a Git-based remote repository for logging and reconstruction.
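A minimal sketch of this synchronize-and-log pattern, under stated assumptions: the event fields, the `SyncServer` class, and the in-memory log are all hypothetical stand-ins for illustration, not CoMap's actual protocol or data model.

```python
import json
from dataclasses import dataclass, field, asdict

# Hypothetical sketch of how an externalization event (pin, sketch stroke,
# note, photo) could be serialized, broadcast to connected clients, and
# appended to a mission log for post-mission reconstruction. All names
# and fields here are illustrative assumptions.

@dataclass
class ExternalizationEvent:
    event_id: int
    author: str              # e.g. "commander" or "field_responder"
    kind: str                # "pin" | "sketch" | "note" | "photo" | "video"
    position: tuple          # (x, y, z) in the shared scene frame
    payload: dict = field(default_factory=dict)

class SyncServer:
    """In-memory stand-in for a synchronization server."""
    def __init__(self):
        self.log = []        # append-only mission log (reconstruction)
        self.clients = []    # connected client callbacks

    def connect(self, on_event):
        self.clients.append(on_event)

    def publish(self, event: ExternalizationEvent):
        record = json.dumps(asdict(event))
        self.log.append(record)          # log every event
        for deliver in self.clients:     # broadcast to all clients
            deliver(record)

# usage: one client places a hazard pin; all clients receive it
server = SyncServer()
received = []
server.connect(received.append)
server.publish(ExternalizationEvent(1, "field_responder", "pin",
                                    (3.0, 1.5, -2.0), {"label": "hazard"}))
print(len(server.log), json.loads(received[0])["kind"])  # → 1 pin
```

Because the log is append-only and every event is self-describing, replaying it in order reconstructs the shared map state, which is the property a Git-based mission log relies on.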

Funding

This work is part of 3D Sketch Maps, a project funded by the SNSF Sinergia programme. Project page: https://geogaze.ethz.ch/3d-sketch-maps/comap/



CHI'26 paper accepted!

geoGAZElab has one paper accepted at ACM CHI, the leading international conference on Human–Computer Interaction.

📄 CoMap: A Collaborative 3D Sketch Mapping Game for Search and Rescue

Authors: Tianyi Xiao, Sailin Zhong, Peter Kiefer, Miki Mizuki, Phoebe O. Toups Dugas, and Martin Raubal.

🚨 What is it about?

Search and rescue operations rely on fast and accurate spatial communication between commanders and field teams, often under severe time pressure and with asymmetric information. While maps are central to this process, traditional paper-based sketching struggles with 3D environments and remote collaboration.

We present CoMap, a collaborative 3D sketch mapping system, validated through a virtual reality fire-rescue game. In a controlled study with 13 commander–field team pairs, CoMap enabled more accurate and efficient spatial communication than conventional 2D sketch mapping and fostered more proactive communication strategies.

The paper also distills three design implications for future mapping tools to support SAR training and real-world operations.

🔗 More information:

This work is part of 3D Sketch Maps, a project funded by the SNSF Sinergia programme. Project page: https://geogaze.ethz.ch/3d-sketch-maps/comap/



Design++ XR in AEC Summer School 2025

As a member of geoGAZElab, Tianyi Xiao participated in organizing the Design++ XR in AEC Summer School 2025 and served as a workshop mentor.

Titled "Enhancing Spatial Communication through 3D Sketch Mapping in Extended Reality," the workshop content is based on our project funded by the Swiss National Science Foundation: 3D Sketch Maps. The workshop aims to impart foundational theoretical knowledge on spatial cognition to students, introduce research findings on 3D Sketch Mapping, teach basic empirical research methods, and conduct user studies using the 3D Sketch Mapping tool developed within the project.



CHI 2025: two papers accepted

We are pleased to share that two of our papers have been accepted for the ACM (Association for Computing Machinery) CHI Conference on Human Factors in Computing Systems 2025, the most important international conference in the field of human-computer interaction. CHI 2025 took place in Yokohama, Japan, from 26 April to 1 May 2025.

Adrian Sarbach presented his paper "Next-Generation Navigation: Evaluating the Impact of Augmented Reality on Situation Awareness in General Aviation Cockpits", written together with Thierry Weber. This paper presents findings on how AR technology (tested with the Microsoft HoloLens 2) can help pilots obtain and maintain a higher level of situation awareness in the aircraft cockpit.

Tianyi Xiao presented his paper "Sketch2Terrain: AI-Driven Real-Time Terrain Sketch Mapping in Augmented Reality", written together with Yizi Chen, Sailin Zhong, Peter Kiefer, Jakub Krukar, Kevin Gonyop Kim, Lorenz Hurni, Angela Schwering, and Martin Raubal. This paper presents a generative AI model that helps novice mappers externalize complex terrain memory in XR.

Please refer to our papers if you are interested in aviation or in 3D sketch mapping with VR/AR technologies!

———-

Here again are the papers with their respective abstracts:


Next-Generation Navigation: Evaluating the Impact of Augmented Reality on Situation Awareness in General Aviation Cockpits

Flights in general aviation require pilots to navigate using 2D maps, which splits their attention between the cockpit and the outside environment, reducing situation awareness. Augmented reality (AR) can bridge the gap between the inside and outside world, and thus can resolve the issue of attention switches. In a mixed methods simulator study with 19 pilots, we tested an AR application that integrated invisible and hard-to-see aeronautical data and navigation features with the visible world. Results show that the AR tool enhances and accelerates orientation, and can result in flight trajectories being more accurate with AR than without AR. Situation awareness, measured with a subjective self-rating, was not increased with AR support. Participants voiced concerns about AR content occluding outside features, while positive feedback included use cases in unfamiliar areas and in low visibility, as well as highlighting of hazards.


Sketch2Terrain: AI-Driven Real-Time Terrain Sketch Mapping in Augmented Reality

Dataset, open-source project page

Sketch mapping is an effective technique to externalize and communicate spatial information. However, it has been limited to 2D mediums, making it difficult to represent 3D information, particularly for terrains with elevation changes. We present Sketch2Terrain, an intuitive generative-3D-sketch-mapping system combining freehand sketching with generative Artificial Intelligence that radically changes sketch map creation and representation using Augmented Reality. Sketch2Terrain empowers non-experts to create unambiguous sketch maps of natural environments and provides a homogeneous interface for researchers to collect data and conduct experiments. A between-subject study (N=36) revealed that generative-3D-sketch-mapping improved efficiency by 38.4%, terrain-topology accuracy by 12.5%, and landmark accuracy by up to 12.1%, with only a 4.7% trade-off in terrain-elevation accuracy compared to freehand 3D-sketch-mapping. Additionally, generative-3D-sketch-mapping reduced perceived strain by 60.5% and stress by 39.5% over 2D-sketch-mapping. These findings underscore potential applications of generative-3D-sketch-mapping for in-depth understanding and accurate representation of vertically complex environments. The implementation is publicly available.



New publication in the International Journal of Human-Computer Studies

The 3D Sketch Maps project team recently published a paper entitled: VResin: Externalizing spatial memory into 3D sketch map. In this paper, we refined the concept of 3D sketch maps, proposed a technological framework using 3D Cartesian coordinate axes, and provided design criteria for 3D sketch mapping interfaces. We implemented a VR-based 3D sketch mapping tool called VResin, whose design is inspired by traditional resin painting, with a layer-by-layer sketching interface that considers both researcher and user needs. We then conducted a comparative user study with 48 participants between mid-air sketching and layer-by-layer sketching interfaces for memorising multi-layered buildings. We found that VResin helps users create less distorted sketches while maintaining the level of completeness and generalisation compared to mid-air sketching in VR. Finally, we presented application scenarios demonstrating how 3D sketch maps can support people's externalisation of their 3D spatial understanding. This study is part of a Sinergia project called "3D Sketch Maps", funded by the Swiss National Science Foundation (SNSF) [grant number 202284].

The interface design of VResin