Two new papers at ASSETS and SUI 2025
Published on 17.11.2025
Yong-Joon Thoo presents two new research papers on co-located shared AR for low-vision rehabilitation
PhD candidate Yong-Joon Thoo, from the Human-IST Institute, recently presented two new publications at major ACM conferences in accessibility (ASSETS’25) and spatial interaction (SUI’25). The research investigates how co-located shared augmented reality—where both the low-vision client and therapist wear AR headsets—can enhance rehabilitation through spatially grounded, interactive training tasks and improved communication of visual strategies.
"Exploring Shared Augmented Reality for Low-Vision Training of Activities of Daily Living"
Authors: Yong-Joon Thoo, Karim Aebischer, Nicolas Ruffieux, Denis Lalanne
Link to paper: https://doi.org/10.1145/3663547.3746371
Presented at ASSETS’25 in Denver, this paper introduces the first prototype of a shared AR platform developed on the Microsoft HoloLens 2 to support low-vision training. Working closely with two certified low-vision therapists (LVTs), the team co-designed interactive tasks inspired by real rehabilitation practices, focusing on visual search and activities of daily living (ADLs).
Based on two feasibility studies and an exploratory evaluation with eight low-vision participants, the paper reports that:
- participants found the tasks engaging and enjoyable,
- accessibility needs varied widely, highlighting the importance of adaptable task design,
- LVTs recognised the system’s potential for real-time observation and as a motivating complement to existing training methods.
"Enhancing Therapist-Guided Low-Vision Training with Projected Gaze Behaviors in Co-Located Shared AR"
Authors: Yong-Joon Thoo, Karim Aebischer, Nicolas Ruffieux, Denis Lalanne
Link to paper: https://doi.org/10.1145/3694907.3765924
Presented at SUI’25, the second paper extends the platform by introducing projected gaze behaviors—eye-gaze, head-gaze, eye-movement traces, and a real-time field-of-view indicator—visualized directly within the shared AR environment.
Based on field observations and collaborations with LVTs, the team designed an AR visual search task and conducted an exploratory study with nine low-vision clients and participating therapists. Findings show that projected gaze:
- helps therapists interpret clients’ visual strategies more clearly,
- supports more targeted and personalised feedback,
- encourages richer dialogue during training.
The findings also highlight the need for further refinement in cue interpretability and system adaptability.
Acknowledgements
We wish to thank the low-vision therapists from the Fédération Suisse des Aveugles et Malvoyants (Swiss Federation of the Blind and Visually Impaired) in Fribourg, with whom we collaborated closely on both papers.
Additional Links
- Project Page: https://visar.human-ist.ch/
