Ph.D. Student at Hokkaido University
Yuki Abe, Keisuke Matsushima, Kotaro Hara, Daisuke Sakamoto, and Tetsuo Ono
RunSight is an AR-based assistive tool that supports low-vision (LV) people in running at night. (A) An LV runner using RunSight for Caller-style guided running. The tool enables LV individuals who would otherwise be unable to run at night to do so. (B) RunSight provides a see-through scene with an edge-enhanced view of the environment, along with a visualization of the guide's trace. This helps LV individuals stay aware of potential hazards, follow a guide, and run safely at night.
Dark environments make it difficult for low-vision (LV) individuals to engage in running by following a sighted guide (Caller-style guided running): insufficient illumination prevents them from using their residual vision to follow the guide and stay aware of their surroundings. We design, develop, and evaluate RunSight, an augmented reality (AR)-based assistive tool that supports LV individuals in running at night. RunSight combines a see-through head-mounted display (HMD) and image processing to enhance the wearer's visual awareness of the surrounding environment (e.g., potential hazards) and to visualize the guide's position with AR-based visualization. To demonstrate RunSight's efficacy, we conducted a user study with 8 LV runners. The results showed that all participants could run at least 1 km (mean = 3.44 km) using RunSight, while none could engage in Caller-style guided running without it. Our participants could run safely because they effectively synthesized RunSight-provided cues with information gained from runner-guide communication.
Teaser (1 min)
Presentation (10 min)
Yuki Abe, Keisuke Matsushima, Kotaro Hara, Daisuke Sakamoto, and Tetsuo Ono. “I can run at night!”: Using Augmented Reality to Support Nighttime Guided Running for Low-vision Runners. In CHI Conference on Human Factors in Computing Systems (CHI ’25), April 26–May 01, 2025, Yokohama, Japan. ACM, New York, NY, USA, 20 pages. Honorable Mention recognition. [DOI]