We have integrated a treadmill-style locomotion interface, the unconstrained walking plane (UWP), with a virtual environment (VE) to enable non-visual spatial learning (NSL). This setting allows for a new type of experience, whereby participants with visual disabilities can explore the VE for NSL and develop cognitive maps of it. Although audio and haptic interfaces have been studied for NSL, little is known about the use of locomotion interfaces to support it. We report an experiment that investigates the efficacy of the UWP for NSL and the formation of cognitive maps, and thereby for enhancing the mobility skills of visually impaired people (VIP). Two groups of participants – blindfolded sighted and blind – learned a spatial layout in the VE. They used two exploration modes: guided (training phase) and unguided (testing phase). In the unguided exploration mode, spatial layout knowledge was assessed by asking participants to perform an object localization task and a target-object task. Results reveal that the participants benefited from the learning, i.e. there were significant improvements in their post-training navigation performance.
©CIS Journal. Article first published in ‘Journal of Emerging Trends in Computing and Information Sciences’, Vol. 1 (2010), No. 1. Reprinted with permission.