Overview

Virtual Reality-Based Pre-Flight Astronaut 3D Navigation Training (First Award Fellowship)

Principal Investigator:
Hirofumi Aoki, Ph.D.

Organization:
Massachusetts Institute of Technology

Spatial disorientation and navigation problems have been reported by astronauts operating in spacecraft with complex three-dimensional (3D) architecture, and these problems can complicate responses to emergencies. In weightlessness, astronauts must orient themselves and navigate in three dimensions from a wide variety of body orientations, challenges that cannot be simulated in ground-based mockups.

NSBRI Postdoctoral Fellow Dr. Hirofumi Aoki has designed a virtual reality-based training method as a countermeasure to inflight spatial disorientation and 3D navigation problems. Subjects perform navigational exercises and emergency egress tasks in a realistic virtual simulation of the ISS under normal and smoke-obstructed conditions. The results should define procedures for preflight spatial disorientation and navigation training, as well as design standards for future spacecraft.

NASA Taskbook Entry


Technical Summary

The goal of this sensorimotor/human factors project was to develop a virtual reality (VR) based training method for astronauts aboard the International Space Station (ISS) or on a Mars mission vehicle as a countermeasure for inflight spatial disorientation and navigation problems. These problems have been frequently reported by Space Shuttle, Mir and ISS crews as complicating responses to emergencies. The three-dimensional (3D) architecture, the inconsistent visual vertical across adjacent quarters and modules, and crew members' limited visual experience are the major causes of the problem, which NASA has identified as a significant risk. Astronauts normally see the interior of a spacecraft from a variety of body orientations and viewpoints that cannot be simulated on the ground. Interrelating cues perceived in a body-centered (egocentric) frame of reference, built up directly through navigation, with an overall (allocentric) frame of reference defined by the spacecraft requires cognitive skill. Astronauts can either learn this interrelationship in flight or develop the required cognitive knowledge prior to flight via VR simulation.
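
Conceptually, interrelating the two frames amounts to a change of coordinates: once the subject's body orientation within the station is known, any cue perceived egocentrically can be re-expressed in station coordinates. The sketch below is purely illustrative, not part of this project's software; the frame convention, orientation, and cue values are assumptions chosen for clarity.

    # Minimal sketch: expressing a cue perceived in the subject's egocentric frame
    # in the station's allocentric frame, given the subject's body orientation.
    # Assumed egocentric frame convention: +x straight ahead, +y to the subject's
    # left, +z overhead.
    import numpy as np
    from scipy.spatial.transform import Rotation as R

    # Hypothetical body orientation: subject rolled 90 degrees about the
    # station's forward axis relative to the station's "upright".
    body_to_station = R.from_euler("x", 90, degrees=True)

    # A hatch seen 30 degrees to the subject's left, as a unit direction
    # vector in the egocentric frame.
    cue_egocentric = np.array([np.cos(np.radians(30)), np.sin(np.radians(30)), 0.0])

    # The same cue expressed in the station-fixed (allocentric) frame.
    cue_allocentric = body_to_station.apply(cue_egocentric)
    print(cue_allocentric)  # direction of the hatch in station coordinates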

We conducted a series of experiments on 3D spatial orientation and navigation performance in a virtual space station using simulated emergency egress tasks. In the first experiment, conducted in a fully immersive virtual environment with a head-mounted display, we showed that individual 3D spatial abilities (e.g., mental rotation and perspective-taking skills), relative orientation to the environment, and the configuration of the environment all influence performance. Subjects trained while locally visually upright developed landmark and route knowledge, whereas subjects who maintained a constant orientation with respect to the entire station during training developed an enhanced sense of direction and a 3D cognitive map, and therefore performed better in the low-visibility simulated-smoke condition. This result suggests that training should initially be performed locally upright, followed by training in a constant station orientation, after which trainees should be challenged with trials in randomized orientations. The sequence could be customized based on individual spatial ability and task performance. This study, published in Aviation, Space and Environmental Medicine, received the 2007 ASMA Space Medicine Branch Young Investigator Award, selected from 177 nominees.
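
Performance in such perspective-taking and sense-of-direction tasks is commonly scored as angular pointing error. The following minimal sketch is an assumption about one plausible scoring metric, not the study's actual analysis code; it computes the angle between the direction a subject indicates and the true direction to a target.

    # Illustrative metric (assumed, hypothetical function name): angular
    # pointing error between an indicated direction and the true direction.
    import numpy as np

    def pointing_error_deg(pointed_dir, true_dir):
        """Angle in degrees between the direction a subject pointed and the
        true direction to the target, both given as 3D vectors."""
        a = np.asarray(pointed_dir, dtype=float)
        b = np.asarray(true_dir, dtype=float)
        cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

    # Example: a subject points roughly 25 degrees off the true direction.
    print(pointing_error_deg([1.0, 0.47, 0.0], [1.0, 0.0, 0.0]))  # ~25.2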

The second experiment showed that most 3D navigation performance measures for this egress task were similar in immersive and non-immersive VR systems. Subjects noted that the egress task was mainly "done in your head" and that vestibular cues were not critical. This finding is important because it suggests that laptop trainers (analogous to DOUG for extravehicular activity training) could be used for preflight (or even inflight) emergency egress navigation training. Building on these results, the project set out to clarify whether VR training can help develop these cognitive skills and to characterize the retention, improvement, and limitations of human 3D spatial orientation and navigation over long-term training. In this experiment, we demonstrated that VR-specific features such as "see-through walls" and a miniature 3D model of the environment were useful: subjects trained with these tools outperformed those trained without them on the training day, although the two groups performed equivalently one month later. This result demonstrates the effectiveness of preflight spatial orientation and navigation training, especially in the early stage of learning. Taken together, these studies provide solid laboratory validation for a preflight VR-based navigation training countermeasure at Countermeasure Readiness Level (CRL) 7. The next step is CRL 8 validation with human subjects in spaceflight to demonstrate operational feasibility and efficacy.


Earth Applications

Results of this project help improve crew safety by advancing the understanding of 3D spatial orientation, navigation, and spatial memory, by establishing training methods, and by informing the design of future spacecraft, including Orion, the Lunar Lander, and the Mars Transit Habitat. A better cognitive map of the environment could also reduce motion sickness and visual reorientation illusions. The simulation tool could be used to train other professions, such as firefighters and submariners, as well as occupants of high-rise buildings. The results also contribute to a deeper understanding of human spatial cognition from the perspective of brain and cognitive science, and they pertain to environmental and architectural design and to pre- and post-occupancy evaluation of buildings, underground complexes, and cities.


This project's funding ended in 2007