BACKGROUND: Researchers from the Lions Vision Center at the Wilmer Eye Institute of Johns Hopkins University used a "virtual forest" to classify study participants as either good or poor navigators. The results suggest that poor navigators rely solely on visual information to solve the task, while good navigators are able to combine visual information with a mental picture of the environment.
HOW IT WORKS: By simulating the loss of peripheral vision during navigation, the researchers created a way to control the amount of external visual information available to participants. This meant they could directly test how heavily participants relied on that information to learn about their environments. Knowing what types of information individuals use when navigating, and how performance degrades when that information is removed, can not only help us understand human navigation in general, but also guide the development of rehabilitation protocols for people with impaired vision.
ABOUT PERIPHERAL VISION: Peripheral vision refers to what we can see out of the corners of our eyes. The retina contains light-sensitive cells called rods and cones. The cones sense color and are found mostly in the central region of the retina. When you see something out of the corner of your eye, the image focuses on the periphery of the retina, where there are very few cones, so it's difficult to distinguish the colors of objects. Rods also become less densely packed toward the outer edges of the retina, reducing your ability to resolve the shapes of objects at the periphery. But our peripheral vision is highly sensitive to motion, probably because it was a useful adaptation to spot potential predators in the earlier stages of human evolution.
WHAT IS VIRTUAL REALITY: The term "virtual reality" is often used to describe interactive software programs in which the user responds to visual and auditory cues while navigating a 3D environment on a graphics monitor. But originally, it referred to total virtual environments, in which the user would be immersed in an artificial, three-dimensional, computer-generated world involving not just sight and sound, but touch as well. Devices that simulate the sense of touch are called haptic devices. Touch is vital for directing and guiding human movement, and the use of haptics in virtual environments simulates how objects and actions feel to the user. The user has a variety of input devices to navigate that world and interact with virtual objects, all of which must be linked with the rest of the system to produce a fully immersive experience.
The Human Factors and Ergonomics Society contributed to the information contained in the TV portion of this report.