Beyond Facial Vision: Information in Echoes for the Perceptual Guidance of Locomotion

Barry Hughes

Department of Psychology, University of Auckland, Auckland, New Zealand


Structured light is not the only form of sensory stimulation that can contribute to accurate perceptual descriptions of three-dimensional (3D) spatial layouts. That structured sound may be similarly (if not equally) informative regarding object identities, locations, sizes, and orientations in 3D space has been known since the phenomenological reports of "facial vision" were shown to be based on the exploitation of information in echoic stimulation. Instances of functionally equivalent perceptual descriptions arising via different sensory events encourage consideration of the nature of stimulation and information for perceptual-motor control.

The potential of complex echoic stimulation to inform actors about 3D spatial layouts was assessed in a series of five experiments. Naive, blindfolded participants, who were never given visual access to the size or layout of a large room, were asked to use a head-mounted sonar system (Kaspa, from SonicVision, Auckland) to approach, explore, and then estimate the passability of apertures between wall panels. The panels were separated by gaps ranging from 0.05 m to 1.05 m and were aligned in depth in various ways; approaches to the apertures were made from orthogonal and oblique angles, and estimates were made from fixed and variable locations. In all experiments, participants gave evidence of an immediate ability to use the information in sonar to make reliable and accurate judgments, although wall distances, wall alignment, and approach angles had significant effects on passability estimates, accuracy, and judgment confidence.

In addition to these data, I present spectrographic representations of the echoes recorded during approach and exploration, with the aim of identifying the nature of the information that is, and is not, immediately exploited. Implications of the data for theoretical treatments of transmodal perception and perceptual learning, as well as for the potential role of sonar in locomotor control and navigation by the blind, are discussed.