Emily Stover

February 2002

Stephen F. Austin State University

Auditory Localization

Auditory localization is the ability to recognize the location from which a sound is emanating (Goldstein, 2002). There are many practical reasons for studying auditory localization. For example, previous research states that visual cues are necessary in locating a particular sound (Culling, 2000). However, blind people do not have the luxury of sight to help them locate a sound. Therefore, the ability to locate sound based only on auditory ability is important. It is also important to study different auditory processes. For example, when studying a way for a blind person to maneuver through an environment, it is helpful to know that people most accurately locate sounds that occur directly in front of them; sounds that are far off, to the side, or behind the head are the least likely to be properly located (Goldstein, 2002).

Three coordinate systems are utilized when attempting to locate a specific sound. The azimuth coordinate determines whether a sound is located to the left or the right of a listener. The elevation coordinate differentiates between sounds that are up or down relative to the listener. Finally, the distance coordinate determines how far away a sound is from the receiver (Goldstein, 2002). Different aspects of the coordinate systems are also essential to sound localization. For example, when identifying the azimuth of a sound, three acoustic cues are used: spectral cues, interaural time differences (ITD), and interaural level differences (ILD) (Lorenzi, Gatehouse, & Lever, 1999). In sound localization, spectral cues are the distribution of frequencies reaching the ear. Brungart and Durlach (1999) (as cited in Shinn-Cunningham, Santarelli, & Kopco, 1999) believed that as the frequency of the source on the azimuth increases, so does the individual's ability to accurately detect the source of the sound. Additionally, the auditory system utilizes localizing cues to determine from where a sound is originating (Goldstein, 2002). These consist of monaural cues from incoming sounds and binaural cues based on sound differences between the two ears (Carlile, Hyams, & Delaney, 2001).
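
To give a rough sense of how the binaural cues described above scale with direction, the short sketch below (an illustration added here, not a calculation from the sources cited in this paper) uses the common textbook approximation that the interaural time difference grows with the sine of the azimuth angle; the interaural distance of 0.22 m and speed of sound of 343 m/s are assumed round numbers.

import math

SPEED_OF_SOUND = 343.0   # m/s, an assumed round value for air at room temperature
HEAD_WIDTH = 0.22        # m, an assumed ear-to-ear distance; real heads vary

def approx_itd(azimuth_deg):
    # Simplified sine-law approximation: ITD (seconds) ~ (d / c) * sin(azimuth).
    return (HEAD_WIDTH / SPEED_OF_SOUND) * math.sin(math.radians(azimuth_deg))

for azimuth in (0, 30, 60, 90):
    print(f"azimuth {azimuth:2d} deg -> ITD of roughly {approx_itd(azimuth) * 1e6:4.0f} microseconds")

Under these assumptions the ITD is zero for a source straight ahead and climbs to roughly 640 microseconds for a source directly to one side, which fits the general point that ITD is an azimuth cue rather than an elevation or distance cue.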

The cues mentioned above, as well as others, originate in the auditory cortex in the brain. When the auditory cortex becomes damaged, people may lose their ability to hear and/or their ability to locate sounds. In another line of research, which utilizes animals, Rauschecker and Korte (1993) (as cited in Goldstein, 2002) found a specific area of the brain utilized for sound localization. This area is called the anterior ectosylvian sulcus (AES). When kittens' eyes were sutured shut, the visual areas of the kittens' AES became smaller, and the sound-responsive area of the AES became larger. These findings can be applied to humans who are blind and rely heavily on auditory cues.

Applied research on the topic of auditory localization has been explored extensively throughout history. One of the first to examine this topic was Lord Rayleigh (1907) (as cited in Brungart, 1999), who examined the relationship of ITD cues to sound localization in hard-of-hearing individuals. Through the creation of the duplex theory of localization, he believed that, depending on the source's position in the frequency spectrum, different localization cues will be dominant. Other research with hard-of-hearing individuals has been done by Viehweg and Campbell (1960) (as cited in Lorenzi et al., 1999). In this research, Viehweg and Campbell concluded that, in situations with large amounts of background noise, a hard-of-hearing individual will make more errors when trying to localize a sound. Therefore, many hard-of-hearing individuals wear hearing aids to block out some background noise.

Research has been used to create hearing aids and environments that better suit the abilities of hearing-impaired individuals. Because sound localization that takes place in front of a person relies primarily on ITD cues at low frequencies and ILD cues at high frequencies (Kistler & Wightman, 1992, as cited in Lorenzi et al., 1999), this information can be used to devise a way for hard-of-hearing individuals to have more ITD cues present, since ILD cues are harder for them to hear and utilize (Lorenzi et al., 1999). Also, for a blind person to locate a sound and better understand his or her immediate surroundings, it is imperative to have multiple binaural cues present. For example, in a recent study, accuracy of sound localization at close proximity was greatly increased when binaural cues were present (Brungart, 1999).
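
One back-of-the-envelope way to see why the cues split by frequency in this manner (again an illustration added here, not an analysis from Kistler and Wightman or Lorenzi et al.) is to compare the wavelength of the sound to an assumed head width of about 0.22 m: a noticeable level difference between the ears only arises when the head is large enough relative to the wavelength to cast an acoustic shadow.

SPEED_OF_SOUND = 343.0   # m/s, assumed
HEAD_WIDTH = 0.22        # m, assumed ear-to-ear distance

for freq_hz in (250, 500, 1500, 4000, 8000):
    wavelength = SPEED_OF_SOUND / freq_hz
    if wavelength <= HEAD_WIDTH:
        note = "shorter than the head: strong shadow, ILD usable"
    else:
        note = "longer than the head: wave bends around, ILD weak"
    print(f"{freq_hz:4d} Hz -> wavelength {wavelength:.2f} m ({note})")

With these assumed numbers the crossover falls somewhere between roughly 1 and 2 kHz, which is in line with the low-frequency/high-frequency division described above.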

The ability to accurately locate a sound that is heard from various distances has also been extensively studied. For sounds occurring more than one meter from the head, binaural cues are equal for any source falling on a cone centered on the interaural axis (i.e., the "cone of confusion") (Shinn-Cunningham, Santarelli, & Kopco, 1999). The cone of confusion is an area in which an individual has a difficult time identifying the exact location of a sound. For example, if a person were to hold a megaphone to his or her ear, the circumference at the end of the megaphone would be the cone of confusion, and anything in that area is too close together to distinguish a specific sound (Shinn-Cunningham et al., 1999). The primary way to eliminate the error of the cone of confusion is to evaluate the spectral content of the signals reaching the eardrum (Kulkarni & Colburn, 1998, as cited in Shinn-Cunningham et al., 1999).
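
The front-back ambiguity at the heart of the cone of confusion can also be seen in the simplified sine-law sketch used earlier (once more an added illustration with assumed values, not a model from Shinn-Cunningham et al.): a source 30 degrees in front of the listener and its mirror image 150 degrees toward the back produce the same interaural time difference, so this cue alone cannot tell them apart.

import math

def approx_itd(azimuth_deg, head_width=0.22, speed_of_sound=343.0):
    # Same simplified sine-law approximation as in the earlier sketch (assumed values).
    return (head_width / speed_of_sound) * math.sin(math.radians(azimuth_deg))

front = approx_itd(30)    # source ahead and to the right
back = approx_itd(150)    # its mirror image behind the interaural axis
print(f"front: {front * 1e6:.0f} us, back: {back * 1e6:.0f} us, identical: {math.isclose(front, back)}")

Resolving that ambiguity is exactly what the spectral (monaural) content mentioned above contributes.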

Auditory localization has previously been examined in terms of the relationship between visual perception and auditory cues. Carello et al. (1998) found that when individuals hear leaves rustling or water dripping, they can not only tell what they are hearing but also create a visual picture of what they are hearing. This suggests that there is a link between the two perceptual senses. However, Carello, Anderson, and Kunkler-Peck (1998) also found that individuals were able to accurately scale an object with no previous knowledge of the object by simply hearing the object fall to the floor. In support of these findings, McDonald, Teder-Salejarvi, and Hillyard (2000) found that a sound does have an effect on the processing of subsequent and concurrent visual stimuli.

A great deal of current research is aiding the development of better technology in sound localization. As research gains more ground in determining how different factors influence our ability to locate sound, these factors are taken into account when designing new equipment in a variety of areas. One of these areas is emergency sirens on ambulances.

Emergency vehicle sirens have been extensively studied. The purpose of ambulance sirens is to warn drivers of the ambulance's presence and to give those drivers some sense of its direction. However, ambulance sirens are perceived by drivers to be farther away than they actually are when the ambulance approaches from behind the driver (Caelli & Porter, 1980). According to the same study, siren localization errors are primarily due to reversals and also to the cycle of the siren. The traditional "hee-haw" siren, with a repetition rate of 30 cycles/min, resulted in an average localization error of 20 degrees (Caelli & Porter, 1980). This area of applied research demonstrates a need for the perceptual part of sound localization and the technological part of siren design to inform each other in order to better help drivers locate the approaching ambulance.

Currently, ambulance siren research is broadening to police sirens and other emergency alert signals that will allow a listener to determine where a sound is coming from and not to panic. With police sirens, drivers have been shown to delay emergency personnel by panicking because they cannot determine from where the siren is originating. New sirens are being developed that have breaks in the siren cycle and are easier to locate.

An area where vision and sound localization are being combined is in the design of cockpits. Different frequencies and wavelengths are being utilized in the new designs. Also, colors and flashing lights are being associated with the different sounds in the hope that pilots will make fewer errors when flying, thereby cutting down on the number of plane crashes. Overall, sound localization is often overlooked by individuals; however, it is an important area that aids the development of ergonomics.

References

Brungart, D. (1999). Auditory localization of nearby sources. III. Stimulus effects. Journal of the Acoustical Society of America, 106(6), 3589-3602.

Brungart, D., & Durlach, N. (1999). Auditory localization of nearby sources. II. Localization of a broadband source in the near field. Journal of the Acoustical Society of America, 106(4), 1956-1968.

Caelli, T., & Porter, D. (1980). On difficulties in localizing ambulance sirens. Human Factors, 22(6), 719-724.

Carello, C., Anderson, K., & Kunkler-Peck, A. (1998). Perception of object length by sound. Psychological Science, 9(3), 211-214.

Culling, J. (2000). Auditory motion segregation: A limited analogy with vision. Journal of Experimental Psychology: Human Perception and Performance, 26(6), 1760-1769.

Goldstein, E. (2002). Sensation and perception (Rev. ed.). Pacific Grove, CA: Wadsworth-Thomson Learning.

Lorenzi, C., Gatehouse, S., & Lever, C. (1999). Sound localization in noise in hearing-impaired listeners. Journal of the Acoustical Society of America, 105(6), 3454-3463.

Lorenzi, C., Gatehouse, S., & Lever, C. (1999). Sound localization in noise in normal-hearing listeners. Journal of the Acoustical Society of America, 105(3), 1810-1820.

McDonald, J., Teder-Salejarvi, W., & Hillyard, S. (2000). Involuntary orienting to sound improves visual perception. Nature, 407, 906-907.

Shinn-Cunningham, B., Santarelli, S., & Kopco, N. (1999). Tori of confusion: Binaural localization cues for sources within reach of the listener. Journal of the Acoustical Society of America, 107(3), 1627-1636.