Sound Localization and New Applications of Its Research

Jonathan Aston

A research paper for PSY 440 (Perception)
Stephen F. Austin State University

Imagine you're standing in the mall and your friend calls out to you from behind. You know your friend is behind you, even without looking around, because you can tell where the voice came from. Sound localization is a relatively simple idea: it is the ability to determine where a sound is coming from. Sound localization can help hunters know where their prey is hiding, or it can let someone know without looking that a cat is up in the tree rather than behind it. It is a very important aspect of our lives. Much research has been done on sound localization, and there will be much more in the future.

Sound localization places sounds into three different coordinate systems. The azimuth is the horizontal coordinate; it refers to sounds that are located to the left or the right of the listener. The ear uses interaural differences to find the azimuth of a sound (Goldstein, 2002). Interaural differences are the disparities between the sounds reaching each ear. The ITD, or interaural time difference, is the difference in the amount of time it takes for a sound wave to reach one ear relative to the other. The ILD, or interaural level difference, is the difference in sound level between the two ears. These interaural differences help the ears determine whether a sound is coming from the left or the right of the person (Goldstein, 2002).
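
To make the ITD cue concrete, here is a minimal sketch of Woodworth's classic spherical-head approximation, which relates azimuth to the expected delay between the ears. The head radius and speed of sound are assumed textbook values, and the formula is an idealization, not a model of actual neural processing.

    import math

    def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
        # Woodworth's approximation: ITD = (r / c) * (theta + sin(theta)),
        # where theta is the azimuth in radians and the head is a sphere.
        theta = math.radians(azimuth_deg)
        return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

    print(itd_seconds(90))  # a source hard right: roughly 0.00066 s (~0.66 ms)
    print(itd_seconds(0))   # a source straight ahead: 0.0 s, no delay

A delay of well under a millisecond is enough for the auditory system to lateralize a source, which is why the ITD is such a powerful azimuth cue.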

The next coordinate is elevation. Whereas the azimuth coordinate can only tell whether a sound is coming from the left or right of the ears, the elevation coordinate can pinpoint whether the sound is coming from above or below them. The elevation system uses spectral cues (Goldstein, 2002). These cues come from the way the head and the pinnae of the ears affect a sound's frequency content. The DTF, or directional transfer function, is the difference between the actual sound and the sound that enters the ear after bouncing around the pinna first. The smoother the pinnae are, the harder it is to locate a sound. Research has shown that if a person's pinnae are altered, their auditory perceptions change (Van Hoesel, 2003). The ear can tell the location of a sound above or below the ear by registering the differences in sound frequency that are caused by bouncing around the pinnae.
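
The sketch below illustrates the idea behind spectral cues using two made-up filters standing in for the pinna's effect at two elevations. Real directional transfer functions are measured on actual ears; these coefficients are invented purely to show that the same sound arrives with a direction-dependent spectrum.

    import numpy as np

    # Hypothetical FIR filters standing in for the pinna's effect at two
    # elevations; real DTFs are measured, not invented like this.
    from_above = np.array([1.0, 0.3, -0.2])
    from_below = np.array([1.0, -0.4, 0.1])

    click = np.zeros(64)
    click[0] = 1.0  # a broadband impulse entering the ear

    for name, pinna_filter in (("above", from_above), ("below", from_below)):
        at_eardrum = np.convolve(click, pinna_filter)[:64]
        spectrum = np.abs(np.fft.rfft(at_eardrum))
        print(name, spectrum[:5].round(2))  # the spectra differ by direction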

The third and final coordinate system is used to measure distance. There has not been much research done on this system. Sound frequency, movement parallax, reflection, and sound level are four of the distance cues used by the ear. Sounds change over a distance because the atmosphere tends to absorb the higher frequencies. Movement parallax is like what the visual system uses to find depth, in that sounds that are closer to the ear move across our auditory "field of view" more quickly than those that are far away. Reflection occurs when the ear receives an indirect sound that has bounced off surfaces like walls or the ground. Sound level is the indication of distance based on the pressure the ear receives from a sound.
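
The sound-level cue has a simple physical basis: in a free field, level follows the inverse-square law, falling about 6 dB for every doubling of distance. A minimal sketch, assuming no reflections or atmospheric absorption:

    import math

    def level_drop_db(distance_m, reference_m=1.0):
        # Attenuation in dB relative to the reference distance, assuming
        # free-field spherical spreading (the inverse-square law).
        return 20 * math.log10(distance_m / reference_m)

    for d in (1, 2, 4, 8):
        print(d, "m:", round(level_drop_db(d), 1), "dB quieter than at 1 m")
    # Each doubling of distance costs about 6 dB, which a listener can
    # read as "farther away" when the source's usual loudness is familiar.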

There are times when sound localization can break down. Obviously, if there is damage to the ear that causes deafness, it will be impossible to locate sounds. If the auditory cortex is injured or removed completely, the ability to find sounds will decrease dramatically as well. People who are hard of hearing often need hearing aids to be able to perceive sound properly. If one of these individuals wanted to save money and bought only one hearing aid, sound localization would not work properly. Similarly, depth perception in the visual system requires two eyes. Sound localization is not nearly as effective when using only one ear, because there is no second ear to compare the information against. If only one ear functioned properly, the ITD and ILD cues would be useless. The process animals use to locate sound is impressive, but not perfect.

Sound localization processing takes place mostly in the auditory cortex of the brain. Research has shown that if this area of the brain is damaged, the person or animal has a hard time finding where a sound came from (Klingon & Bontecou, 1966). Researchers have also found several different kinds of neurons that may indicate sound locations. These neurons are called interaural time difference detectors. They respond to the amount of the ITD delay; each neuron is tuned to a delay of a certain amount of time. These ITD detector neurons locate sound, but they are not very good at showing the exact location. Research has pointed out that one of the possible ways we can pinpoint specific sound locations is through the timing patterns of neurons in the auditory cortex (Middlebrooks, Xu, Eddins, & Green, 1998). These neurons are called panoramic neurons. The research theorizes that the panoramic neurons use a different firing pattern depending on where the sound is located.
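
A common way to picture ITD detectors is as a bank of coincidence units, each tuned to one candidate delay, with the best-matching unit responding most strongly. The sketch below caricatures that idea with a cross-correlation over candidate lags; it illustrates the detector-bank concept, not the cortical timing code the Middlebrooks group describes.

    import numpy as np

    def best_itd(left, right, max_lag=20):
        # Each candidate lag plays the role of one ITD-tuned detector;
        # the lag with the highest correlation is the detector that
        # responds hardest. Edges are trimmed to avoid wrap-around.
        lags = range(-max_lag, max_lag + 1)
        scores = [np.dot(np.roll(left, lag)[max_lag:-max_lag],
                         right[max_lag:-max_lag]) for lag in lags]
        return list(lags)[int(np.argmax(scores))]

    rng = np.random.default_rng(0)
    source = rng.standard_normal(1000)
    left_ear = source
    right_ear = np.roll(source, 7)   # the right ear hears it 7 samples late,
                                     # so the source sits off to the left
    print(best_itd(left_ear, right_ear))  # prints 7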

The ability to locate sounds is currently the subject of much research. There are many practical applications for this type of research. Helping perceptually handicapped people is one of the main reasons researchers are interested in this field. Both blind and hard-of-hearing people could benefit from breakthroughs in this field of research.

Blind individuals often find it hard to maneuver in an environment, especially if they are unfamiliar with their surroundings. Currently there is research by Loomis into an electronic "Personal Guidance System" that could help blind people get around (Loomis, Golledge, Klatzky, Speigle, & Tietz, 1994). The system includes a computer, an electronic compass, headphones, a receiver, and a transmitter. The receiver picks up signals from GPS (Global Positioning System) satellites in orbit, which lets the system pinpoint the person's location to within one meter. Maps of areas will also be programmed into the computer. These maps will contain information about what objects and features can be found in those areas, such as light poles, buildings, bus stops, and phone booths. Loomis then programmed the computer to tell the person where these objects are located.

Using binaural cues, the computer relays the words to the person as if the object itself were saying them (Loomis et al., 1994). For example, if the blind person is approaching a bus stop, the computer identifies the object based on its location and says the words "bus stop" through the headphones. The computer varies the timing and sound level of the words transmitted to each ear. Because of these differing cues, the brain interprets the words as if they were coming from the bus stop's actual location. Since the person cannot see, the computer makes use of a sensory system that does still work. This system essentially allows blind people to use their sense of hearing instead of sight to avoid objects while traveling. Using what researchers have learned about sound localization, they will be able to help blind people move around much more effectively.
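
Here is a minimal sketch of the binaural trick such a system relies on: take a mono recording of the word, then delay and attenuate it differently for each ear so the brain hears it off to one side. The delay and gain values are rough illustrative choices, not the parameters Loomis et al. actually used.

    import numpy as np

    def place_sound(mono, azimuth_deg, sample_rate=44100):
        # Return (left, right) channels that make a mono sound seem to
        # come from the given azimuth, using only crude ITD and ILD cues.
        # The maximum ITD of ~0.66 ms matches Woodworth's estimate above.
        itd = int(sample_rate * 0.00066 * abs(np.sin(np.radians(azimuth_deg))))
        near_gain, far_gain = 1.0, 0.6        # crude ILD: far ear is quieter
        delayed = np.concatenate([np.zeros(itd), mono])
        leading = np.concatenate([mono, np.zeros(itd)])
        if azimuth_deg >= 0:                  # source on the right side
            return far_gain * delayed, near_gain * leading
        return near_gain * leading, far_gain * delayed

    word = np.random.randn(4410)  # stand-in for a 0.1 s recording of "bus stop"
    left, right = place_sound(word, azimuth_deg=45)

Real systems use measured head-related transfer functions rather than these two cues alone, but even this crude delay-and-attenuate scheme is enough to pull a sound convincingly to one side over headphones.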

The downside to this "Personal Guidance System" project is that it cannot identify objects that are not in the map. Things like people, cars, and objects lying on the sidewalk will not be picked up by the computer. The current apparatus is also far too big and clunky to walk around with. The apparatus is not available to anyone yet because there is still a lot of work that must be done. Even though this system will not be ready for some years, it is still a great step forward for helping blind people. In my opinion this "guidance" system is very worthwhile and deserves future research into practical applications.

Knowing how sound localization works will help people who build places like music halls, opera houses, and movie theatres, because they will know where to place sound-reflective surfaces. Research has been done on the effect sound-reflecting surfaces have on sound localization. Guski (1990) put subjects in an anechoic chamber with twenty-seven speakers placed at different locations around the room. Guski's subjects were then instructed to identify the speaker that produced the sound. In each trial a sound-reflecting surface was placed along a wall, the floor, or the ceiling. It was found that if the reflecting surface was on the ceiling, the subjects could not locate the sound as effectively, while if it was on the floor, the subjects did significantly better at localizing. With this kind of information, architects could design buildings accordingly, in order to get the desired sound reflectance.

There will be research on sound localization for many years to come. Benefits of this research will reach many aspects of life. The more that researchers find out about the process of sound localization, the more they can work around auditory handicaps. Better hearing aids will be a result of this sort of research. Perhaps one day, those with hearing impairments will be able to hear as well as people who do not.

Research into sound localization will be helpful in the realm of computer technology as well. Computer programmers will be able to create better virtual realities with proper sound localization (Banos, 1999). In order for a virtual reality to seem real, all the visual and auditory cues must be accurate according to the senses. If researchers can pinpoint exactly how the localization process works and all of the necessary cues, these can be used to increase the realism of a VR world. These virtual environments could be used for things like video games, learning programs, or even military training exercises. A fantastic, if possibly distant, future use for sound localization research could be something like the Holodecks in the Star Trek television shows. Research shows that virtual sounds can be made to sound real (Loomis, Hebert, & Cicinelli, 1990). Even so, a holographic world would not seem very real if the voices of the characters in the program did not sound as if they were coming from the characters themselves. If computer programmers could figure out how to make the brain "locate" sounds wherever they wanted, it would be a major breakthrough.

Sound localization is something that people generally take for granted. We use it on a daily basis without even realizing it. Scientists have a reasonable idea of how the process works, but it will take more time to understand it fully. The current research being done in this area will be most beneficial, especially to handicapped individuals. Sound localization is an important area of perception and will be studied for many years.

References

1. Banos, R. M. (1999). CyberPsychology & Behavior, 2(2), 143-148.

2. Goldstein, E. (2002). Sensation and perception (Rev. ed.). Pacific Grove, CA: Wadsworth-Thomson Learning.

3. Klingon, G. H., & Bontecou, D. C. (1966). Localization in auditory space. Neurology, 16, 879-886.

4. Loomis, J. M., Golledge, R. G., Klatzky, R. L., Speigle, J. M., & Tietz, J. (1994). Personal guidance system for the visually impaired. Marina del Rey, CA.

5. Loomis, J. M., Hebert, C., & Cicinelli, G. (1990). Journal of the Acoustical Society of America, 88(4), 1757-1764.

6. Middlebrooks, J. C., Xu, L., Eddins, C., & Green, D. M. (1998). Codes for sound-source location in nontonotopic auditory cortex. Journal of Neurophysiology, 80, 863-881.

7. Van Hoesel, J. M. (2003). Speech perception, localization, and lateralization with bilateral cochlear implants. Journal of the Acoustical Society of America, 113(3), 1617-1630.

