
The Scicurious Brain

The Good, Bad, and Weird in Physiology and Neuroscience

How does your brain hear distance?

The views expressed are those of the author and are not necessarily those of Scientific American.


Your cellphone rings. You don’t remember where you left it, but you can hear the ring and you know it’s in the next room. The question is, how did you know?

Our eyes judge distance by comparing the views from the left and right eyes (going blind in one eye will impair your ability to judge visual distance). But what about your ears? The ears have an extra problem: they can easily be fooled by the intensity of a sound and confuse it with distance. A soft sound nearby and a loud sound heard from very far away can arrive at your ear at the same intensity, the same decibel level…and yet we know that one is close and one is far. How do we know? You might say that, as with our eyes, we could compare the left and right ears (and this is what I learned in school), but in fact we can’t rely on that alone. Unless the sound is coming from directly in front of us, the difference in intensity between your two ears is constantly changing: you’re always at a slightly different angle relative to the sound, making the comparison between left and right ears that much harder, as it would require constant adjustment. There has to be another way.
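That intensity/distance ambiguity is easy to see with the inverse-square law, under which a free-field point source loses about 6 dB of sound pressure level for every doubling of distance. Here’s a minimal sketch (the source levels and distances are made-up numbers, and real rooms with echoes behave differently):

```python
import math

def level_at_distance(level_at_1m_db, distance_m):
    # Idealized free-field point source: level falls by
    # 20*log10(distance), i.e. ~6 dB per doubling of distance.
    return level_at_1m_db - 20 * math.log10(distance_m)

near = level_at_distance(60.0, 1.0)  # quiet source, 1 m away
far = level_at_distance(72.0, 4.0)   # much louder source, 4 m away
print(round(near, 1), round(far, 1))  # → 60.0 60.0
```

Both sources reach the listener at roughly the same 60 dB, so intensity alone can’t tell you which one is far away.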

Kopco et al. “Neuronal representations of distance in human auditory cortex” PNAS, 2012.

We know that when a loud sound is played, our auditory neurons respond, but they respond in a pretty haphazard fashion. At lower decibel levels, by contrast, we can distinguish sounds of the same intensity played at different distances (we do this best at distances of up to about 100 cm). The question is how we tell the difference.

To look at this question, the authors of this study took 12 people, and exposed them to sound stimuli at different distances.

What you can see above is the layout of the experiment. The participant is in an fMRI machine, being scanned for changes in blood-oxygen signal in the brain. During the scan, the person receives pairs of tones in the right ear, played at different distances, and has to pick which tone is farther away. In this task there are several different conditions: the sounds could differ in intensity (decibel level), they could differ in distance, or they could be constant.

The authors were looking for two things: first, whether the participants could distinguish the different distances, and second, where in the brain that distinguishing might be taking place.

First, you can see above that the participants were very good at distinguishing the differences in distance, with 70% correct responses even for the noises with the sources closest together. But where were the signals being processed?

What you can see here are the fMRI signals during the sound testing. The top pair is a constant pair of tones, at constant intensity and constant distance, compared to a condition where the person was listening for a sound, but only received quiet. Unsurprisingly, the areas of activation are focused on the temporal lobe (though most likely there was a lot of activation elsewhere as well, this is just where they were looking), where there is a lot of auditory processing. You can see the signal is stronger when the person is actually hearing something, rather than just expecting to hear something.

The second row of signals shows varying sound intensity (decibel level, or heck, “loudness”) compared to the control condition, and the third row compares varying sound distance against the control condition. But the final row is the most interesting: here they compare intensity against distance, and find a difference in processing in a very small part of the temporal lobe (though I would like to see how the subtraction compared for the control conditions as well). The authors hypothesize that this area, which sits near other auditory pathways known to process things like the direction of a sound, might process its distance as well.

What they were relying on to reveal this result is stimulus-specific adaptation: basically, your brain shows reduced responses to something repetitive. A behavioral correlate would be, say, entering a room with a white noise generator. At first you notice it, but after a while it recedes into the background and you cease to notice it. Another example is how most people cease to feel their clothing rubbing on them during the day (though I bet you’re thinking about it now!). These are stimulus-specific adaptations. They occur readily with sound, and the brain responds less and less to such repetitive stimuli. So the authors were hoping that stimulus-specific adaptation to the constant condition would let the subtle differences in distance and intensity come through.
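As a toy illustration of the idea (this is just a sketch I made up, not the model the authors used), you can think of adaptation as a per-stimulus response gain that shrinks with every repetition but stays fresh for anything novel:

```python
def adapted_responses(stimuli, adapt_rate=0.5, fresh_gain=1.0):
    """Toy stimulus-specific adaptation: the response to a repeated
    stimulus shrinks, while a novel stimulus evokes a full response."""
    gain = {}  # per-stimulus response gain, starts at fresh_gain
    responses = []
    for s in stimuli:
        g = gain.get(s, fresh_gain)
        responses.append(g)
        gain[s] = g * adapt_rate  # only this stimulus adapts
    return responses

# A repeating tone 'A' adapts away; a novel tone 'B' pops out at full strength
print(adapted_responses(['A', 'A', 'A', 'B']))  # → [1.0, 0.5, 0.25, 1.0]
```

Because the repeated (constant) condition produces a weak response, any change in distance or intensity stands out against it, which is what lets the subtraction between conditions reveal where each property is processed.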

Keep in mind, though, that this is an fMRI study, and it comes with all the usual caveats of fMRI studies: we don’t always know what we’re looking at. It’s a small population of people, and an fMRI signal can mean almost nothing (you can, after all, get fMRI signal off a dead salmon). The area they found is very small, and with such a small sample it could be an artifact. I’m not saying this study is wrong, but it needs to be replicated and studied carefully. I do like the care they took to perform all the controls, varying distance and intensity and comparing the two. And I’m interested to hear more about how we process differences in distance; for something so important to our daily lives, it’s fascinating to realize how little we know about it.

Kopco N, Huang S, Belliveau JW, Raij T, Tengshe C, & Ahveninen J (2012). Neuronal representations of distance in human auditory cortex. Proceedings of the National Academy of Sciences of the United States of America, 109 (27), 11019-24 PMID: 22699495

About the Author: Scicurious has a PhD in Physiology, and is currently a postdoc in biomedical research. She loves the brain. And so should you. Follow on Twitter @Scicurious.



Comments

  1. jhewitt123 7:19 pm 07/13/2012

    It seems the question of distance localization might be intimately convolved, at the logical and anatomical level, with the question of directional localization. There may be hot spots where distance localization is optimal. Localization directly in front may not be optimal due to the symmetry. When the pinna is altered experimentally there is a definite loss of directional localization, angular and azimuthal, and I would suspect distance localization is affected as well. I wonder if we can effect accurate localization with one ear, analogous to distance perception using one eye, perhaps through some kind of triangulation across a single retina. Is the eardrum just a single auditory pixel, or is it more of a 2d array that can serialize the acoustic wavefront onto the inner ear’s impedance-matching bone system? Can the ear (or, at higher intensities, the whole body’s acoustic perception abilities) measure the angular profile of a point source? If so, close sources would not be flat and could be gauged. Distant sources would be localized by alternative effects like echo and selective frequency attenuation, contrasted against “known candles,” to borrow the term from astronomical distance measurement. Simplifying the experimental stimulus to a more soliton-like single pulse may help in studies.

  2. 11:15 am 07/31/2012

    This is one of those questions already thoroughly answered by any sound engineer. Higher frequencies travel short distances, while low frequencies travel further. Any time you make a sound, there are high frequencies and low frequencies. The human brain is designed/adapted to give higher frequencies (3–4 kHz) priority, as those sounds are more likely closer and more imminent. If you want to make a sound sound closer or have more “presence,” you augment the higher frequencies on your equalizer.
    Here’s a very common example everyone can relate to. Say there’s a thunderstorm and you hear a really high-pitched crack of lightning, and your best buddy says, “Holy crap, man! That was like right on top of us!” But if all you heard was a really low rumble, you know the lightning is quite far away.


