2nd Pan-American/Iberian Meeting on Acoustics



Do We Need Two Ears to Perceive the Distance of a Sound Source in a Room?



Arnaud Bidart
Mathieu Lavandier
Université de Lyon, ENTPE-DGCB (CNRS)
Rue M. Audin, 69518 Vaulx-en-Velin Cedex, FRANCE


Popular version of paper 3aAAb5
Presented Wednesday morning, November 17, 2010
2nd Pan-American/Iberian Meeting on Acoustics, Cancun, Mexico



The aim of the present study was to understand which sound characteristics humans use to evaluate the distance of a source in a room. Outdoors, where the only sound reaching a listener comes directly from its source, sound intensity can be used to evaluate source distance. This evaluation is relative: the listener needs to know a priori the sound level produced by the source, or can only compare different distances assuming a constant source level. In a room, the listener receives not only the direct sound coming from the source, but also all the sound reflections off the room boundaries, the so-called reverberation or reverberant sound (Figure 1). Intensity is then a less reliable cue for evaluating source distance because, if the source is sufficiently far from the listener, the intensity of the sound it produces at the listener's ears is mainly determined by the level of the reverberant sound, which is independent of source distance (whereas the direct sound level decreases with source distance). Despite the loss of this intensity cue, the absolute evaluation of source distance is possible in a room, even if humans are not very accurate at this task.
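As a rough numerical illustration of this point (not part of the study), the sketch below assumes a direct sound that loses 6 dB per doubling of distance and a reverberant sound at a fixed level; the 70 dB and 55 dB values are arbitrary illustrative choices. Beyond a few metres, the total level at the ears is dominated by the distance-independent reverberant sound.

```python
import math

def direct_level_db(distance_m, level_at_1m_db=70.0):
    """Direct sound level: -6 dB per doubling of distance (inverse square law)."""
    return level_at_1m_db - 20.0 * math.log10(distance_m)

def total_level_db(distance_m, level_at_1m_db=70.0, reverb_level_db=55.0):
    """Sum of direct and (distance-independent) reverberant energies, in dB."""
    direct = 10.0 ** (direct_level_db(distance_m, level_at_1m_db) / 10.0)
    reverb = 10.0 ** (reverb_level_db / 10.0)
    return 10.0 * math.log10(direct + reverb)

for d in (0.5, 1.0, 2.0, 4.0, 8.0):
    print(f"{d:4} m: direct {direct_level_db(d):5.1f} dB, total {total_level_db(d):5.1f} dB")
```

At large distances the total level flattens out near the assumed 55 dB reverberant level, which is why intensity alone says little about how far away the source is.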

Figure 1. Direct (red) and reverberant (blue) sounds in a room.

The direct-to-reverberant energy (D/R) ratio has been proposed as the underlying cue for this evaluation. This acoustical attribute is obtained by comparing the direct sound energy to the reverberant sound energy. It decreases systematically with source distance, following the decrease of direct sound energy, and it is also independent of source level (both direct and reverberant sounds are proportional to the source level, which cancels out of the ratio). The D/R ratio can be calculated from the room impulse response (the sound measured in the room for a source emitting an impulse, so that direct and reverberant sounds can be easily separated), but listeners do not have access to this room response when they listen to sounds that are not impulses. Listening to a stationary noise source, for example, how could the listener differentiate the direct and reverberant sounds in order to compute the D/R ratio? If it is the basis of distance perception, how is the D/R ratio extracted from real-life signals? Moreover, previous studies have shown that distance perception at a fixed D/R ratio can depend on the source type and direction, on the laterality of room reflections, and on the similarity of the signals at the two ears (in one study, distance perception disappeared for signals identical at the ears). The D/R ratio, a monaural attribute, cannot predict these effects. The interaural coherence of the source, which evaluates the similarity of the signals produced at the two ears, might then be an alternative cue for auditory distance.
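To make the D/R ratio concrete, here is a minimal sketch of how it can be computed from a measured room impulse response. The 2.5 ms window for isolating the direct sound and the toy synthetic impulse response are illustrative assumptions, not the study's method.

```python
import numpy as np

def dr_ratio_db(rir, fs, split_ms=2.5):
    """Direct-to-reverberant energy ratio (dB) from a room impulse response.

    The direct sound is taken as everything within `split_ms` milliseconds
    of the strongest peak; the rest is treated as reverberation.
    """
    peak = int(np.argmax(np.abs(rir)))
    split = peak + int(split_ms * 1e-3 * fs)
    direct_energy = np.sum(rir[:split] ** 2)
    reverb_energy = np.sum(rir[split:] ** 2)
    return 10.0 * np.log10(direct_energy / reverb_energy)

# Toy impulse response: a strong direct peak followed by a decaying noise tail.
fs = 16000
rng = np.random.default_rng(0)
tail = rng.standard_normal(fs // 2) * np.exp(-np.arange(fs // 2) / (0.1 * fs))
rir = np.concatenate(([1.0], 0.05 * tail))
print(f"D/R ratio: {dr_ratio_db(rir, fs):.1f} dB")
```

Note that scaling the whole impulse response (i.e. changing the source level) leaves the ratio unchanged, which is exactly the level-independence property described above. The catch raised in the text is that this separation requires the impulse response itself, which a listener hearing ongoing noise or speech never observes directly.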

This question was investigated in a series of listening experiments, in which perceived distance was measured in a virtual room using headphones (Figure 2). The room was simulated using room-acoustics software, which generates the sound produced by a source at a listener's ears, as if the source and listener were at any given position within the room. The resulting signals were reproduced over headphones in a double-walled soundproof booth, so that the signals presented to listeners could be rigorously controlled. These signals were all equalized in level to eliminate the potential use of intensity cues and focus on reverberation cues. After each sound presentation, the listener was asked to report the perceived distance of the source. Thirty listeners participated in each experiment. Mean standardized results were considered, discarding potential differences in the scales used by listeners (we were interested in differences in perceived distance rather than in absolute distance evaluation).
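One common way to standardize responses before averaging, sketched below with invented example data, is to convert each listener's judgments to z-scores; whether the study used exactly this transform is an assumption, but it illustrates how scale differences between listeners are removed while the pattern across distances is kept.

```python
import statistics

def standardize(responses):
    """Convert one listener's distance judgments to z-scores,
    removing differences in the numeric scale each listener used."""
    mean = statistics.mean(responses)
    sd = statistics.stdev(responses)
    return [(r - mean) / sd for r in responses]

# Two hypothetical listeners judging the same six sources on different scales:
listener_a = [0.8, 1.5, 2.0, 3.5, 5.0, 7.0]
listener_b = [2.0, 4.0, 5.0, 9.0, 13.0, 18.0]  # same ordering, larger numbers
za, zb = standardize(listener_a), standardize(listener_b)
mean_z = [(a + b) / 2.0 for a, b in zip(za, zb)]
print([round(z, 2) for z in mean_z])
```

After standardization, both hypothetical listeners contribute comparably to the mean even though one used much larger numbers than the other.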

Figure 2. Schematic of the experimental protocol.

Three source types were considered: bursts of noise, speech (one sentence, “Toute la nuit”, meaning “All night long”), and clicks. Six distances were simulated, ranging from 0.5 to 10 m. Two listening modes were tested: diotic listening, where only the signal arriving at one ear (left or right) is kept and sent to both headphones, and binaural listening, where the left/right ear signals are sent to the left/right headphone, respectively. For binaural signals, the D/R ratio and interaural coherence both decreased with increasing distance: the further away the source, the greater the relative contribution of the reverberant sound (compared to the direct sound), and the more different the signals at the two ears (room reflections tend to differ across the ears). For diotic signals, however, coherence was constant and maximal (identical signals at the ears) whereas the D/R ratio still decreased with increasing distance. Comparing the two listening modes thus allowed us to tease apart the two acoustical attributes and better understand the nature of auditory distance perception.
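Interaural coherence can be sketched as the maximum of the normalized cross-correlation between the two ear signals over physiologically plausible interaural delays (about ±1 ms). The implementation and signals below are an illustrative toy, not the study's analysis: identical ear signals (the diotic case) give a coherence of 1, while partly decorrelated signals, as produced by reverberation at a distant source, give a lower value.

```python
import numpy as np

def interaural_coherence(left, right, fs, max_lag_ms=1.0):
    """Maximum normalized cross-correlation between the ear signals,
    searched over lags up to the maximum interaural delay (~1 ms)."""
    max_lag = int(max_lag_ms * 1e-3 * fs)
    left = left - left.mean()
    right = right - right.mean()
    denom = np.sqrt(np.sum(left ** 2) * np.sum(right ** 2))
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            num = np.sum(left[lag:] * right[:len(right) - lag])
        else:
            num = np.sum(left[:lag] * right[-lag:])
        best = max(best, num / denom)
    return best

fs = 16000
rng = np.random.default_rng(1)
sig = rng.standard_normal(fs)
noise = rng.standard_normal(fs)
print(interaural_coherence(sig, sig, fs))                # diotic: close to 1
print(interaural_coherence(sig, 0.5 * sig + noise, fs))  # decorrelated: < 1
```

This is the sense in which diotic presentation pins coherence at its maximum while the D/R ratio is still free to vary with simulated distance.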

Figure 3. Perceived distance as a function of simulated noise source distance, for binaural and diotic (right or left ear signal) listening.

The main result of the study is that listeners did not need two ears to evaluate the sound source distance in our experiments: binaural and diotic signals led to the same perceived distance for stationary signals such as noise (Figure 3). Distance perception was not disrupted for sounds identical at the two ears; it was monaural and based on room colouration. Because materials on the room boundaries tend to absorb high frequencies more than low frequencies, the reverberant sound contained more energy at the low end of the spectrum. Thus, when the simulated source was moved away from the listener and the relative contribution of the reverberant sound to the overall spectrum at the ears increased, the spectral balance of the sounds shifted towards low frequencies. This spectral attribute was the predominant cue used by listeners.
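This colouration cue can be illustrated with a toy low-versus-high frequency energy ratio. Everything here is an illustrative assumption (the 1 kHz split, the crude running-average "reverberant" noise): as the low-pass-tilted reverberant component grows relative to the flat direct component, mimicking a receding source, the spectral balance measure tilts towards low frequencies.

```python
import numpy as np

def low_high_ratio_db(signal, fs, cutoff_hz=1000.0):
    """Ratio (dB) of energy below vs above `cutoff_hz`, from the FFT spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    low = spectrum[freqs < cutoff_hz].sum()
    high = spectrum[freqs >= cutoff_hz].sum()
    return 10.0 * np.log10(low / high)

fs, n = 16000, 16000
rng = np.random.default_rng(2)
direct = rng.standard_normal(n)  # spectrally flat "direct" sound
# Crude low-frequency emphasis for the "reverberant" part (running average).
reverb = np.convolve(rng.standard_normal(n), np.ones(8) / 8.0, mode="same")
for reverb_gain in (0.5, 1.0, 2.0):  # larger gain ≈ more distant source
    mix = direct + reverb_gain * reverb
    print(f"reverb x{reverb_gain}: {low_high_ratio_db(mix, fs):+.1f} dB")
```

The ratio grows with the reverberant contribution, which is the kind of spectral shift the listeners appear to have relied on.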

When comparing distance evaluations for noise and for “cut” noise, from which the reverberant tail at the end of the sound was eliminated (Figure 4), it appeared that the cut noise was perceived as closer. Reverberation tails and the associated reverberance (the perceived level of reverberation) thus also played a role in distance perception: the more reverberant the sound (the longer the reverberant tail), the further away the source was perceived. Reverberation tails are heard on the transient parts of signals, so they play a more important role for speech than for noise. Because our binaural system is able to partly de-reverberate a signal (the so-called binaural squelch), binaural speech was perceived as less reverberant than diotic speech, and thus as closer (Figure 5). Two ears are not required for auditory distance perception; they even slightly reduce the perceived distance associated with reverberant tails and reverberance. Finally, the experiment with clicks was difficult to interpret, because intensity cues might not have been eliminated for these particular signals.

Figure 4. Comparison of perceived source distance for a noise with (red) and without (green) a reverberant tail.

Figure 5. Same as Figure 3, but for a speech source instead of a noise source. 
