Is it possible for a sound to be louder as you move away from it?

The only way that I can envision a sound getting louder as you walk away is if you happen to initially be located at a node where destructive interference causes the waves to be near zero amplitude.

When the waves are exactly out of phase at the point where you stand, you will hear nothing. As you move in any direction away from that point, the sound gets louder. This of course requires either two sound sources or a solid boundary that reflects the original wave with a phase shift (an echo, but you ruled that out as a possibility), plus carefully positioning yourself at a destructive node before you start.
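A minimal sketch of this setup, with made-up numbers: two in-phase point sources a few metres apart, in a far-field idealization that ignores the $1/r$ amplitude falloff so that the cancellation at the node is exact.

```python
import numpy as np

# Hypothetical setup: two in-phase point sources 3 m apart, wavelength 1 m.
# Ignoring the 1/r amplitude falloff (a far-field idealization), the net
# amplitude at a point is set purely by the path-length difference.
wavelength = 1.0
k = 2 * np.pi / wavelength
src_a = np.array([0.0, 0.0])
src_b = np.array([3.0, 0.0])

def amplitude(p):
    """Magnitude of the sum of the two unit-amplitude phasors at point p."""
    r1 = np.linalg.norm(p - src_a)
    r2 = np.linalg.norm(p - src_b)
    return abs(np.exp(1j * k * r1) + np.exp(1j * k * r2))

# At (1.75, 0) the path difference is 0.5 m = half a wavelength: a node.
node = np.array([1.75, 0.0])
print(amplitude(node))        # ~0: complete cancellation

# Stepping along the source axis away from the node, the sound gets louder:
for dx in (0.05, 0.10, 0.15):
    print(dx, amplitude(node + np.array([dx, 0.0])))
```

With a real point source the $1/r$ falloff makes the cancellation a deep minimum rather than exactly zero, but the qualitative picture is the same.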


No, but it can sound like it.

If two sounds have similar frequencies, we hear only the louder one (a phenomenon exploited to help compress music files). This is called auditory masking. Now, if you have a loud sound far away and a less loud sound close to you, the nearby one will sound louder and will mask the distant one. If you move away from both of them, then because of the inverse square law the near sound gets softer faster than the far-away one, so the far-away sound is no longer masked. This may sound to you like the farther-away sound is getting louder as you move away from it, even though in reality it isn't.
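The inverse-square part of that argument can be checked with a quick sketch (all numbers here are hypothetical: a powerful source 100 m away and a weak one 2 m away):

```python
import math

# Hypothetical numbers: a powerful source 100 m away and a weak one 2 m away.
P_far, r_far = 1000.0, 100.0   # acoustic power (W) and distance of the far source
P_near, r_near = 1.0, 2.0      # the nearby, quieter source

def intensity(power, r):
    # Free-field inverse-square law (ignoring absorption and reflections).
    return power / (4 * math.pi * r ** 2)

# Up close, the near source dominates (and would mask the far one); step back
# by the same distance from both and its intensity collapses much faster.
for step in (0.0, 10.0, 50.0):
    near = intensity(P_near, r_near + step)
    far = intensity(P_far, r_far + step)
    print(f"step={step:5.1f} m  near={near:.2e}  far={far:.2e}  near louder: {near > far}")
```

After stepping back only 10 m the ordering flips: the far source now dominates, so in the masking picture it would seem to "appear" as you walk away.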


I think that in general, as you move away from a sound it gets softer because its energy spreads out. However, I can see possibilities using exotic configurations of air density where the sound does get louder. For example, imagine that the air density increases with distance while the bulk modulus stays the same. Then, as you move away from the source, the sound velocity will drop, allowing a buildup of sound pressure, somewhat like in a sonic boom.
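The sound-speed claim follows from $c = \sqrt{K/\rho}$ for a fluid with bulk modulus $K$ and density $\rho$; a quick sketch with an approximate value for air (the doubling densities are purely hypothetical, standing in for the exotic gradient described above):

```python
import math

# Speed of sound in a fluid: c = sqrt(K / rho).
# If density rises while the bulk modulus K stays fixed, c drops.
K = 1.42e5                      # adiabatic bulk modulus of air, Pa (approx.)

for rho in (1.2, 2.4, 4.8):     # kg/m^3: hypothetical increasing density
    c = math.sqrt(K / rho)
    print(f"rho = {rho:.1f} kg/m^3  ->  c = {c:.0f} m/s")
```

At normal air density (about 1.2 kg/m³) this reproduces the familiar ~343 m/s, and each doubling of density cuts the speed by a factor of √2.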

Another, more subjective possibility is that the sound experiences nonlinear effects such as frequency-dependent modulation and dispersion. In that case, a powerful sound source emitting sound above your hearing threshold would be inaudible up close, but due to nonlinear effects, more of that high-frequency energy would be modulated down to audible frequencies farther away.

From OP's comment

If sources A and B produce intensities $a$ and $b$ at unit distance, and you currently stand at distances $r_A$ and $r_B$ from them, then moving a further distance $\delta$ directly away from both simultaneously gives:

$I_A(\delta)=\frac{a}{(r_A+\delta)^2},\quad I_B(\delta)=\frac{b}{(r_B+\delta)^2}$, assuming inverse-square energy spreading. What we want to know is how the intensities change as we move away from both sources when $r_A>r_B$ and $\frac{a}{r_A^2}>\frac{b}{r_B^2}$, i.e. A is farther away but currently louder. One way to look at this is the ratio of the two intensities as you move away:

$$\frac{I_A(\delta)}{I_B(\delta)} = \frac{a(r_B+\delta)^2}{b(r_A+\delta)^2} = \frac{a}{b}\left(\frac{r_B+\delta}{r_A+\delta}\right)^2, \qquad \lim_{\delta\to\infty}\frac{a}{b}\left(\frac{r_B+\delta}{r_A+\delta}\right)^2 = \frac{a}{b}$$

Therefore, if the farther source A is the one you currently perceive as louder, moving away from both will never let B become louder than A, let alone make the sound overall get louder.
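A numeric check of that ratio, with made-up values satisfying the stated condition (A farther away but currently louder, $\frac{a}{r_A^2} > \frac{b}{r_B^2}$):

```python
# Made-up values satisfying the condition: a / r_A^2 = 4 > b / r_B^2 = 1.
a, b = 100.0, 1.0
r_A, r_B = 5.0, 1.0

def ratio(delta):
    """I_A / I_B after stepping a distance delta away from both sources."""
    return (a / (r_A + delta) ** 2) / (b / (r_B + delta) ** 2)

for delta in (0.0, 10.0, 1000.0):
    print(delta, round(ratio(delta), 3))
# The ratio climbs monotonically from (a/b)(r_B/r_A)^2 = 4 toward the
# limit a/b = 100, and so never drops below 1: A stays louder throughout.
```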

Anyway, just a couple of out-there thoughts.