Ultrasonic waves through air

There is a comprehensive treatment of this topic in the paper "Absorption of ultrasonic waves in air" by A. Vladišauskas and L. Jakevičius.

We're not supposed to just quote links in answers, but it seems silly to reproduce the entire paper here. The conclusions are that absorption is strongly affected by air temperature and pressure (though I'd guess you're only interested in normal pressure) and also by frequency. At low frequencies and temperatures (around 0 °C) the absorption is very small, but it rises as the temperature increases. For low frequencies (c. 100 kHz) the absorption peaks around 50 °C at about 5 dB/m. For frequencies around 1 MHz the absorption is several hundred dB/m and shows no peak below 100 °C.


The attenuation of sound in air is a function of frequency - the higher the frequency, the greater the attenuation per unit length.
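As a rough sketch of that frequency dependence: classical (Stokes) absorption scales approximately with the square of frequency. Scaling the ~1 dB/m figure quoted below for 50 kHz then gives order-of-magnitude estimates at other frequencies. Both the pure f² law and the reference value are simplifying assumptions; real air absorption also has molecular-relaxation peaks, as the paper above shows.

```python
# Assumed reference point: ~1 dB/m at 50 kHz (20 °C, from the paper cited below).
# Classical absorption scales roughly with frequency squared, so doubling the
# frequency quadruples the attenuation per metre.

def attenuation_db_per_m(freq_hz, ref_freq_hz=50e3, ref_atten_db=1.0):
    """Scale a reference attenuation to another frequency via the f^2 law."""
    return ref_atten_db * (freq_hz / ref_freq_hz) ** 2

print(attenuation_db_per_m(100e3))  # 4.0 dB/m at 100 kHz
print(attenuation_db_per_m(1e6))    # 400.0 dB/m at 1 MHz
```

The 1 MHz estimate of a few hundred dB/m is consistent with the figures quoted from the paper above.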

There is also a natural limit to the amplitude of a sound wave: once the peak pressure is more than twice the ambient pressure, you no longer have a traditional "wave" since air pressure cannot go negative.

Third - "ultrasonic" frequencies start at 20 kHz: the limit of (good) human hearing.

Finally - when the sound pressure of a wave has dropped to 0 dB, one can reasonably say it has become "undetectable": any number of ambient sound sources would add enough noise that extremely sensitive equipment would be needed to have any chance of picking up the signal. 0 dB is the "limit of human hearing".

Putting these facts together, one can calculate the distance that a 20.1 kHz signal travels in ordinary air (20°C, 40% relative humidity) when the sound source at the origin has a pressure of 194 dB (the theoretical limit of undistorted sound). We will assume this pressure is measured at a distance of one wavelength (about 1.7 cm) from the transducer; then we have both the inverse square law and the attenuation law to contend with.
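A quick check of that wavelength figure, taking the speed of sound in air at 20 °C as roughly 343 m/s:

```python
# Wavelength of a 20.1 kHz tone in air at 20 °C (c ≈ 343 m/s assumed).
c = 343.0        # speed of sound in air, m/s
f = 20.1e3       # frequency, Hz
wavelength_cm = 100 * c / f
print(f"wavelength ≈ {wavelength_cm:.1f} cm")  # ≈ 1.7 cm
```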

From the inverse square law, we lose 6 dB for every doubling of distance. If that were the only mechanism reducing the amplitude, the 194 dB signal would take $194/6 \approx 32$ doublings of distance to drop to zero - about $2^{32}$ times the starting radius of one wavelength, which works out to tens of thousands of kilometres. That is an enormous distance, so the attenuation coefficient must set the real limit.
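That estimate takes only a few lines to reproduce; the ~1.7 cm starting radius is the one-wavelength assumption from above.

```python
# Inverse-square spreading alone: 6 dB lost per doubling of distance,
# starting from one wavelength (~1.7 cm, an assumption) at 194 dB.
source_db = 194.0
start_m = 0.017                         # one wavelength at 20.1 kHz, m
doublings = source_db / 6.0             # ≈ 32.3 doublings to reach 0 dB
distance_m = start_m * 2 ** doublings
print(f"{doublings:.1f} doublings -> ~{distance_m / 1e3:.0f} km")
```

The result is on the order of $10^5$ km, so spreading loss alone clearly cannot be what limits ultrasound range in air.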

According to the paper that John Rennie already referenced ("Absorption of ultrasonic waves in air", A. Vladišauskas and L. Jakevičius, ISSN 1392-2114 ULTRAGARSAS, Nr. 1(50), 2004), the attenuation of 50 kHz ultrasound at 20 °C and 60% relative humidity is roughly 1 dB/m.

Dividing the 194 dB budget by 1 dB/m quickly gives an absolute upper limit of approximately 200 m for any ultrasound signal transmission in air. "Real world" conditions will only reduce this.
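As a sketch, the two loss mechanisms can also be combined numerically: find the range at which inverse-square spreading plus ~1 dB/m of absorption (the 50 kHz figure above) has consumed the whole 194 dB budget. The ~1.7 cm reference radius and the simple additive loss model are assumptions for illustration.

```python
from math import log10

# Combined loss model: inverse-square spreading plus ~1 dB/m absorption
# (the 50 kHz, 20 °C figure quoted above). Source level: 194 dB measured
# at r0 ≈ 1.7 cm (one wavelength). All reference values are assumptions.
r0 = 0.017       # reference radius, m
alpha = 1.0      # absorption coefficient, dB/m

def total_loss_db(r):
    """Spreading loss (20 log10 of the distance ratio) plus absorption."""
    return 20 * log10(r / r0) + alpha * r

# Bisect for the range at which the full 194 dB budget is used up.
r_lo, r_hi = r0, 1000.0
for _ in range(60):
    mid = (r_lo + r_hi) / 2
    if total_loss_db(mid) < 194.0:
        r_lo = mid
    else:
        r_hi = mid
print(f"combined-loss range ≈ {r_lo:.0f} m")
```

Under these assumptions the range comes out somewhat below the absorption-only bound, since spreading eats the first ~75 dB of the budget before absorption takes over.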