Is there any relationship between the frequency of a signal and the distance it travels?

In general, yes: the higher the frequency, the more a signal attenuates over a given distance. Two effects are responsible for this.

First, higher-frequency radio waves tend to be absorbed more readily by objects (i.e., the penetration depth in the material is shorter). You may have noticed this effect with your home WiFi network. If you have a dual-band router, you probably have two networks, one at 2.4 GHz and one at 5 GHz. Often the 2.4 GHz network reaches corners of your house that the 5 GHz network does not - that's because 2.4 GHz signals penetrate walls better than 5 GHz signals do.

In terms of fifth-generation cell networks, this is the principal effect that limits range: people prefer to live in houses and apartment buildings, and those buildings absorb more RF energy at higher frequencies.

The second effect is less intuitive, and falls out of the Friis equation for free-space path loss:

$$ FSPL= 20\log_{10}(d) + 20\log_{10}(f) + 20\log_{10}\left(\frac{4\pi}{c}\right)$$

where \$FSPL\$ is the free-space path loss in dB, \$d\$ is the distance in meters, \$f\$ is the frequency in hertz, and \$c\$ is the speed of light in meters per second. Note that as \$f\$ increases, the loss increases: doubling the frequency adds \$20\log_{10}(2) \approx 6\$ dB of path loss.
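Since this is just arithmetic, it's easy to check. Here is a minimal sketch of the formula above in Python (the function name and the 1 km example figures are my own choices, nothing standard):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, frequency_hz: float) -> float:
    """Free-space path loss in dB between two isotropic antennas,
    per the FSPL formula above."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(frequency_hz)
            + 20 * math.log10(4 * math.pi / C))

print(fspl_db(1000, 2.4e9))  # ~100.0 dB at 1 km, 2.4 GHz
print(fspl_db(1000, 4.8e9))  # ~106.1 dB: doubling f adds ~6 dB
```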

This equation would lead you to believe that free space in some way attenuates higher frequencies more than lower frequencies, but that's not really the truth, even though it's a convenient lie we engineers tell ourselves. The "free space path loss" is actually a geometric effect that captures the "spread" of the electromagnetic wave as it propagates in free space, the same way the beam of a flashlight spreads out when it's shone into a big dark room.
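To see that the spreading itself has no frequency in it, consider an isotropic radiator: its transmit power \$P_t\$ simply spreads over a sphere, so the power density at distance \$d\$ is

$$ S = \frac{P_t}{4\pi d^2} $$

with no \$f\$ anywhere. The frequency only enters when we ask how much of that power density the receiving antenna can collect.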

The reason it's frequency-dependent is that it does not account for the gain of the transmitting and receiving antennas - they are assumed to be isotropic. An isotropic antenna at a higher frequency is physically smaller than one at a lower frequency, which means that geometrically it has a smaller effective aperture.
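Concretely, the effective aperture of an isotropic antenna is

$$ A_e = \frac{\lambda^2}{4\pi} = \frac{c^2}{4\pi f^2} $$

so doubling the frequency quarters the area over which the receiving antenna collects power - exactly the extra 6 dB that the FSPL formula charges.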

If instead you keep the effective aperture of the receiving antenna the same physical size as you go up in frequency (while the transmitter stays isotropic), you find that the path loss is independent of frequency - and if you fix the aperture at both ends of the link, the link actually gets better as frequency increases. However, you would also find that the fixed-aperture antennas are now directional: as the aperture grows relative to the wavelength, a transmitting antenna shoots a tighter and tighter beam, and a receiving antenna can only receive signals from a narrower and narrower cone. Since engineers usually think in terms of preserving a given antenna pattern rather than a given aperture size (since pointing antennas can be problematic), we commonly consider free space to have "loss" that increases with frequency, when in reality that isn't quite the case.
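Here's a minimal numerical sketch of that trade-off in Python (the function names, the 1 km distance, and the 0.01 m² apertures are illustrative assumptions, not standard values):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def pr_fixed_gains(pt, gt, gr, d, f):
    """Friis with frequency-independent gains:
    Pr = Pt*Gt*Gr*(lambda/(4*pi*d))^2 -- loses 6 dB per frequency doubling."""
    lam = C / f
    return pt * gt * gr * (lam / (4 * math.pi * d)) ** 2

def pr_fixed_rx_aperture(pt, ar, d):
    """Isotropic transmitter, fixed-aperture receiver:
    Pr = Pt/(4*pi*d^2) * Ar -- no frequency term at all."""
    return pt * ar / (4 * math.pi * d ** 2)

def pr_fixed_apertures(pt, at, ar, d, f):
    """Fixed effective apertures at both ends:
    Pr = Pt*At*Ar/(lambda*d)^2 -- gains 6 dB per frequency doubling."""
    lam = C / f
    return pt * at * ar / (lam * d) ** 2

for f in (2.4e9, 4.8e9):
    print(f"{f / 1e9:.1f} GHz:",
          f"fixed gains {pr_fixed_gains(1, 1, 1, 1000, f):.2e} W,",
          f"fixed rx aperture {pr_fixed_rx_aperture(1, 0.01, 1000):.2e} W,",
          f"fixed apertures {pr_fixed_apertures(1, 0.01, 0.01, 1000, f):.2e} W")
```

The fixed-gain link drops 6 dB when the frequency doubles, the fixed-receive-aperture link doesn't move, and the fixed-both-apertures link gains 6 dB - which is the sense in which the "loss" is an accounting convention rather than a property of free space.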