Why do power grids tend to operate at low frequencies like 60 Hz and 50 Hz?

Higher frequencies are much more affected by the inductance of the power lines: inductive reactance grows linearly with frequency. 400 Hz is fine on an aircraft, where the cable runs are short, but over long transmission distances the power factor would be extremely poor. 60 Hz was an educated guess (as I understand it), but it has turned out to be about right.
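
To put rough numbers on that, here is a small Python sketch. The line parameters are purely illustrative assumptions (about 1 mH/km series inductance, 0.03 ohm/km resistance, 300 km length, which are typical orders of magnitude for overhead lines, not a real design); it just shows how the series reactance grows linearly with frequency and swamps the line's resistance:

```python
import math

# Illustrative only: how the series inductive reactance of a long line
# grows with frequency.  All parameters below are assumed typical values.
L_PER_KM = 1e-3              # H/km, assumed series inductance
LENGTH_KM = 300              # km, assumed line length
R_TOTAL = 0.03 * LENGTH_KM   # ohm, assumed ~0.03 ohm/km series resistance

for f in (50, 60, 400):
    x_total = 2 * math.pi * f * L_PER_KM * LENGTH_KM   # series reactance, ohm
    # Impedance angle of the line itself: at higher frequency the line is
    # almost purely reactive, so far more VArs circulate per watt delivered.
    angle = math.degrees(math.atan2(x_total, R_TOTAL))
    print(f"{f:>3} Hz: X = {x_total:7.1f} ohm, line impedance angle = {angle:4.1f} deg")
```

At 60 Hz the assumed line has roughly 113 ohm of reactance; at 400 Hz it is over 750 ohm, which is why 400 Hz only makes sense when the wiring is short.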


What @Frog says about losses is true; however, that's not the real reason for the utility frequency being around 50-60 Hz. HVDC systems have essentially no reactive losses, yet they have not really become widespread.

The choice of utility frequency is largely historical, and frequencies outside the 25-100 Hz range were simply prohibitive around 1900 from a technology point of view: 25 Hz and below were too low for most consumer applications and required bulky generators and transformers, while 100 Hz and above could only be generated with belt-driven generators, which were already being replaced by direct-coupled alternators because of the latter's higher reliability.


The frequency of the power grid is a great compromise.

Make the frequency higher and you get smaller (read: cheaper) transformers and (somewhat) smaller generators and motors. But you also get higher hysteresis losses in the transformer cores and higher radiative losses in long power lines.
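
A back-of-the-envelope Python sketch of that trade-off, using the standard transformer EMF equation (V_rms = 4.44 f N A B_max, so the required core cross-section scales as 1/f) and a Steinmetz-style hysteresis estimate (loss per unit of core roughly proportional to f). The voltage, turns, flux density and Steinmetz coefficients below are assumed illustrative numbers, not a real design:

```python
import math

# Transformer EMF equation: V_rms = 4.44 * f * N * A_core * B_max
#   -> for fixed voltage, turns and flux density, core area ~ 1/f.
# Steinmetz hysteresis loss per unit volume: P_h ~ k_h * f * B_max^n
#   -> loss per unit of core material grows roughly linearly with f.
V_RMS = 230.0                   # V, assumed winding voltage
N_TURNS = 500                   # assumed
B_MAX = 1.2                     # T, assumed peak flux density
K_H, N_STEINMETZ = 0.02, 1.6    # assumed Steinmetz coefficients

for f in (25, 50, 60, 400):
    a_core = V_RMS / (4.44 * f * N_TURNS * B_MAX)   # required core area, m^2
    p_hyst = K_H * f * B_MAX ** N_STEINMETZ         # relative loss per unit volume
    print(f"{f:>3} Hz: core area ~ {a_core*1e4:6.2f} cm^2, "
          f"hysteresis loss ~ {p_hyst:5.2f} (relative)")
```

The required core shrinks as frequency goes up, but the per-kilogram core loss climbs with it, which is exactly the compromise described above.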

About 100 years ago, these considerations led to the conclusion that frequencies in the 20-100 Hz range are OK-ish. US engineers settled on 60 Hz, liking that the number matched the other time divisions they used (60 seconds per minute, 60 minutes per hour). European engineers (1-2 years later to the party) preferred 50 Hz for being built only from factors of 5 and 2, just like the other unit divisions they used.

Other, independent power grids use other frequencies (like the Swiss railways at 16 2/3 Hz) because those fit their purposes better.

Skin effect is not really a consideration, as most power conductors are stranded from a number of smaller wires for mechanical reasons anyway.
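
For a sense of scale, a small sketch computing the skin depth in aluminum, delta = sqrt(rho / (pi * f * mu)). The material constants are standard; the strand radius is an assumed, typical value for an ACSR-style conductor:

```python
import math

# Skin depth delta = sqrt(rho / (pi * f * mu)).  At power frequencies the
# skin depth in aluminum is around a centimetre, much larger than an
# individual strand, so skin effect adds little effective resistance.
RHO_AL = 2.82e-8            # ohm*m, resistivity of aluminum
MU_0 = 4 * math.pi * 1e-7   # H/m, permeability (non-magnetic conductor)
STRAND_RADIUS_MM = 2.0      # mm, assumed typical strand radius

for f in (50, 60, 400):
    delta_mm = math.sqrt(RHO_AL / (math.pi * f * MU_0)) * 1e3
    print(f"{f:>3} Hz: skin depth ~ {delta_mm:5.1f} mm "
          f"(strand radius ~ {STRAND_RADIUS_MM} mm)")
```

At 50-60 Hz the skin depth comes out around 11-12 mm, so current still fills the individual strands almost uniformly; only at substantially higher frequencies would it start to matter.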