Why should a receiving antenna be impedance matched to 50 ohm instead of a lower impedance?

However, what if you are only using your antenna to receive?

An antenna (e.g. monopole) has a complex impedance characteristic versus frequency: -

[Plot: monopole feed-point resistance and reactance versus antenna length in wavelengths]

Picture from my answer here.

So, if you are tuned to a frequency that makes the antenna exactly a quarter wavelength long (0.25 on the X axis above), it presents an impedance that is roughly 37 ohms resistive plus about 21 ohms of inductive reactance.

This means that you can maximize receive power transfer with a little series capacitance to cancel the inductive reactance and "produce" a receiving source that is a pure 37 ohms. Feed this into a matching 37 ohm load to extract the most receive power.
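To put rough numbers on that, here is a minimal sketch of how the cancelling capacitor would be sized; the 100 MHz operating frequency is purely an assumed example, not a figure from the plot above:

```python
import math

# Hypothetical example: quarter-wave monopole at an assumed 100 MHz,
# presenting roughly 37 + j21 ohms at its feed point (impedance values
# read from the plot above; the frequency is an assumption).
f = 100e6          # operating frequency, Hz (assumed)
X_L = 21.0         # inductive reactance to cancel, ohms

# A series capacitor with reactance -j21 ohms cancels the +j21 ohms:
# X_C = 1 / (2*pi*f*C)  =>  C = 1 / (2*pi*f*X_C)
C = 1 / (2 * math.pi * f * X_L)
print(f"Series capacitance: {C * 1e12:.1f} pF")   # ~75.8 pF

# What remains is ~37 ohms purely resistive, so a 37 ohm load
# extracts the maximum available receive power.
```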

But there's really nothing much else you can do.

A quarter wave monopole transforms the impedance of free space (roughly 377 ohms) down to 37 ohms when used at the "right" frequency. Use it away from that frequency and the reactance can go positive (inductive) or negative (capacitive).
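As a very rough illustration of that sign change, here is a sketch using the simplest idealization of a thin monopole as an open-circuited transmission line. The 350 ohm "average characteristic impedance" is an assumed value, the lossless model ignores radiation resistance, and it puts resonance exactly at a quarter wave, whereas the plot above shows a real monopole is still slightly inductive there:

```python
import math

# Idealized sketch: treat the thin monopole as an open-circuited
# transmission line with an assumed average characteristic impedance.
Za = 350.0  # ohms (assumed; real value depends on conductor thickness)

def reactance(length_in_wavelengths: float) -> float:
    """Approximate feed-point reactance, ohms: X = -Za * cot(beta*h)."""
    beta_h = 2 * math.pi * length_in_wavelengths
    return -Za / math.tan(beta_h)

for l in (0.15, 0.20, 0.24, 0.26, 0.30):
    x = reactance(l)
    kind = "inductive" if x > 0 else "capacitive"
    print(f"length = {l:.2f} wavelengths -> X = {x:+7.1f} ohms ({kind})")
# Shorter than a quarter wave -> capacitive; longer -> inductive.
```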

Therefore, following this logic, you'd ideally want your receiving antenna to have a very low impedance (as close to 0Ω as possible).

Well, you can do this: make the antenna intentionally "short" and it presents quite a bit of capacitive reactance, while the radiation resistance drops to a fraction of 37 ohms (maybe 5 ohms). However, your received signal is also much smaller because the transformation ratio has dropped even further. But you can tune out the capacitance with series inductance and get a very sharply tuned electrically short antenna. Nice in some circumstances but bad in others.
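As a sketch of what that series-inductance tuning might look like (the 1 MHz frequency, 5 ohm radiation resistance and -j500 ohm reactance are all assumed, illustrative figures, not values from the answer above):

```python
import math

f = 1e6            # operating frequency, Hz (assumed)
R_rad = 5.0        # radiation resistance, ohms (assumed)
X_C = 500.0        # magnitude of capacitive reactance, ohms (assumed)

# A series inductor with X_L = +j500 ohms cancels the -j500 ohms:
# X_L = 2*pi*f*L  =>  L = X_C / (2*pi*f)
L = X_C / (2 * math.pi * f)
print(f"Loading inductance: {L * 1e6:.1f} uH")     # ~79.6 uH

# The tuned antenna looks like ~5 ohms resistive, but the large
# reactance-to-resistance ratio means the tuning is very sharp:
Q = X_C / R_rad
print(f"Approximate Q of the tuned antenna: {Q:.0f}")  # ~100 -> narrow bandwidth
```

The high Q is the "nice in some circumstances but bad in others" part: it rejects off-frequency signals well, but the usable bandwidth becomes very narrow and the tuning drifts easily.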

Old crystal radios never had anything like a quarter wave monopole because, for the good old BBC long wave service at circa 200 kHz, it would have needed to be nearly the length of 4 football fields. But the wire was still fairly long, and it's the length of the antenna that gives you the signal intensity (up to a certain point). Electrically, it was still regarded as a "short" antenna.
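A quick sanity check of the "4 football fields" figure:

```python
# Quarter wavelength at the BBC long wave carrier (circa 200 kHz).
c = 3e8            # speed of light, m/s
f = 200e3          # carrier frequency, Hz
quarter_wave = c / (4 * f)
print(f"Quarter wavelength at 200 kHz: {quarter_wave:.0f} m")  # 375 m
# A football pitch is roughly 100 m long, so ~375 m is close to
# four pitches end to end - far too long for a domestic antenna.
```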

If you are using an antenna as a transmitter, then you'd want to make the antenna (load) impedance equal to the impedance of the source for maximum power transfer into the antenna - so you match your antenna to 50 Ω.

You've got this a bit backwards - in a lot of set-ups, the antenna is some distance from the transceiver and the two are connected via coaxial cable. Coax has a characteristic impedance and, when the load matches that precise impedance, signals are not reflected from the load. This is quite important. So an antenna is usually matched to the coax but, because standard antennas (monopoles and dipoles) don't look like 50 Ω, they are matched via a balun or resistor network at the antenna (load) end. This can mean that the transmitter end needn't be matched to anything like 50 Ω and can drive its full power into the load.
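To illustrate why matching at the load (antenna) end matters, here is a small sketch of the standard reflection-coefficient calculation for a 50 Ω cable; the 73 Ω resonant-dipole value is a textbook figure used purely as an example:

```python
# Fraction of power reflected from the load end of a coax depends only
# on the mismatch between the load and the cable's characteristic impedance.
def reflection(Z_load: complex, Z0: float = 50.0):
    gamma = (Z_load - Z0) / (Z_load + Z0)
    return abs(gamma), abs(gamma) ** 2   # |Gamma|, fraction of power reflected

for name, zl in [("matched 50 ohm load", 50 + 0j),
                 ("resonant dipole (~73 ohm)", 73 + 0j),
                 ("short antenna (5 - j500 ohm)", 5 - 500j)]:
    g, p = reflection(zl)
    print(f"{name:30s} |Gamma| = {g:.2f}, power reflected = {100*p:.0f}%")
```

The matched load reflects nothing, the bare resonant dipole reflects only a few percent, and the untuned electrically short antenna reflects nearly everything, which is why the matching components belong at the antenna end of the cable.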