Is there a loss of entropy by hashing an N-bit random key to produce an N-bit hash?

Hashing is a deterministic process, which means it can never increase randomness. But of course it can decrease it: if you hash a 200-bit random value with a hash algorithm that outputs only 160 bits (like SHA-1), the resulting value can never carry 200 bits of randomness.

But as long as the number of input bits is significantly lower than the output size of the hash, the randomness is not reduced, provided a cryptographic hash is used. If the input size is exactly the same as the output size, as in your example, the resulting randomness is likely not significantly decreased when using a cryptographic hash. And you are right that collision resistance does not matter for this.
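To illustrate the first case (input entropy well below the output size), here is a small sketch: hashing every one of the 2**16 distinct 16-bit inputs with SHA-256 produces 2**16 distinct digests, so none of the input entropy is lost. (SHA-256 is used here only as a convenient cryptographic hash; the question's example used MD5.)

```python
# Hash all 2**16 distinct 16-bit inputs with SHA-256 and check that
# every digest is distinct: with the input space this far below the
# 256-bit output space, the hash behaves injectively in practice.
import hashlib

digests = {hashlib.sha256(i.to_bytes(2, "big")).digest() for i in range(2**16)}
print(len(digests))  # 65536 -> no collisions, all input entropy preserved
```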


dd if=/dev/urandom bs=16 count=1 2>/dev/null | md5sum

This is guaranteed to lose some entropy, but not much. The difficulty in attacking MD5 doesn't directly suggest the amount of loss, but merely tells us it is not zero.

If I fall back on the naïve random-function model, the entropy loss can be computed from the expected image fraction of 1 − 1/e ≈ 63.21% (the probability that any given output value is actually reached), which in turn implies that less than one bit of entropy has been lost. I forget how much it is said you actually need, but I know 122 bits of true entropy is plenty.
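The random-function model can be checked with a small simulation, under the assumption that a good hash truncated to n bits behaves like a random function on an n-bit space. Using SHA-256 truncated to 16 bits as a stand-in, the fraction of output values reached lands near 1 − 1/e ≈ 0.6321, and the measured Shannon entropy loss stays under one bit:

```python
# Model an n-bit-to-n-bit hash as a random function (SHA-256 truncated
# to 16 bits as a stand-in) and measure the output entropy over the
# whole 2**16 input space.
import hashlib
import math
from collections import Counter

n_bits = 16
N = 2 ** n_bits

# How often each truncated digest occurs across all inputs.
counts = Counter(hashlib.sha256(x.to_bytes(2, "big")).digest()[:2]
                 for x in range(N))

fraction = len(counts) / N                                   # ~ 1 - 1/e
entropy = -sum((c / N) * math.log2(c / N) for c in counts.values())
loss = n_bits - entropy                                      # < 1 bit

print(f"image fraction: {fraction:.4f}, entropy loss: {loss:.3f} bits")
```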

On the other hand, what are you trying to accomplish here? dd if=/dev/urandom bs=16 count=1 is plenty by itself. There is no point in trying to use MD5 to defend Keccak-f against weakness.

If you really feel the need to do something here, generate a 128-bit application secret key at install time directly from /dev/random (not urandom), and XOR it into the feed from /dev/urandom. But I don't see a single reason to lift a finger to do this.
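A minimal sketch of that XOR construction, assuming a placeholder `SECRET` standing in for a key generated once at install time from /dev/random: if either the stored secret or the fresh urandom bytes are good, the XOR of the two is good.

```python
# Hypothetical sketch of the construction above: XOR a fixed 128-bit
# install-time secret into fresh /dev/urandom output. SECRET is a
# placeholder; in practice it would be read from a file written once
# at install time from /dev/random.
import os

SECRET = bytes.fromhex("000102030405060708090a0b0c0d0e0f")  # placeholder only

def mixed_key() -> bytes:
    fresh = os.urandom(16)  # 128 fresh bits from the kernel CSPRNG
    return bytes(a ^ b for a, b in zip(SECRET, fresh))

print(len(mixed_key()))  # 16 bytes = 128 bits
```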