Do you need more than 128-bit entropy?

128 bits of entropy are enough. The whole and only point of considering entropy is to make sure that the system can resist brute force attacks: the space of possible values must be so large that any attacker could only try a negligible proportion of the values in non-ludicrous time. There are strong reasons why 128 bits are highly sufficient for that. Realistically, a very powerful organization with lots of cash to spare (including behemoths like Apple or Google, who can muster much more resources than usual scarecrows like the NSA) could hope to perform, say, at most 2^85 elementary computations (like encrypting an AES block) within a year -- it won't be discreet, and that's still less than the millionth part of the millionth part of a 2^128 space.
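
If you want to check the arithmetic, here is a quick sketch (in Python, reusing the purely hypothetical attacker budget from above):

```python
# Back-of-the-envelope check of the brute-force argument above,
# using the hypothetical figure of 2^85 operations per year.
ops_per_year = 2 ** 85   # generous upper bound for a very rich attacker
key_space = 2 ** 128     # number of possible 128-bit keys

fraction_per_year = ops_per_year / key_space
print(f"fraction of the key space searched in a year: {fraction_per_year:.2e}")
# prints about 1.14e-13, i.e. less than a millionth of a millionth (1e-12)
```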

In your list of algorithms, if you generate a 256-bit AES key from a 128-bit seed and a strong PRNG, then you do not really have a 256-bit key. From the attacker's point of view, this is a 128-bit key. Mind you, that's still more than enough to deter the attacker. But claiming "256 bits" here verges on the fraudulent (and, in the case of AES, implies a +40% runtime cost for no actual gain). The same can be said about HMAC and, actually, all algorithms that are part of "symmetric cryptography".
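
To make the point concrete, here is a minimal sketch (my own illustration, not something from your list) of deriving a 256-bit AES key from a 128-bit seed, with SHA-256 standing in for the "strong PRNG":

```python
import os
import hashlib

seed = os.urandom(16)                        # 128 bits of actual entropy
aes_256_key = hashlib.sha256(seed).digest()  # 256 bits of key material

# The key is 256 bits long, but an attacker only has to guess the 16-byte
# seed, so the effective security is still 128 bits.
print(len(aes_256_key) * 8, "key bits from", len(seed) * 8, "bits of entropy")
```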

For asymmetric algorithms, one can try to make estimates of strength and to seek equivalences with symmetric keys. This is very hard to do, because attacks on (say) RSA and AES do not use the same kind of technology at all (to make things simple: for RSA you need a lot of very, very fast RAM; for AES you need no RAM at all, and should use relatively slow circuits to make them really cheap to build and operate). See this site for what several groups of smart people have come up with. The general consensus is that a 256-bit elliptic curve (at least, a "normal" curve) ought to be somewhat equivalent to a 128-bit symmetric key in strength; a 4096-bit RSA key might be a bit beyond, assuming that this assertion makes sense, which it does not (there is nothing stronger than "cannot break it", so comparing strengths beyond 100 bits or so is kinda meaningless anyway).
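
For the curious, one common way to get a ballpark for RSA is the heuristic complexity of the General Number Field Sieve; the little sketch below (my own, with constants and real-world optimizations deliberately ignored) shows why 4096-bit RSA lands "a bit beyond" the 128-bit mark:

```python
import math

def gnfs_strength_bits(modulus_bits):
    """Very rough 'symmetric-equivalent' strength of an RSA modulus, from the
    heuristic GNFS cost exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3)).
    Constants and the o(1) term are dropped, so this is only a ballpark."""
    ln_n = modulus_bits * math.log(2)
    work_nats = (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return work_nats / math.log(2)  # convert from nats to bits

for bits in (2048, 3072, 4096):
    print(f"{bits}-bit RSA ~ {gnfs_strength_bits(bits):.1f} bits of symmetric strength")
# prints roughly 117, 139 and 156 bits; the bare formula overshoots the
# published equivalences a little, but the shape is right: 4096-bit RSA
# sits somewhat beyond the 128-bit mark.
```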

There is a relatively recent idea floating around that quantum computers would somehow require doubling the number of bits. The theoretical reason is that Grover's search algorithm can explore a space of 2^n inputs to a function (implemented on a quantum computer) by invoking the function only 2^(n/2) times. Under that theory, you would need a 256-bit key to achieve the magical "128-bit security" (a back-of-the-envelope sketch of what 2^64 quantum invocations would still cost follows the list below). However, there are two principal flaws in that theory:

  • The flaw that everybody points out is that quantum computers do not exist yet (well, almost everybody; some people purport to have quantum computers on sale). Decoherence is apparently very hard to surmount and we do not really know how to do it except by making the whole machine very very cold, and then praying that the Universe won't catch up on what we are trying to do for a few more milliseconds.

  • The flaw that nobody points out, but which is at least as important as the other one, is that there is no indication that an elementary computation on a quantum computer would consume an amount of energy similar to that of an elementary computation on a classical computer. This is, however, a crucial point. Since the invention of the transistor, computers have steadily become more efficient; that trend was captured by Moore's law. There are also increasingly clear indications that Moore's law is about to stop (Gordon Moore himself said so), mostly because we are close to the minimal size for electronic gates (if we make them any smaller, they leak madly through quantum tunnelling) and we are out of fresh ideas: the increase in power since the 1970s fed on a pool of ideas that had all been conceived and enunciated in the 1970s, and now we are at our wits' end.

    So we will soon reach the limits of classical computing and will thus actually know, with some appreciable reliability, just how much of a key space can be explored with classical computers. However, it took about 70 years to reach that point.

    For quantum computers, since they don't really exist yet, we are at day one. It takes an awful lot of wishful thinking to assert that, 70 years from now, we will have reached the maximal optimization of quantum computers and that this optimization will bring the cost of elementary quantum operations within the same order of magnitude as the cost of elementary classical operations. In fact, nobody knows anything about that yet.
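
To put Grover's square-root speed-up into perspective, here is the back-of-the-envelope sketch promised above, with a deliberately optimistic, entirely made-up cost of one quantum evaluation per nanosecond and no parallelism:

```python
# Hypothetical cost of a Grover search on a 128-bit key, assuming (very
# generously) one quantum evaluation of the cipher per nanosecond.
grover_evaluations = 2 ** (128 // 2)   # 2^64 evaluations for a 128-bit key
seconds = grover_evaluations * 1e-9    # at 1 ns per evaluation
years = seconds / (365.25 * 24 * 3600)
print(f"about {years:.0f} years for a single sequential search")
# prints about 585 years -- and that is under a made-up, optimistic cost model
```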

As an illustration of the difficulty of forming a clear notion of what technology will be able to achieve in the future, consider the following anecdote (that I recently read in that book): back around 1810, more than 25 years after the historic first free flight of Pilâtre de Rozier in an untethered balloon, the greatest thinkers of the time pondered what the new invention could bring to mankind. In their view, the summit of a balloon's usefulness was to tether it to your carriage, thereby making it lighter and allowing it to be drawn about with fewer horses. Think about that whenever you feel visionary.

Quantum computer efficiency being doubly speculative at this point, it is not really rational to include it in any thinking about key length. A much more salient point is that if and when quantum computers work, even if they do so abysmally slowly, then RSA, Diffie-Hellman and elliptic curves are toast. That is worth investigating, and this is what studies on post-quantum cryptography concentrate on.