How efficient is a desktop computer?

Assume a typical computer with a CPU clock of ~1 GHz. It can then generate an output byte sequence at ~$10^9$ byte/s, which corresponds to an entropy rate of ~$10^{-13}$ J/(K·s) in terms of von Neumann entropy. Meanwhile, the power consumption of a typical CPU is ~100 W, which at room temperature produces thermodynamic entropy at a rate of ~0.3 J/(K·s).

So the ratio (minimum ΔS) / (actual ΔS) is ~$10^{-13}$.
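A quick back-of-envelope check of these numbers (a sketch of my own, assuming 8 bits per byte, $k_B \ln 2$ of entropy per bit, 100 W, and $T = 300$ K):

```python
# Back-of-envelope check of the entropy rates above (all inputs are rough assumptions).
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K

# Information output: ~1e9 bytes/s = 8e9 bits/s (assumed)
bit_rate = 8e9
info_entropy_rate = bit_rate * k_B * math.log(2)   # J/(K*s)

# Thermodynamic entropy dumped by a ~100 W CPU at room temperature
power = 100.0               # W (assumed)
T = 300.0                   # K
thermal_entropy_rate = power / T                   # J/(K*s)

print(f"information entropy rate: {info_entropy_rate:.1e} J/(K*s)")        # ~8e-14
print(f"thermal entropy rate:     {thermal_entropy_rate:.1e} J/(K*s)")     # ~0.3
print(f"(minimum dS)/(actual dS): {info_entropy_rate / thermal_entropy_rate:.0e}")  # ~2e-13
```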

This calculation is not quite right, because it is hard to determine what the actual output of a computer is. In most cases, previous output is used again later as input. The calculation above also assumes that all output is continuously written to some external device.

A better point of view is that each gate with two inputs and one output, such as AND, OR, or NAND, must dump one bit of information to the surroundings as heat. This erasure costs at least $k_B T \ln 2$ of energy per bit (Landauer's principle), which sets the minimum power $W$ required to process information in a classical computer. In this sense, we may define the efficiency as $e = W/Q$, where $Q$ is the actual heat generated per second.

The efficiency depends on how many such logic gates are used per clock cycle; I would guess fewer than a thousand, which at a ~1 GHz clock rate gives $e \approx 10^{-11}$.
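A minimal numerical sketch of this estimate, assuming ~1000 erased bits per cycle at a 1 GHz clock against 100 W of actual dissipation:

```python
# Rough estimate of e = W/Q under the gate-erasure picture above.
import math

k_B = 1.380649e-23                      # J/K
T = 300.0                               # K
E_landauer = k_B * T * math.log(2)      # minimum energy per erased bit, ~2.9e-21 J

gates_per_cycle = 1000                  # assumed upper bound from the text
clock = 1e9                             # Hz (assumed ~1 GHz)
W = gates_per_cycle * clock * E_landauer   # minimum power, ~2.9e-9 W
Q = 100.0                               # actual dissipation, W (assumed)

print(f"Landauer limit per bit: {E_landauer:.1e} J")
print(f"minimum power W:        {W:.1e} W")
print(f"efficiency e = W/Q:     {W / Q:.0e}")   # ~3e-11
```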

This means our computer is very inefficient at information processing, but probably quite good as a heater. The theoretical minimum energy requirement is also hard to verify experimentally because of the extremely high accuracy required.


Some info from the ASIC world: your processor may have, say, 300 million transistors, and most of them do some work. But to perform, for example, a pure 32-bit add operation you need only about 1000 of them. The rest are for caching and for passing data back and forth, support functions whose cost is impossible to estimate. So estimates from the math side are very hard to make.

Modern process design already targets energy per switch, and it is being optimized. Unfortunately, the slower the speed you need, the more efficiently the processor runs: for example, to get 50% of the speed you may spend only about 10% of the power.
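To see where numbers like that can come from, here is a toy model (an assumption of my own, not from the quoted text) using the standard dynamic-power scaling $P \propto C V^2 f$, with the supply voltage lowered roughly in proportion to frequency:

```python
# Toy model of the speed/power tradeoff: dynamic CMOS power scales as
# P ~ C * V^2 * f, and V can often be lowered roughly in proportion to f.
def relative_power(freq_fraction: float) -> float:
    """Relative dynamic power if both f and V scale by freq_fraction."""
    voltage_fraction = freq_fraction          # crude linear V-f scaling assumption
    return voltage_fraction**2 * freq_fraction   # P/P0 = (V/V0)^2 * (f/f0)

print(relative_power(0.5))   # 0.125 -> ~12.5% power at 50% speed,
                             # in the ballpark of the "~10% power" figure above
```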

So processors are extremely inefficient (and there is still room for a factor of 100 to 10000 improvement), but estimating the CPU as a whole is the wrong approach. You should take into account only the minimal computing unit used, such as an adder; you cannot predict how many switches end up in the support logic, which takes up 98% of the chip area.


For all practical purposes today, the answers above are very informative.

However, as Marek has pointed out above, the fundamental theoretical model of the thermodynamics of computation on which you are basing the question is, surprisingly, wrong, as we first began to discover about 50 years ago (see references to Landauer, Charles Bennett, Fredkin, and others). In principle, all computation is dissipation-free, except for the dissipation required to overwrite or forget previously stored bits.

The classic example is this. Suppose you want to compute the next-to-last binary digit of the zillionth prime, or some such. You do so slowly and reversibly, carefully not overwriting any of the intermediate bits you generate, which requires a lot of space. Perhaps you even make use of quantum entanglement in the computer. Then you record the answer by overwriting (irreversibly forgetting) a single bit in some, say external, register. Finally, you reverse the original computation, again without any dissipation at all. You are left having to dissipate only the entropy needed to overwrite the one bit of the recorded answer, because that is the only information you were forced to forget.
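A minimal classical illustration of this compute/copy/uncompute pattern (Bennett's trick), using only reversible Toffoli and CNOT gates; the code and names here are my own illustrative sketch, not taken from the answer:

```python
# Compute f(x, y) = x AND y reversibly, copy the answer out, then uncompute.

def toffoli(state, a, b, c):
    """Reversible AND: flip bit c iff bits a and b are both 1."""
    if state[a] and state[b]:
        state[c] ^= 1

def cnot(state, a, b):
    """Reversible copy/XOR: flip bit b iff bit a is 1."""
    if state[a]:
        state[b] ^= 1

state = {"x": 1, "y": 1, "scratch": 0, "answer": 0}

toffoli(state, "x", "y", "scratch")   # compute: scratch <- x AND y
cnot(state, "scratch", "answer")      # copy the one-bit answer to a register
toffoli(state, "x", "y", "scratch")   # uncompute: scratch returns to 0

print(state)  # {'x': 1, 'y': 1, 'scratch': 0, 'answer': 1}
# Only the overwrite of "answer" is logically irreversible; every
# intermediate bit has been returned to its initial value.
```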

Since the required dissipation (the denominator) approaches zero, the theoretical answer to your question is, in principle, infinity; the tradeoff is the space needed to hold all the intermediate results. This is a surprise, a shock really, but it shows the power of clear thought. It is intimately connected with quantum computing, but it also has entirely classical models.

So the right theoretical way to pose your question would be more like: for a particular computation, to be completed in a time $t$, operating with a limited memory of $x$ bits, what is the necessary dissipation? I'm no expert, but I will try to find more references. P.S. The resting brain probably uses about 20 watts.