Why are DIMMs not equipped with a heat sink like a CPU?

You're assuming that power dissipation is directly related to the clock rate. That's true, but there's more to it.

Suppose I have a chip A where only 10% of the chip area (die size) runs at the highest clock rate. Compared to a chip B of equal size where 100% of the circuits run at that high clock rate, chip A would dissipate only about 1/10th of the power that chip B dissipates.
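
As a rough sanity check (every number below is invented purely for illustration), the standard CMOS dynamic-power relation P ≈ α·C·V²·f makes the 10:1 ratio fall straight out of the activity factor α:

```python
# Back-of-the-envelope sketch of the activity-factor argument above.
# C, V and f are made-up illustrative values, not real chip data.

C = 1e-9   # total switched capacitance of the die, farads (assumed)
V = 1.0    # supply voltage, volts (assumed)
f = 3e9    # clock rate, hertz (assumed)

def dynamic_power(activity):
    """Dynamic power when 'activity' is the fraction of the die toggling."""
    return activity * C * V**2 * f

p_a = dynamic_power(0.10)  # chip A: only 10% of the circuits switching
p_b = dynamic_power(1.00)  # chip B: the whole die switching

print(p_a / p_b)  # -> 0.1, i.e. chip A dissipates ~1/10th of chip B
```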

My point: not only does the clock rate matter, but also how much of the chip is actually running at that clock rate.

For a DRAM chip (PC DIMMs use DRAM), most of the area on the chip is DRAM cells (obviously), and these run at a significantly lower speed than the external clock rate. The DRAM controller accesses the chips in parallel and in sequence, so this lower internal speed is largely compensated for by parallelism.
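
To make the parallelism point concrete, here is a minimal sketch with assumed DDR3-style figures (a slow internal array clock and an 8-bit prefetch per data pin; the exact numbers vary by generation):

```python
# How a slow DRAM core can still feed a fast external interface.
# The figures are typical DDR3-style values, assumed for illustration.

core_clock_mhz = 200   # internal DRAM array clock (assumed)
prefetch_bits = 8      # bits fetched per data pin per core cycle (DDR3-style)

io_rate_mt_s = core_clock_mhz * prefetch_bits
print(io_rate_mt_s)    # -> 1600 MT/s: the I/O runs 8x faster than the core
```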

On a CPU, a much larger part of the circuitry actually runs at the maximum clock rate (depending on how busy the CPU is, of course), so it is bound to dissipate a lot more power than a DRAM chip, where only a small part of the chip is running very fast.


DIMMs don't dissipate the same power a CPU does, so they don't need the same cooling. In addition, the power the memory and control chips do dissipate is much more spread out physically.

Power dissipation may be roughly proportional to clock rate, but the proportionality constant is quite different between a CPU and a memory chip. The CPU has many more transistors and gates switching at each clock transition than the memory does.

Remember that for CMOS, in the regime where current is roughly proportional to clock speed, the dominant current is the one charging and discharging all the little parasitic capacitors on the outputs of every gate. If fewer gates change state, there is less current, and hence less dissipation at the same clock rate.
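
A quick numeric sketch of that point, with assumed per-gate values (a parasitic node capacitance of about 1 fF and round numbers for voltage and clock):

```python
# "Fewer gates switching -> less current" in numbers.
# Each toggling gate output dissipates roughly C * V^2 per cycle.
# All constants below are assumptions chosen for illustration.

c_node = 1e-15   # parasitic capacitance per gate output, ~1 fF (assumed)
v = 1.0          # supply voltage, volts (assumed)
f = 2e9          # clock rate, hertz (assumed)

def power(gates_toggling):
    """Average dynamic power if this many gate outputs toggle per cycle."""
    return gates_toggling * c_node * v**2 * f

print(power(100e6))  # CPU-like count of switching gates  -> 200 W scale
print(power(2e3))    # DRAM-like count of switching lines -> milliwatts
```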


You need a heatsink if your component produces more heat than it can dissipate through its own package. Heat is electrical energy converted into a temperature rise of some mass.

Now, in a modern CPU, what consumes electrical energy is mainly the process of switching a transistor. Every single transistor switch costs energy, and the faster that switching has to happen, the more energy each switch costs.

Now, on every clock cycle, your CPU does a lot of complicated things like multiplying numbers, calculating addresses, and speculating what the next operation might compute before it actually happens. Those operations lead to a lot of transistors switching at once.

A DRAM chip (like the ones on your DIMMs) is different in that there are no complex operations to do. It's just memory, which means it basically has to switch about (word length) × (memory address bits) lines: so, really, fewer than 2000 transistors for a single chip (there's a bit of address and command decode overhead, but that's very "cute" compared to the complexity of a CPU). Sure, the things these transistors switch need more energy (because that means charging and discharging relatively large capacitors, whose charge is the actual bit), but it really is very few transistors.
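
For what it's worth, here's the arithmetic behind that "fewer than 2000" figure, with assumed but plausible widths:

```python
# Rough count of simultaneously switching lines in a DRAM chip.
# Both widths are assumptions picked to match the answer's estimate.

word_length_bits = 64   # width of a memory word (assumed)
address_bits = 30       # enough to address a few gibibits (assumed)

print(word_length_bits * address_bits)  # -> 1920, under 2000 as claimed
```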

Then, DRAM also needs to be refreshed periodically, but that happens only every few milliseconds, i.e. only once every couple of million memory clock cycles, and hence doesn't contribute much to overall power consumption.
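
And the order-of-magnitude check on that, taking the "every few milliseconds" figure at face value together with an assumed 1 GHz memory clock:

```python
# Refresh happens rarely compared to the memory clock.
# Interval and clock rate are assumptions for illustration.

refresh_interval_s = 2e-3   # refresh period per cell, seconds (assumed)
mem_clock_hz = 1e9          # memory clock, hertz (assumed)

print(refresh_interval_s * mem_clock_hz)  # -> 2e6: a couple million cycles
```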