Why use DDR instead of increasing clock speed?

With SDR, there are two clock edges per bit period, but at most one edge on the data line.

With high frequency communication, the analog bandwidth limits how close you can put edges together on any given wire. If your clock signal hits that limit, you're wasting half of the bandwidth of the data wires.

Therefore, DDR was invented so that all of the wires hit their bandwidth limit at the same bit rate.

The real problem is bandwidth. The highest frequency that a data line can generate (well, not counting slew rate) is when it's sending a 101010 data pattern, whose fundamental is at half the data rate. With single data rate (SDR) transmission, the clock produces one complete cycle for each data bit, so it runs at double the worst-case frequency seen on a data line. Double data rate runs the clock at half the data rate, with one clock edge per data bit, so the worst-case data pattern produces the same frequency as the clock.
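The arithmetic above can be sketched in a few lines. This is just a worked numerical example (the 400 Mbit/s figure and the function names are illustrative, not from any specific standard):

```python
def worst_case_data_freq_hz(bit_rate_hz: float) -> float:
    # A 1010... pattern toggles once per bit, so one full cycle spans
    # two bit periods: fundamental frequency = bit_rate / 2.
    return bit_rate_hz / 2

def clock_freq_hz(bit_rate_hz: float, ddr: bool) -> float:
    # SDR: one full clock cycle per bit  -> clock frequency = bit_rate.
    # DDR: one clock *edge* per bit      -> clock frequency = bit_rate / 2.
    return bit_rate_hz / 2 if ddr else bit_rate_hz

bit_rate = 400e6  # 400 Mbit/s per data line (illustrative)

for ddr in (False, True):
    name = "DDR" if ddr else "SDR"
    print(f"{name}: clock = {clock_freq_hz(bit_rate, ddr) / 1e6:.0f} MHz, "
          f"worst-case data = {worst_case_data_freq_hz(bit_rate) / 1e6:.0f} MHz")
```

At 400 Mbit/s, the SDR clock runs at 400 MHz while the data lines never exceed 200 MHz; with DDR, both sit at 200 MHz, which is the whole point.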

Generally, the speed of an interface is limited by the available bandwidth through the chip packages, pins, board, connectors, etc. If the clock requires twice the bandwidth of the data, then the high frequency of the clock signal limits the overall bandwidth of the link. With DDR, the required bandwidth is the same for the clock and the data, letting the link use the available bandwidth more efficiently.

The downside of using DDR is that it's more difficult to design. Flip-flops used to capture the data bits on the receive side operate on one clock edge, either the rising edge or the falling edge. The data has to be stable at the input for a setup time before the edge and a hold time after the edge in order to be reliably latched in. With SDR, the clock can simply be inverted somewhere to meet the timing requirements. However, with DDR, a 90 degree phase shift is required, which is more difficult to generate, requiring PLLs or delay lines.
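To make the capture scheme concrete, here is a purely behavioral sketch (not RTL, and the function names are invented for illustration): one register samples the data line on each rising edge, a second samples on each falling edge, and the receiver interleaves the two half-rate streams back into the original bit sequence:

```python
def ddr_capture(bits):
    """Model the two receive flip-flops: even-indexed bits are valid at
    rising edges, odd-indexed bits at falling edges."""
    rising = bits[0::2]   # samples taken by the rising-edge flop
    falling = bits[1::2]  # samples taken by the falling-edge flop
    return rising, falling

def ddr_merge(rising, falling):
    """Interleave the two half-rate streams back into the full-rate stream."""
    out = []
    for r, f in zip(rising, falling):
        out.extend([r, f])
    return out

tx = [1, 0, 1, 1, 0, 0, 1, 0]       # transmitted bit stream
rising, falling = ddr_capture(tx)   # each flop sees half the data rate
assert ddr_merge(rising, falling) == tx
```

Note that each flop only runs at half the data rate; the hard part in real hardware is generating the 90-degree-shifted sampling clock so that both edges land in the middle of their data eyes, which this model glosses over.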

So, to summarize:

SDR:

  • Pro: Simple to implement
  • Con: Inefficient bandwidth utilization, as the clock signal requires twice as much bandwidth as the data signals

DDR:

  • Pro: Efficient bandwidth utilization, as all signals require the same bandwidth
  • Con: Complex to implement