Why is power reflected when there is an impedance mismatch?

"Reflected power" is a concept that is most appropriate when the length of the cable between "source" and "receiver" is of the same magnitude as the wavelength of the signal being transmitted. Empirically, it is argued, that if the cable length is (say) smaller than one-tenth of the signal wavelength (including harmonics) then "reflections" in most solutions can be ignored.

So that's the first point - calculating reflections for a signal of (say) 1 MHz (300 metre wavelength) over a mismatched cable less than 30 metres long is probably pointless in practical terms, even with a grossly mismatched load impedance.
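
To put a number on that rule of thumb (using the free-space wavelength; a real cable's velocity factor makes it somewhat shorter): -

$$\lambda = \dfrac{c}{f} = \dfrac{3\times 10^8 \text{ m/s}}{1 \text{ MHz}} = 300 \text{ m}\hspace{1cm}\Rightarrow\hspace{1cm}\dfrac{\lambda}{10} = 30 \text{ m}$$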

But the 1 MHz signal could be a square wave, and you might want to preserve the shape of that square wave. This creates a new problem - a square wave comprises an infinite number of diminishing odd harmonics and, on the face of it, any length of cable that is badly terminated is going to produce reflections. However, because engineers are usually practical people, we say "never mind those harmonics above (say) 5 times 1 MHz".
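
For reference, the harmonic content of a square wave of amplitude \$A\$ is the usual Fourier series of diminishing odd harmonics: -

$$v(t) = \dfrac{4A}{\pi}\left(\sin(2\pi f t) + \dfrac{\sin(6\pi f t)}{3} + \dfrac{\sin(10\pi f t)}{5} + \dfrac{\sin(14\pi f t)}{7} + ...\right)$$

so the 5th harmonic is already down to one-fifth of the fundamental's amplitude.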

We say this because, even if the 7th harmonic and above get messed around, we know that the basic shape (and detectability) of the 1 MHz square wave is still good.

All of this means you have to consider the spectral content of the signal and then limit the analysis to a reasonable bandwidth.

OK, that's all the preamble and disclaimers. So, you have a transmission line (coax, twisted pair, waveguide or even a vacuum) and you want to know why power can be reflected. Consider this: -

You have a battery and a switch feeding a lamp down 50 metres of cable. When the switch is closed, a current starts to flow down the cable - that current is determined not by the load but by the characteristic impedance of the cable. Let's say that impedance is 50 ohms and the battery is 10 V. This means the power travelling down the cable is 10²/50 watts (power = V²/R) and the current is 200 mA.
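
Putting numbers on that initial wavefront (the lamp hasn't been "seen" yet, so only \$Z_0\$ matters): -

$$I = \dfrac{V}{Z_0} = \dfrac{10\text{ V}}{50\ \Omega} = 200\text{ mA}\hspace{1cm}P = \dfrac{V^2}{Z_0} = \dfrac{10^2}{50} = 2\text{ W}$$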

This power gets to the far end of the cable and it expects to see a load that is the right impedance, i.e. 50 ohms, but let's say the impedance of the lamp is 100 ohms. If we analysed things a little naively, the voltage that reaches the lamp (ignoring cable losses) might be considered to be 10 V. This means that the lamp would draw a current of 100 mA. But there is 200 mA of current reaching the lamp (100 mA too much), so what happens to the rest of the power? What has happened to Ohm's law?

If you did the math, you'd arrive at this power equation: -

$$\dfrac{V_F^2}{Z_0} - \dfrac{V_R^2}{Z_0} = \dfrac{V_L^2}{Z_L}$$

Where \$V_F\$ is the forward (load-bound) voltage, \$V_R\$ is the voltage reflected from the load (now source-bound) and \$V_L\$ is the resulting load voltage. The magnitude of \$V_R\$ is the difference between \$V_F\$ and \$V_L\$ (and only its square appears in the power equation), hence: -

$$\dfrac{V_F^2}{Z_0} - \dfrac{(V_F-V_L)^2}{Z_0} = \dfrac{V_L^2}{Z_L}$$

And, if you worked this through, you'd find that: \$\hspace{1cm}V_L = V_F\cdot\dfrac{2\cdot Z_L}{Z_0+Z_L}\$
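
(For anyone who wants the algebra: expand the bracket, cancel \$V_F^2\$, divide through by \$V_L\$ and rearrange): -

$$\dfrac{2V_FV_L - V_L^2}{Z_0} = \dfrac{V_L^2}{Z_L}\hspace{0.5cm}\Rightarrow\hspace{0.5cm}\dfrac{2V_F - V_L}{Z_0} = \dfrac{V_L}{Z_L}\hspace{0.5cm}\Rightarrow\hspace{0.5cm}V_L = V_F\cdot\dfrac{2\cdot Z_L}{Z_0+Z_L}$$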

So, you could then make a less naive calculation for the initial load voltage. Using the numbers above, you'd find that \$V_L\$ is 13.333 volts when the wavefront first arrives, so the load current is initially 133.33 mA. That means 66.666 mA is reflected, along with 3.333 volts (the difference voltage). The reflected \$V_R\$ and \$I_R\$ are in a ratio of (exactly) 50 ohms hence, they are perfectly suited to travelling back along the 50 ohm cable to the source. Ohm's law is rescued.
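
(For anyone who prefers the textbook route, the same numbers drop out of the standard reflection coefficient, although the argument above deliberately avoids it): -

$$\Gamma = \dfrac{Z_L - Z_0}{Z_L + Z_0} = \dfrac{100-50}{100+50} = \dfrac{1}{3}\hspace{1cm}V_R = \Gamma\cdot V_F = 3.333\text{ V}\hspace{1cm}V_L = (1+\Gamma)\cdot V_F = 13.333\text{ V}$$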

The excess power (needed to prevent a violation of Ohm's law) is reflected back up the cable to the battery. With zero losses in the battery, switch and cable, the excess sloshes back and forth, but each round trip delivers a chunk of it into the load (and real cable losses diminish it further), so it dies away over a time scale of a few microseconds. Eventually, everything settles down to the standard DC scenario of 100 mA flowing from the 10 V source (1 watt) into a lamp that has a resistance of 100 ohms.
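
If you want to watch the settling happen, here's a minimal bounce-diagram sketch in Python. It assumes an ideal (zero-impedance) battery and a lossless 50 ohm cable - both simplifications of the scenario above - and simply tallies the wave each time it hits the lamp: -

```python
# Bounce-diagram sketch: 10 V battery, 50 ohm lossless cable, 100 ohm lamp.
# Assumes an ideal zero-impedance source; real cable losses would speed up settling.
Z0, ZL, ZS, V_BAT = 50.0, 100.0, 0.0, 10.0

gamma_load = (ZL - Z0) / (ZL + Z0)      # +1/3 at the lamp
gamma_source = (ZS - Z0) / (ZS + Z0)    # -1 at the ideal battery

v_forward = V_BAT * Z0 / (ZS + Z0)      # first wavefront launched down the cable (10 V)
v_load = 0.0

for bounce in range(1, 9):
    v_load += v_forward * (1 + gamma_load)             # forward and reflected voltages add at the lamp
    v_forward = v_forward * gamma_load * gamma_source  # remainder returns to the battery and is re-launched
    print(f"bounce {bounce}: V_load = {v_load:8.4f} V, I_load = {1000 * v_load / ZL:7.2f} mA")
```

The printout converges on 10 V and 100 mA within a handful of bounces which, on a 50 metre cable, is indeed a matter of microseconds.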