Are real and complex analysis related in any way?

This is an interesting question, but one that might be hard to address completely. Let me see what I can do to help.


Real Power Series:

The easiest way to address the connection between the two subjects is through the study of power series, as has already been alluded to in your question. A power series is a particular kind of infinite sum (there are many different kinds of these) of the form $$ f(x) = \sum_{k=0}^{\infty}a_k x^k. $$ They get their name from the fact that we are adding together powers of $x$ with different coefficients. In real analysis the argument of such a function (the "$x$" in $f(x)$) is taken to be a real number. Depending on the coefficients multiplying the powers of $x$, we get different intervals of convergence (intervals on which the sequence of partial sums converges). For example, if $a_k$ is the $k$th Fibonacci number, then the radius of convergence ends up being $1/\phi$, where $\phi = (1+\sqrt{5})/2$ is the "golden ratio".
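
If you want to see that last claim numerically, here is a quick Python sketch (just an added illustration, using the ratio test; nothing about 40 iterations is special):

```python
# Ratio test for the power series with Fibonacci coefficients:
# F_{k+1}/F_k tends to phi, so the radius of convergence is 1/phi.
phi = (1 + 5 ** 0.5) / 2

a, b = 1, 1            # consecutive Fibonacci numbers
for _ in range(40):    # far enough for the ratio to settle
    a, b = b, a + b

print(b / a, phi)      # both ~ 1.618...
print(a / b, 1 / phi)  # both ~ 0.618..., the radius of convergence
```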

The most common kinds of power series that come up in calculus (and real analysis) are Taylor series and Maclaurin series. These series are created to represent (a portion of) some differentiable function. Let me try to make this a little more concrete. Pick some function $f(x)$ that is infinitely differentiable at a fixed value, say at $x=a$. The Taylor series corresponding to $f(x)$, centered at $a$, is given by $$ \sum_{k=0}^{\infty}\frac{f^{(k)}(a)}{k!}(x-a)^k. $$ A Maclaurin series is just a Taylor series where $a=0$. After playing around with the series a bit, you may notice a few things about it.

  1. When $x=a$, the only non-zero term in the series is the first one: the constant $f(a)$. This means that no matter what the radius of convergence is for the series, it will at least agree with the original function at this one point.
  2. Taking derivatives of the power series and evaluating them at $x=a$, we see that the power series and the original function $f(x)$ have the same derivatives at $a$. This is by construction, and not a coincidence.
  3. If the radius of convergence is positive, then the series will converge to the original function $f(x)$ on its interval of convergence. In this way, we can say that $f(x)$ "equals" its Taylor series, understanding that this equality may only hold on some interval centered at $a$, and perhaps not on the entire domain that $f(x)$ is defined on. You've already seen such an example with $(1+x^2)^{-1}$. In some extreme cases, such as with $\sin x$ and $\cos x$ as mentioned above, this equality ends up holding for ALL real values of $x$, and so the convergence is on all of $\mathbb{R}$ and there is no harm in completely equating the function with its Taylor series.
  4. Even if the radius of convergence of a particular Taylor series (centered at $a$) is finite, it does not mean you cannot take other Taylor series (centered at values other than $a$) that also have a positive radius of convergence. For example, even though the Maclaurin series for $(1+x^2)^{-1}$ does not converge outside of the interval of radius 1, you can compute the Taylor series centered at $a=1$, \begin{align} \frac{1}{1+x^2} &= \sum_{k=0}^{\infty} \left( \frac{1}{k!} \frac{d^k}{dx^k}\left(\frac{1}{1+x^2}\right)\Bigg|_{x=1}(x-1)^k \right) \\ &= \frac{1}{2}-\frac{1}{2}(x-1)+\frac{1}{4}(x-1)^2-\frac{1}{8}(x-1)^4+\cdots, \quad \text{for $|x-1|<\sqrt{2}$}. \end{align}

You will then find that this new power series converges on a slightly larger radius than 1 (that's the $\sqrt{2}$ mentioned above), and that the two power series (one centered at 0, the other centered at 1) overlap and agree for certain values of $x$.
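
If you want to verify that re-expansion without grinding through the derivatives by hand, here is a small sympy sketch (the exact printed form may differ slightly between versions):

```python
# Taylor expansion of 1/(1+x^2) about x = 1, up to order (x-1)^5
from sympy import symbols, series

x = symbols('x')
print(series(1 / (1 + x**2), x, 1, 5))
# roughly: 1/2 - (x - 1)/2 + (x - 1)**2/4 - (x - 1)**4/8 + O((x - 1)**5, (x, 1))
```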

The big takeaway you should have about power series and Taylor series is that they are one and the same. Define a function by a power series, then take its Taylor series centered at the same point; you will get the same series back. Conversely, any infinitely-differentiable function that has a Taylor series with positive radius of convergence is uniquely determined by that power series.

This is where complex analysis begins to come into play...


Complex Power Series:

Complex numbers have many properties similar to real numbers. They form an algebra (you can do arithmetic with them); the only number you still cannot divide by is zero; and the absolute value of a complex number still tells you how far that number is from 0. In particular, there is nothing stopping you from defining power series of complex numbers, with $z=x+iy$: $$ f(z) = \sum_{k=0}^{\infty}c_k z^k. $$ The only difference now is that the coefficients $c_k$ can be complex numbers, and the radius of convergence now refers to the radius of a disk (as opposed to the radius of an interval). Things may seem exactly like the real-valued situation, but there is more lurking beneath the surface.

For starters, let me define some new vocabulary terms.

  • We say that a complex function is complex differentiable at some fixed value $z=w$ if the following limit exists: $$ \lim_{z\to w}\frac{f(z)-f(w)}{z-w}. $$ If this limit exists, we denote the value of the limit by $f'(w)$. This should look familiar, as it is the same limit definition for real-valued derivatives.
  • In the same way that we could create Taylor series for real-valued functions, we can also create Taylor series (centered at $z=w$) for complex-valued functions, provided the functions have an infinite number of derivatives at $w$ (sometimes referred to as being holomorphic at $w$): $$ \sum_{k=0}^{\infty}\frac{f^{(k)}(w)}{k!}(z-w)^k. $$ If the Taylor series defined above has a positive radius of convergence then we say that $f$ is analytic at $w$. This vocabulary is also used in the case of real-valued Taylor series.

Perhaps the biggest surprise in complex analysis is that the following conditions are all equivalent:

  1. $f(z)$ is complex differentiable at every point in some small disk around $z=w$.
  2. $f(z)$ is holomorphic at $w$. That is, $f$ has an infinite number of derivatives at $z=w$. In real analysis this condition is sometimes referred to as being "smooth".
  3. $f(z)$ is analytic at $z=w$. That is, its Taylor series converges to $f(z)$ with some positive radius of convergence.

This means that being differentiable in the complex sense is a much harder thing to accomplish than in the real sense. Consider the contrast with the "real-valued" equivalents of the points made above:

  1. In real analysis there are a number of functions with only finitely many derivatives. For example, $f(x) = x^2 \sin(1/x)$ (with $f(0)=0$) has only one derivative at $x=0$, and $f'(x)$ is not even continuous there, let alone differentiable a second time.
  2. There are also real-valued functions that are smooth (infinitely-differentiable) yet are not equal to their Taylor series on any interval. An example is the function $e^{-1/x^2}$ (defined to be $0$ at $x=0$), which is smooth for every $x$, but for which every order of derivative at $x=0$ is equal to zero; this means every coefficient in the Maclaurin series is zero, so the series converges everywhere to the zero function and agrees with $e^{-1/x^2}$ only at $x=0$ (a quick symbolic check of this appears just after this list).
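
Here is that promised check, a small sympy sketch showing that the first few Maclaurin coefficients of $e^{-1/x^2}$ really do vanish (nothing is special about stopping at the fourth derivative; it just illustrates the point):

```python
# Every derivative of exp(-1/x^2) tends to 0 as x -> 0, so every
# Maclaurin coefficient f^(n)(0)/n! is 0.
from sympy import symbols, exp, diff, limit

x = symbols('x')
f = exp(-1 / x**2)

for n in range(5):
    print(n, limit(diff(f, x, n), x, 0))  # prints 0 for each n
```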

These kinds of pathologies do not occur in the complex world: one derivative on a small disk is as good as an infinite number of derivatives; differentiability near a point translates to analyticity at that point.


Laurent Series:

A natural question one might ask is: what dictates the radius of convergence of a power series? In the real-valued case things seemed fairly unpredictable. However, in the complex-valued case things are much more elegant.

Let $f(z)$ be differentiable at some point $z=w$. Then the radius of convergence for the Taylor series of $f(z)$ centered at $w$ will be the distance to the nearest complex number at which $f(z)$ fails to be differentiable. Think of it like dropping a pebble into a pool of water. The ripples will extend radially outward from the initial point of differentiability, all the way until the circular edge of the ripple hits the first "singularity" -- a point where $f(z)$ fails to be differentiable.

Take the complex version of our previous example, $$ f(z) = \frac{1}{1+z^2}. $$

This is a rational function, and will be smooth for all values of $z$ where the denominator is non-zero. Since the only roots of $z^2 + 1$ are $z = i$ and $z = -i$, $f(z)$ is differentiable/smooth/analytic at all values $w\neq \pm i$. This is precisely why the radius of convergence for the real-valued Maclaurin series is 1, as you've already noted: the shortest distance from $z=0$ to $z=\pm i$ is 1. The real-valued Maclaurin series is just a "snapshot" or "sliver" of the complex-valued Taylor series centered at $z=0$. This is also why the radius of convergence for the real-valued Taylor series increases when you move the center away from zero: the distance to $\pm i$ becomes greater, and so the complex-valued Taylor series can converge on a larger disk.
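
To see this distance-to-the-singularity rule in numbers, here is a rough sympy sketch: estimate the radius of convergence from the Taylor coefficients via the root test and compare it with $\sqrt{a^2+1}$, the distance from a real center $a$ to $\pm i$ (using a single large index $k$ as a crude stand-in for the limsup):

```python
# Root-test estimate of the radius of convergence of 1/(1+x^2) expanded
# about a real center a, compared with the distance from a to +/- i.
from sympy import symbols, diff, factorial

x = symbols('x')
f = 1 / (1 + x**2)
k = 40

for a in (0, 1, 2):
    c_k = diff(f, x, k).subs(x, a) / factorial(k)   # k-th Taylor coefficient at a
    estimate = 1 / abs(float(c_k)) ** (1.0 / k)
    print(a, round(estimate, 2), round((a**2 + 1) ** 0.5, 2))
# prints roughly: 0 1.0 1.0, then 1 1.44 1.41, then 2 2.39 2.24
```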

So now should come the question: when exactly does a complex function fail to be differentiable?

Without going into too many details from complex analysis, suffice it to say that complex functions fail to be differentiable when one of three things occurs:

  1. The function has a "singularity" (think division by zero).
  2. The function is defined in terms of the complex conjugate, $\bar{z}$. For example, $f(z) = |z|^2 = z\,\bar{z}$.
  3. The function involves logarithms or non-integer exponents (these two ideas are actually related).

Number 2. is perhaps the most egregious of the three issues, and it means that functions like $f(z) = |z|$ are actually differentiable nowhere. This is in stark contrast to the real-valued version $f(x) = |x|$, which is differentiable everywhere except at $x=0$ where there is a "corner".
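
A quick numerical illustration of just how badly $|z|$ fails (again, just an added sketch): the difference quotient from the limit definition settles on different values depending on the direction from which you approach.

```python
# Difference quotients of f(z) = |z| at w = 1, approaching from
# different directions.  No single limiting value exists, so there is
# no complex derivative there.
w = 1 + 0j
h = 1e-6

for direction in (1, 1j, (1 + 1j) / abs(1 + 1j)):
    z = w + h * direction
    print(direction, (abs(z) - abs(w)) / (z - w))
# ~1 along the real axis, ~0 along the imaginary axis,
# ~0.5 - 0.5i along the diagonal
```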

Number 3. is actually not too bad, and it turns out that these kinds of functions are usually differentiable everywhere except along certain rays or line segments. To get into it further, however, will take us too far off course.

Number 1. is the best case scenario, and is the focus of this section of our discussion. Essentially, singularities are places where division by zero has occurred, and the extent to which something has "gone wrong" can be quantified. Let me try to elaborate.

Consider the previous example of $f(z) = (1+z^2)^{-1}$. Again, since the denominator factors into the product $(z+i)(z-i)$, this means we could "erase" the singularity at $z=i$ by multiplying the function by a copy of $z-i$. In other words, if $$ g(z) = (z-i)f(z) = \frac{z-i}{1+z^2}, $$ then $g(z) = \dfrac{1}{z+i}$ for all $z\neq i$, and $\lim_{z\to i}g(z) = \dfrac{1}{2i} = -\dfrac{i}{2}$ exists; the singularity has been removed.

Similarly, if $f(z) = 1/(1+z^2)^3$, then we again have singularities at $z=\pm i$. This time, however, multiplying $f(z)$ by only one copy of $z-i$ will not remove the singularity at $z=i$. Instead, we would need to multiply by three copies to get $$ g(z) = (z-i)^3 f(z) = \frac{(z-i)^3}{(1+z^2)^3}, $$ which means that $g(z) = \dfrac{1}{(z+i)^3}$ for all $z\neq i$, and that $\lim_{z\to i}g(z) = \dfrac{1}{(2i)^3} = \dfrac{i}{8}$ exists.
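
If you want to double-check those two limits, a quick arithmetic sketch in Python will do: just evaluate the products at a point very close to $z=i$.

```python
# Evaluate (z - i)/(1 + z^2) and (z - i)^3/(1 + z^2)^3 near z = i.
z = 1j + 1e-6                       # a point near the singularity

print((z - 1j) / (1 + z**2))        # ~ -0.5j,  i.e. 1/(2i)
print((z - 1j)**3 / (1 + z**2)**3)  # ~ 0.125j, i.e. 1/(2i)^3
```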

Singularities like this --- ones that can be "removed" through multiplication by a finite number of linear terms --- are called poles. The order of the pole is the minimum number of linear terms needed to remove the singularity.

The real-valued Maclaurin series for $\sin x$ is given by $$ \sin x = \sum_{k=0}^{\infty} \frac{(-1)^{k}}{(2k+1)!}x^{2k+1}, $$ and has an infinite radius of convergence. This means that the complex version $$ \sin z = \sum_{k=0}^{\infty} \frac{(-1)^k}{(2k+1)!}z^{2k+1} = z - \frac{1}{3!}z^3 + \frac{1}{5!}z^5 - \cdots $$ also has an infinite radius of convergence (such functions are called entire) and hence no singularities. From here it's easy to see that the function $(\sin z)/z$ is analytic as well, with Taylor series $$ \frac{\sin z}{z} = \sum_{k=0}^{\infty} \frac{(-1)^k}{(2k+1)!}z^{2k} = 1 - \frac{1}{3!}z^2 + \frac{1}{5!}z^4 - \cdots $$

However, a function like $(\sin z)/z^3$ is not analytic at $z=0$, since dividing $\sin z$ by $z^3$ would give us the following expression: $$ \frac{\sin z}{z^3} = \frac{1}{z^2} - \frac{1}{3!} + \frac{1}{5!}z^2 - \cdots $$

But notice that if we were to subtract the term $1/z^2$ from both sides we would be left again with a proper Taylor series $$ \frac{\sin z}{z^3} - \frac{1}{z^2} = \frac{\sin z - z}{z^3} = -\frac{1}{3!} + \frac{1}{5!}z^2 - \frac{1}{7!}z^4 + \cdots $$

This idea of extending Taylor series to include terms with negative powers of $z$ is what is referred to as a Laurent series. A Laurent series is a power series in which the powers of $z$ are allowed to take on negative values as well as positive: $$ f(z) := \sum_{k = -\infty}^{\infty} c_k z^k $$

In this way we can expand complex functions around singular points in a fashion similar to expanding around analytic points.
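
For what it's worth, sympy's series() will happily produce the Laurent expansion of $(\sin z)/z^3$ worked out above, negative power and all (the printed ordering may vary between versions):

```python
from sympy import symbols, sin, series

z = symbols('z')
print(series(sin(z) / z**3, z, 0, 6))
# roughly: z**(-2) - 1/6 + z**2/120 - z**4/5040 + O(z**6)
```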

A pole, it turns out, is a singular point for which there are only a finite number of terms with negative powers, such as with $(\sin z)/z^3$. If, however, an infinite number of negative powers are needed to fully express a Laurent series, then this type of singular point is called an essential singularity. An excellent example of such a function can be made by taking an analytic function (one with a Taylor series) and replacing $z$ by $1/z$: \begin{align} \sin(1/z) &= \sum_{k=0}^{\infty}\frac{(-1)^k}{(2k+1)!}(1/z)^{2k+1} \\ &= \sum_{k=0}^{\infty}\frac{(-1)^k}{(2k+1)!}z^{-(2k+1)} \\ &= \frac{1}{z} - \frac{1}{3!\,z^3} + \frac{1}{5!\,z^5} - \cdots \end{align}

These kinds of singularities are quite severe and the behavior of complex functions around such a point is rather erratic. This also explains why the real-valued function $e^{-1/x^2}$ was so pathological. The Taylor series for $e^z$ is given by $$ e^z = \sum_{k=0}^{\infty}\frac{1}{k!}z^k $$ and so \begin{align} e^{-1/z^2} &= \sum_{k=0}^{\infty}\frac{1}{k!}(-1/z^2)^k \\ &= \sum_{k=0}^{\infty}\frac{1}{k!}(-1)^k z^{-2k} \\ &= \sum_{k=-\infty}^{0}\frac{1}{(-k)!}(-1)^{-k} z^{2k}. \end{align}

Hence there is an essential singularity at $z=0$, and so even though the real-valued version is smooth at $x=0$, there is no hope of differentiability in a disk around $z=0$.
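
To get a feel for how erratic things are near this essential singularity, here is another small numerical sketch: approach $z=0$ along the real axis and $e^{-1/z^2}$ dies off to $0$, but approach along the imaginary axis and it blows up.

```python
import cmath

for t in (0.5, 0.2, 0.1):
    along_real = cmath.exp(-1 / (t + 0j)**2)   # z = t on the real axis
    along_imag = cmath.exp(-1 / (t * 1j)**2)   # z = it on the imaginary axis
    print(t, abs(along_real), abs(along_imag))
# the real-axis values shrink toward 0 while the imaginary-axis values explode
```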


Both real analyticity and complex analyticity can be defined as "locally given by a convergent power series". In the complex case, there are several equivalent definitions that, at first sight, look quite different. Differentiability is one; the Cauchy integral formula is another. (In the world of real numbers, differentiability (even infinite differentiability) is much weaker than analyticity, and there's no analog of the Cauchy integral formula.)

Any real analytic function, say with domain of analyticity $D\subseteq\mathbb R$, is the restriction to $D$ of a complex analytic function whose domain of analyticity in $\mathbb C$ includes $D$. The proof is just to use the same power series but now with complex values of the input variable.

The material about $e^{ix}$ quoted in the question is the special case where you start with the exponential function. It's analytic on the whole real line, and so its power series expansions about various real numbers give a complex analytic extension on a domain in $\mathbb C$ that includes $\mathbb R$. The exponential function, however, is unusually nice, in that its power series expansion around a single point, for example the expansion $e^x=\sum_{n=0}^{\infty}x^n/n!$ around $0$, has infinite radius of convergence, so by plugging in complex values for $x$, we get a complex analytic function defined on all of $\mathbb C$. In particular, we can plug in a pure imaginary value $ix$ (with real $x$) in place of $x$ and get the power series $e^{ix}=\sum_{n=0}^{\infty}(ix)^n/n!$. The rest of the quoted material is just separating the real and imaginary parts of this sum. The terms with even $n$ have an even power of $i$ so they're real, while the terms with odd $n$ have an odd power of $i$ so they're purely imaginary. After separating the sum in this way and factoring $i$ out of the imaginary part, one finds that the real part is just (the power series of) $\cos x$ and the factor multiplying $i$ is (the power series of) $\sin x$, recovering $e^{ix}=\cos x+i\sin x$.
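
Here is a tiny numerical version of that last computation (just an added check, with an arbitrary test value of $x$): sum the partial series of $e^{ix}$ and separate its real and imaginary parts.

```python
import cmath, math

x = 0.7                                      # an arbitrary real test value
s = sum((1j * x) ** n / math.factorial(n) for n in range(40))

print(s.real, math.cos(x))                   # real part matches cos(x)
print(s.imag, math.sin(x))                   # imaginary part matches sin(x)
print(s, cmath.exp(1j * x))                  # and the whole sum is e^{ix}
```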