On what interval does a Taylor series approximate (or equal?) its function?

Let $f$ be an infinitely differentiable function, and let $T(x) = \sum_{n=0}^{\infty} a_n x^n$ be its Taylor series (say at $x = 0$).

1) The Taylor series of $f$ need not converge at any point other than $x = 0$. Indeed, by a famous theorem of Borel, for any sequence $\{a_n\}_{n=0}^{\infty}$ of real numbers whatsoever, there exists an infinitely differentiable function with Taylor series equal to $\sum_{n=0}^{\infty} a_n x^n$. If the $a_n$ grow too rapidly -- e.g. $a_n = n!$ -- then the series converges only at $x = 0$.
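For the record, the divergence for $a_n = n!$ is a one-line ratio test computation: for any fixed $x \neq 0$,
$$\left| \frac{(n+1)!\, x^{n+1}}{n!\, x^n} \right| = (n+1)|x| \longrightarrow \infty,$$
so the terms of the series do not even tend to $0$, and the series diverges at every $x \neq 0$.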

2) Even if the Taylor series has a positive radius of convergence $R$, there is no guarantee that $f(x)$ and $T(x)$ are equal on $(-R,R)$. A function for which $f(x) = T(x)$ in some open interval around $0$ is said to be analytic at $x = 0$. (It should be mentioned that most of the familiar functions one encounters in calculus are analytic.) Perhaps the simplest example of a non-analytic function is $f(x) = e^{-1/x^2}$ for $x \neq 0$, with $f(0) = 0$: here $f$ and all of its derivatives vanish at $x = 0$, so $T(x) \equiv 0$, yet $f(x)$ is clearly positive at every $x \neq 0$.
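If you want to see the vanishing derivatives for yourself, here is a minimal SymPy sketch (the name `f` and the range of $n$ are just for illustration). Strictly speaking $f^{(n)}(0)$ is defined via difference quotients, but a standard argument shows it equals the limit of $f^{(n)}(x)$ as $x \to 0$, which is what we compute:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1/x**2)

# Each derivative of exp(-1/x^2) is a rational function of x times
# exp(-1/x^2), and the exponential factor beats any power of 1/x
# as x -> 0, so every limit below should come out 0.
for n in range(5):
    print(n, sp.limit(sp.diff(f, x, n), x, 0))
```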

Added: To show that a function is equal to its Taylor series on some interval $I$, one has to show, for each $x$ in $I$, that the remainder $R_n(x) = f(x) - T_n(x) = f(x) - \sum_{k=0}^n \frac{f^{(k)}(0)}{k!} x^k$ approaches $0$ as $n$ approaches infinity. The most useful basic tool for this is Taylor's Formula for the Remainder. The issue comes down to having a good understanding of the growth of the derivatives of $f$ on $I$ as $n$ increases. For instance, if there exists a fixed $M$ such that $|f^{(n)}(a)| \leq M$ for all $n$ and all $a \in I$, then Taylor's Formula immediately implies that $R_n(x) \rightarrow 0$ for all $x \in I$. This is the case for instance for $f(x) = \cos x, \sin x, e^x$ (in the last case we need to assume that the interval $I$ is bounded, although we can take it as large as we want, so the eventual conclusion is that $e^x$ is equal to its Taylor series on the entire real line). In general this can be a hard problem, for instance because the conclusion need not be true! Books on advanced calculus / elementary real analysis will have some worked examples.
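For definiteness, the Lagrange form of the remainder says that for each $x \in I$ there is some $\xi$ between $0$ and $x$ with
$$R_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}\, x^{n+1}, \qquad \text{so under the bound above} \qquad |R_n(x)| \leq \frac{M\, |x|^{n+1}}{(n+1)!} \longrightarrow 0,$$
since the factorial in the denominator eventually outgrows any fixed geometric sequence $|x|^{n+1}$.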

More Added: Gosh, for a few minutes there I forgot about the thousand pages or so of lecture notes I have online! In particular, see here for some further details on applying Taylor's Formula with Remainder. However, this discussion is at a somewhat higher level than strictly necessary (it came from a second-semester undergraduate real analysis course). As it happens, starting in January I'll be teaching a sophomore-level course on sequences and series, so in the fullness of time I might have more detailed notes. Anyway, I'm sure that a little googling will find plenty of better notes on this topic... (In fact, if you flip towards the back of a good calculus book, you'll certainly find this material there.)

Yet More Added: In response to a direct request, here are the lecture notes generated by my teaching the sophomore-level course on sequences and series. The Chapters labelled 0 and 1 were not actually used for the course and are probably wildly inappropriate for most American undergraduate courses on the subject. Chapters 2 and 3 seem (to me, anyway) much more on point, and in fact much of Chapter 3 is a reworking of the older set of notes I linked to above.


Since you speak about intervals (on the real line), perhaps it should also be mentioned that the "natural habitat" for power series is really the complex plane; computing a power series involves only +, -, *, /, and limits, which are well-defined operations on complex numbers. And for so-called "complex analytic" (or "holomorphic") functions, which include most functions that you encounter in calculus, it is a fact that the Taylor series at any point in the complex plane converges (and equals the function) in a disk around that point. The size of this disk is such that it exactly reaches out to the nearest singularity of the function. (Disks in the complex plane are the counterparts to intervals on the real line in this context.)

A simple example is $f(x)=1/(1+x^2)$. If you just look at the graph of this function (for $x$ real) it looks perfectly nice, and there seems to be no reason why the Taylor series at $x=0$ only manages to converge to the function in the interval $(-1,1)$. But if you think of $x$ as a complex variable, it's clear that there are singularities (division by zero) at the points $x=\pm i$, which lie at distance one from the origin, and that explains why the Taylor series converges inside the circle with radius one. (Note that the intersection of this disk with the real axis is just the interval $(-1,1)$.)
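A small numerical experiment makes the dichotomy visible (a sketch; the cutoff `N = 30` and the sample points are arbitrary). The Taylor series of $1/(1+x^2)$ at $0$ is the geometric series $\sum_{n=0}^{\infty} (-1)^n x^{2n}$, whose partial sums settle down for $|x| < 1$ and blow up for $|x| > 1$:

```python
# Partial sums of the Taylor series of 1/(1+x^2) at 0:
# sum_{n=0}^{N} (-1)^n x^(2n).
def taylor_partial_sum(x, N=30):
    return sum((-1)**n * x**(2*n) for n in range(N + 1))

for x in (0.5, 0.9, 1.1, 2.0):
    exact = 1 / (1 + x**2)
    print(f"x = {x}: partial sum = {taylor_partial_sum(x):.6g}, "
          f"exact = {exact:.6g}")
```

For $x = 0.5$ and $x = 0.9$ the partial sums agree with the exact value to several digits, while for $x = 1.1$ and $x = 2$ they are already enormous.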

The same thing goes for $g(x)=\arctan x$. Its derivative is the $f(x)$ above, and where the derivative has singularities, the function does too. So the Taylor series for $g(x)$ at $x=0$ also converges inside the unit circle.
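Explicitly, integrating the geometric series for $1/(1+t^2)$ term by term (which is legitimate inside the disk of convergence) recovers the familiar series:
$$\arctan x = \int_0^x \frac{dt}{1+t^2} = \int_0^x \sum_{n=0}^{\infty} (-1)^n t^{2n}\, dt = \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{2n+1}, \qquad |x| < 1.$$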


No, in fact the Taylor series may not even converge to the function. Consider

$$f(x) = \begin{cases} e^{-1/x^2} & \mbox{if } x \ne 0 \\ 0 & \mbox{if } x = 0. \end{cases}$$

All derivatives of $f$ vanish at $x = 0$, so its Taylor series at $0$ is identically zero, yet $f(x) > 0$ for every $x \neq 0$.