Newton's law requires two initial conditions, while the Taylor series requires infinitely many!

On the other hand, only two initial conditions, $x(0)$ and $\dot x(0)$, are required to obtain the function $x(t)$ by solving Newton's equation.

For notational simplicity, let

$$x_0 = x(0)$$ $$v_0 = \dot x(0)$$

and then write the Taylor series and Newton's equation as

$$x(t) = x_0 + v_0t + \ddot x(0)\frac{t^2}{2!} + \dddot x(0)\frac{t^3}{3!} + \cdots$$

$$m\ddot x(t) = F(x,\dot x,t)$$

Now, see that

$$\ddot x(0) = \frac{F(x_0,v_0,0)}{m}$$

$$\dddot x(0) = \frac{\dot F(x_0,v_0,0)}{m}$$

and so on. Thus

$$x(t) = x_0 + v_0t + \frac{F(x_0,v_0,0)}{m}\frac{t^2}{2!} + \frac{\dot F(x_0,v_0,0)}{m}\frac{t^3}{3!} + \cdots$$

In other words, the initial values of the second- and higher-order time derivatives of $x(t)$ are determined by $F(x,\dot x, t)$ together with $x_0$ and $v_0$.
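This construction can be carried out explicitly for a concrete force. Here is a minimal sketch, assuming (purely for illustration) the harmonic force $F(x,\dot x,t) = -kx$ with $m = 1$, where repeated differentiation of Newton's equation gives the recursion $x^{(n+2)}(0) = -k\,x^{(n)}(0)$:

```python
from math import factorial, cos

# Illustrative harmonic force F(x, v, t) = -k*x with m = 1 (assumed example).
# With x0 = 1, v0 = 0 and k = 1 the exact solution is x(t) = cos(t).
k, x0, v0 = 1.0, 1.0, 0.0

# derivs[n] = x^(n)(0); seed with the two initial conditions.
derivs = [x0, v0]
for n in range(2, 12):
    derivs.append(-k * derivs[n - 2])   # x^(n)(0) = -k * x^(n-2)(0)

def taylor(t, derivs):
    """Truncated Taylor series built only from x0, v0 and the force law."""
    return sum(d * t**n / factorial(n) for n, d in enumerate(derivs))

print(taylor(0.5, derivs))   # agrees closely with cos(0.5)
```

Every coefficient beyond the first two comes from the equation of motion, which is exactly the point of the argument above.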


FGSUZ has given part of the answer in his comment, but he has not given full details.

Consider $\ddot{x} (t)=F(x,\dot{x},t)$. Here you have the second derivative in terms of lower-order quantities, so you can use it to eliminate the second derivative in favor of $x$, $\dot{x}$, and $t$.

You can then take the time derivative of this equation. This gives you the third-order time derivative of $x$ in terms of lower-order derivatives, and you can use the original equation and its derivative to write everything in terms of at most the first derivative.

So, order by order, you can construct the Taylor expansion.

Now the general case may require you to deal with derivatives of $F(x,\dot{x},t)$. That is because you need the following total time derivative (if I've recalled my calculus correctly).

$$\frac{d^3 x}{dt^3}=\dot{F}(x,\dot{x},t)= \frac{\partial}{\partial x}F(x,\dot{x},t) \frac{dx} {dt} + \frac{\partial}{\partial \dot{x}}F(x,\dot{x},t) \frac{d\dot{x}} {dt} +\frac{\partial} {\partial t}F(x,\dot{x},t)$$

This will often not be explicitly solvable. However, it also can be Taylor expanded in a similar fashion. And, at each order you keep only the corresponding order in the expansion of this equation.
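As a concrete check of this chain rule, here is a small sketch assuming (for illustration only) the driven force $F(x,\dot x,t) = -kx + at$ with $m=1$. Its partial derivatives are $\partial_x F = -k$, $\partial_{\dot x} F = 0$, $\partial_t F = a$, so the formula above gives $\dddot x(0) = -k v_0 + a$, and differentiating further yields $x^{(n+2)}(0) = -k\,x^{(n)}(0)$ for $n \ge 2$:

```python
from math import factorial, cos, sin

# Illustrative force F(x, v, t) = -k*x + a*t with m = 1 (assumed example).
k, a = 1.0, 0.5
x0, v0 = 1.0, 0.0

# Taylor coefficients at t = 0 from the chain rule:
#   x''(0)  = F(x0, v0, 0)           = -k*x0
#   x'''(0) = Fx*v0 + Fv*x''(0) + Ft = -k*v0 + a
# and x^(n+2)(0) = -k * x^(n)(0) for n >= 2.
derivs = [x0, v0, -k * x0, -k * v0 + a]
for n in range(2, 12):
    derivs.append(-k * derivs[n])

def taylor(t):
    return sum(d * t**n / factorial(n) for n, d in enumerate(derivs))

# Closed-form solution for k = 1: x(t) = x0*cos(t) + (v0 - a)*sin(t) + a*t.
exact = lambda t: x0 * cos(t) + (v0 - a) * sin(t) + a * t
print(taylor(0.4), exact(0.4))
```

The truncated series built purely from $x_0$, $v_0$ and $F$ matches the closed-form solution to high accuracy.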

So, order by order, you can construct the Taylor series. At each step you can use the equation of motion to remove all but the $x$, $\dot{x}$, and $t$ dependence. And so you will only need two initial conditions. Tedious, but possible.

The nice cases are those few where you can derive a simple formula that gives an easy recursion. For simple forms of $F$, you might find that the $(n+1)$-th derivative is a simple function of the $n$-th derivative. In such cases this is potentially useful in numerical solutions, since you can write things in terms of the time step and a nice Taylor expansion. Though, even in such cases, there are often more efficient methods.
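As a sketch of that numerical use, again assuming the simple harmonic force $F=-x$ with $m=1$ (so that $x^{(n+2)} = -x^{(n)}$ is the easy recursion), each step rebuilds the local Taylor coefficients from the current $(x, \dot x)$ and advances by the time step $h$:

```python
from math import factorial, cos, sin

# Taylor-series time stepper for the assumed example F(x, v, t) = -x, m = 1.
def taylor_step(x, v, h, order=8):
    # Local Taylor coefficients d[n] = x^(n)(t) from the recursion.
    d = [x, v]
    for n in range(order - 1):
        d.append(-d[n])                      # x^(n+2) = -x^(n)
    x_new = sum(d[n] * h**n / factorial(n) for n in range(order + 1))
    # Differentiating the series term by term gives the new velocity.
    v_new = sum(d[n + 1] * h**n / factorial(n) for n in range(order))
    return x_new, v_new

x, v, h = 1.0, 0.0, 0.1
for _ in range(10):                          # integrate from t = 0 to t = 1
    x, v = taylor_step(x, v, h)
print(x, cos(1.0))                           # should agree closely
```

Only $(x, \dot x)$ is carried between steps; the higher coefficients are regenerated from the equation of motion each time, exactly as the order-by-order argument suggests.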


A power series expansion does not hold for all functions $f(t)$ or for all $t\in\mathbb{R}$, but only for real analytic functions and for $t$ within the radius of convergence. In particular, it does not hold at any point for functions in $C^2(\mathbb{R},\mathbb{R}^d)\smallsetminus C^3(\mathbb{R},\mathbb{R}^d)$. Therefore it is not possible to define an arbitrary function by giving countably many real numbers $(x^{(n)}(0))_{n\in\mathbb{N}}$.

In particular, Newton's equation may have solutions in $C^2(\mathbb{R},\mathbb{R}^d)\smallsetminus C^3(\mathbb{R},\mathbb{R}^d)$, which therefore do not admit a power series expansion, or more generally solutions that are not real analytic for all times and therefore do not always admit a Taylor expansion. Nonetheless, these functions are uniquely determined by two real numbers ($x(0)$, $\dot{x}(0)$) and by being solutions of Newton's equation (i.e. they are also determined by $m$ and the functional form $F$ of the force).
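A simple illustrative example (not from the original answer): take the purely time-dependent force $F(x,\dot x,t) = m|t|$. Newton's equation then integrates directly to

$$x(t) = x_0 + v_0 t + \frac{|t|^3}{6},$$

which is $C^2$ but not $C^3$ at $t=0$, since $\dddot x(t) = \operatorname{sgn}(t)$ has no limit there. No power series at $t=0$ can represent this solution, yet it is still uniquely fixed by the two numbers $x_0$ and $v_0$.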

In case a solution of Newton's equation is real analytic, the values of its higher-order derivatives at zero are determined uniquely by the solution itself, and thus they too depend only on $x(0)$, $\dot{x}(0)$, $m$ and $F$; no further knowledge is required.