When Functions Are Equal to Their Taylor Series

So far we have assumed that we could find a power series representation for functions.  However, some functions are not equal to their Taylor series; that is, they are not analytic.  How can we tell which functions are analytic and which are not?

We note that $f(x)$ is equal to its Taylor series if $\displaystyle\lim_{n\to\infty}T_n(x)=f(x)$, i.e., if the sequence of partial sums $T_n(x)$ converges to $f(x)$.

We define the remainder of the series to be $R_n(x)=f(x)-T_n(x)$, so that $f(x)=T_n(x)+R_n(x)$.  From this we see that a function is equal to its Taylor series when its remainder converges to 0; that is, if $f$ can be differentiated infinitely many times and $$ \lim_{n\to\infty}R_n(x)=0, $$ then $f$ is equal to its Taylor series.
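
To spell out why this criterion works: since $T_n(x)=f(x)-R_n(x)$, $$\lim_{n\to\infty}T_n(x)=f(x)-\lim_{n\to\infty}R_n(x)=f(x)-0=f(x),$$ which is exactly the condition above for $f$ to equal its Taylor series.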

We have two theorems that help determine whether this remainder converges to zero: the first gives a formula for $R_n(x)$, and the second gives a bound on it.

(Remainder) Theorem:  Let $f(x)=T_n(x)+R_n(x)$.  If $f^{(n+1)}$ is continuous on an open interval $I$ that contains $a$ and $x$, then $$R_n(x)=\frac{f^{(n+1)}(z)}{(n+1)!}(x-a)^{n+1}$$for some $z$ between $a$ and $x$.
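
For example, if $f(x)=e^x$ and $a=0$, then every derivative of $f$ is again $e^x$, so the theorem says $$R_n(x)=\frac{e^{z}}{(n+1)!}\,x^{n+1}$$ for some $z$ between $0$ and $x$; this is the remainder we bound next.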

Taylor's Inequality: If $\lvert f^{(n+1)}(x)\rvert\le M$ for $\lvert x-a\rvert\le d$, i.e., if the $(n+1)$st derivative of $f$ is bounded by $M$ on an interval of radius $d$ around $x=a$, then
 $$\lvert R_n(x)\rvert\le\frac{M}{(n+1)!}\lvert x-a\rvert^{n+1}\qquad\text{for }\lvert x-a\rvert\le d.$$
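
As a worked instance of the inequality: every derivative of $\sin(x)$ is $\pm\sin(x)$ or $\pm\cos(x)$, so we may take $M=1$ on any interval around $a=0$, and the inequality gives $$\lvert R_n(x)\rvert\le\frac{\lvert x\rvert^{n+1}}{(n+1)!},$$ which tends to $0$ for every fixed $x$, since the factorial in the denominator eventually grows faster than the power in the numerator.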

From this inequality, we can determine that the remainders for $e^x$ and $\sin(x)$, for example, go to zero as $n \to \infty$ (see the video below), so these functions are analytic and are equal to their Taylor series.
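
For a quick numerical sanity check of this convergence, here is a small sketch (not part of the original notes; the helper names `taylor_exp` and `taylor_sin` are just illustrative) comparing Taylor partial sums centered at $a=0$ with Python's built-in `math.exp` and `math.sin`:

```python
import math

def taylor_exp(x, n):
    # Degree-n Taylor partial sum of e^x centered at a = 0.
    return sum(x**k / math.factorial(k) for k in range(n + 1))

def taylor_sin(x, n):
    # Degree-n Taylor polynomial of sin(x) centered at a = 0:
    # only the odd powers x^(2k+1) with 2k+1 <= n appear.
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range((n - 1) // 2 + 1))

x = 3.0
for n in (2, 5, 10, 20):
    # |R_n(x)| = |f(x) - T_n(x)| for each function.
    print(n,
          abs(math.exp(x) - taylor_exp(x, n)),
          abs(math.sin(x) - taylor_sin(x, n)))
```

The printed errors are the remainders $\lvert R_n(x)\rvert$, and they shrink rapidly even at a point such as $x=3$ that is fairly far from the center $a=0$.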

The Remainder Theorem is similar to Rolle's theorem and the Mean Value Theorem, both of which involve a mystery point between $a$ and $b$.  The proof of Taylor's theorem involves repeated application of Rolle's theorem, as is explained in this video.