So far we have assumed that we could find a power series
representation for functions. However, some functions are not equal to their Taylor
series, i.e. are not
analytic. How can we tell which are and which
aren't?
We note that f(x) is equal to its Taylor series if $\lim_{n\to\infty} T_n(x) = f(x)$, i.e., the sequence of partial sums $T_n(x)$ converges to $f(x)$.
We define the remainder of the series by $R_n(x) = f(x) - T_n(x)$, so that $f(x) = T_n(x) + R_n(x)$. From this we can see that a function is equal to its Taylor series exactly when its remainder converges to 0: if a function $f$ can be differentiated infinitely many times and $\lim_{n\to\infty} R_n(x) = 0$, then $f$ is equal to its Taylor series.
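To make this criterion concrete, here is a minimal numerical sketch (in Python, standard library only; the function name is ours, chosen for illustration) that computes the Maclaurin partial sums $T_n(x)$ of $f(x) = e^x$ at $x = 1$ and watches the remainder $R_n(x) = f(x) - T_n(x)$ shrink toward 0:

```python
import math

def taylor_partial_sum(x, n):
    """T_n(x) for f(x) = e^x centered at a = 0: the sum of x^k / k! for k = 0..n."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

x = 1.0
for n in range(0, 11, 2):
    T = taylor_partial_sum(x, n)
    R = math.exp(x) - T  # remainder R_n(x) = f(x) - T_n(x)
    print(f"n = {n:2d}   T_n(1) = {T:.10f}   R_n(1) = {R:.2e}")
```

Each extra term shrinks the remainder by roughly another factor of $1/(n+1)$ here, which is the factorial decay that the theorems below make precise.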
We have two theorems to help determine whether this remainder converges to zero: one gives a formula for $R_n(x)$, and the other gives a bound on it.
(Remainder) Theorem: Let $f(x) = T_n(x) + R_n(x)$. If $f^{(n+1)}$ is continuous on an open interval $I$ that contains $a$ and $x$, then
$$R_n(x) = \frac{f^{(n+1)}(z)}{(n+1)!}(x-a)^{n+1}$$
for some $z$ between $a$ and $x$.
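As a quick worked instance (our own illustration, not from the source): take $f(x) = e^x$ centered at $a = 0$. Every derivative of $e^x$ is again $e^x$, so the formula specializes as follows:

```latex
% Lagrange form of the remainder for f(x) = e^x centered at a = 0,
% using f^{(n+1)}(z) = e^z for some z between 0 and x:
\[
  R_n(x) = \frac{e^{z}}{(n+1)!} \, x^{n+1}.
\]
```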
Taylor's Inequality: If the $(n+1)$st derivative of $f$ is bounded by $M$ on an interval of radius $d$ around $x = a$, then $|R_n(x)| \le \frac{M}{(n+1)!}|x-a|^{n+1}$ for $|x-a| \le d$.
From this inequality, we can determine that the remainders for $e^x$ and $\sin(x)$, for example, go to zero as $n \to \infty$ (see the video below), so these functions are analytic and are equal to their Taylor series.
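To sketch the standard argument for $\sin(x)$ (the details are in the video): every derivative of $\sin(x)$ is $\pm\sin(x)$ or $\pm\cos(x)$, so we may take $M = 1$ for every $x$, and Taylor's Inequality forces the remainder to vanish:

```latex
% Taylor's Inequality for f(x) = sin(x) at a = 0 with M = 1;
% (n+1)! eventually outgrows |x|^{n+1} for any fixed x, so:
\[
  |R_n(x)| \le \frac{|x|^{n+1}}{(n+1)!} \longrightarrow 0
  \quad (n \to \infty).
\]
```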
The Remainder Theorem is similar to Rolle's theorem and the Mean
Value Theorem, both of which involve a mystery point between a
and b. The proof of Taylor's theorem involves repeated
application of Rolle's theorem, as is explained in this video.