What is the interval of convergence of the Taylor series of f(x)=1/(1-x)?

Mar 28, 2018

The Taylor Series of f(x)=1/(1-x) centered at a=0 is

sum_(n=0)^oo x^n = 1 + x + x^2 + x^3 + cdots

which converges for -1 < x < 1

Explanation:

The general formula for the Taylor Series of f(x) centered at a is

sum_(n=0)^oo (f^((n))(a))/(n!)(x-a)^n

Let's find the general form for the nth derivative of f(x)=1/(1-x)

f'(x)=1/(1-x)^2

f''(x)=2/(1-x)^3

f'''(x)=(2*3)/(1-x)^4

We can see the pattern emerging.

The formula, which can be verified by induction, is

f^((n))(x)=(n!)/(1-x)^(n+1)
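
If you want to double-check this pattern by machine, here is a quick sketch using SymPy (an assumption on my part; this check is not part of the derivation itself):

```python
# Sanity check of the conjectured pattern f^(n)(x) = n!/(1-x)^(n+1)
# using SymPy (assumed installed).
from sympy import symbols, diff, factorial, simplify

x = symbols('x')
f = 1 / (1 - x)

for n in range(5):
    nth_derivative = diff(f, x, n)                 # n-th derivative of 1/(1-x)
    conjectured = factorial(n) / (1 - x)**(n + 1)  # the claimed closed form
    # simplify(...) == 0 confirms the two expressions agree symbolically
    assert simplify(nth_derivative - conjectured) == 0

print("f^(n)(x) = n!/(1-x)^(n+1) holds for n = 0..4")
```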

For simplicity, let's center our Taylor Series at a=0

Then f^((n))(a) is:

f^((n))(0)=(n!)/(1-0)^(n+1)=n!

and the Taylor Series becomes:

sum_(n=0)^oo (n!)/(n!) (x-0)^n = sum_(n=0)^oo x^n
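
As a numeric sanity check (a minimal sketch, not part of the answer itself), the partial sums of sum_(n=0)^oo x^n should approach 1/(1-x) whenever abs(x) < 1:

```python
# Partial sums of sum x^n compared against 1/(1-x) for a few |x| < 1.
def partial_sum(x, terms):
    """Sum of x^n for n = 0 .. terms-1."""
    return sum(x**n for n in range(terms))

for x in (0.5, -0.5, 0.9):
    approx = partial_sum(x, 50)
    exact = 1 / (1 - x)
    print(f"x = {x:5.2f}: partial sum = {approx:.6f}, 1/(1-x) = {exact:.6f}")
```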

Now that we have our series, let's use the ratio test to check for convergence.

lim_(n->oo)abs(x^(n+1)/x^n)=abs(x)

Therefore the series converges for abs(x) < 1 and diverges for abs(x) > 1

This can be expressed as -1 < x < 1

Let's check the endpoints of this interval.

Check x=-1

rArr sum_(n=0)^oo (-1)^n = 1 - 1 + 1 - 1 + cdots

rarr Does not converge: the partial sums alternate between 1 and 0, so they have no limit.

Check x=1

rArr sum_(n=0)^oo 1^n = 1 + 1 + 1 + 1 + cdots

rarr Diverges to +oo, since the partial sums grow without bound.
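
A short sketch (again just an illustration) makes the endpoint behavior concrete: at x = -1 the partial sums oscillate between 1 and 0, while at x = 1 they grow without bound:

```python
# Partial sums of sum x^n at the two endpoints of the interval.
for x in (-1, 1):
    s = 0
    sums = []
    for n in range(8):
        s += x**n
        sums.append(s)
    print(f"x = {x:2d}: partial sums = {sums}")
# x = -1: [1, 0, 1, 0, 1, 0, 1, 0]  -> no limit, so the series diverges
# x =  1: [1, 2, 3, 4, 5, 6, 7, 8]  -> grows without bound
```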

So the final answer is:

sum_(n=0)^oo x^n converges for -1 < x < 1, so the interval of convergence is (-1, 1)