I'm working through a power series problem from Calc 3 and came across something I can't explain.
I want to find the Taylor series expansion and radius of convergence for f(x) = x / (1 + 2x^2).
So I factor x out of the expression and work on getting the power series of g(x) = 1/(1 + 2x^2).
One way to do this is to write it as (1 + 2x^2)^(-1) = (1 - (-2x^2))^(-1) and then write out the geometric series whose terms are powers of r = -2x^2. The condition for convergence of a geometric series, |r| = |2x^2| < 1, lets me compute the radius of convergence as 1/sqrt(2). The Taylor series is then x times the power series for g(x).
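As a sanity check on this first approach, here's a quick Python sketch (the function names are my own, not part of the problem) comparing a partial sum of x * Σ (-2x^2)^n against f(x) at a point inside |x| < 1/sqrt(2):

```python
def f(x):
    # the original function f(x) = x / (1 + 2x^2)
    return x / (1 + 2 * x**2)

def taylor_partial_sum(x, N):
    # x times the partial geometric series: x * sum of (-2x^2)^n for n = 0..N
    return x * sum((-2 * x**2) ** n for n in range(N + 1))

x = 0.5  # inside the radius of convergence, |x| < 1/sqrt(2) ≈ 0.707
print(abs(taylor_partial_sum(x, 50) - f(x)) < 1e-12)  # True: partial sums converge to f
```

At x = 0.5 the ratio is r = -0.5, so the partial sums converge rapidly; at any |x| > 1/sqrt(2) the terms grow and the partial sums blow up, consistent with the radius computed above.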
BUT, when I do this problem a different way, I get a different result: g(x) = 1/(1 + 2x^2) = -1/(-1 - 2x^2) = -1/(1 - 2(1 + x^2)). Now, however, no value of x satisfies the condition needed to expand this as a geometric series, since |2(1 + x^2)| < 1 holds for no real x.
I'm not quite sure why the power series doesn't work out this way.