Views expressed in these public forums are not endorsed by
Drexel University or The Math Forum.


ganesh
Posts: 37
Registered: 2/15/06

Integration of O() terms of the Taylor series
Posted: Feb 2, 2012 1:42 PM


Hello,
I have two functions, say f1(ε) and f2(ε), as follows:
f1(ε) = 1/(aε^2) + 1/(bε) + O(1) ... (1)
and
f2(ε) = c + dε + O(ε^2) ... (2)
where ε = λh and a, b, c, d, and λ are constants. Eqs. (1) and (2) are the series expansions of f1(ε) and f2(ε) about ε = 0, respectively. I need to integrate f1(ε) and f2(ε) with respect to h over (1, 1). The integration is straightforward for every term except O(1) in (1) and O(ε^2) in (2). How do I proceed to integrate the O() terms? If anyone can guide me on this, it will be extremely helpful. Many thanks for the help. Regards, N.
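To illustrate the general behaviour of a remainder under integration (a sketch only, not worked for the specific f1 and f2 above): if |R(t)| ≤ C t^2 near 0, then |∫_0^x R(t) dt| ≤ C x^3/3, so ∫_0^x O(t^2) dt = O(x^3). The snippet below mimics this in SymPy, using exp(x) purely as a stand-in example function: integrate the polynomial part term by term, then re-attach the appropriately raised O() term by hand.

```python
from sympy import symbols, integrate, O, exp

x = symbols('x', positive=True)

# A truncated Taylor expansion with an O(x**2) remainder,
# analogous in form to eq. (2); exp(x) is just a stand-in.
f2 = exp(x).series(x, 0, 2)        # 1 + x + O(x**2)

# Integrate the polynomial part term by term...
poly_part = integrate(f2.removeO(), x)   # x + x**2/2

# ...and handle the remainder separately: integrating O(x**2)
# from 0 to x yields O(x**3), since |int_0^x C*t**2 dt| <= C*x**3/3.
result = poly_part + O(x**3)
```

The key point the sketch encodes: a definite integral from 0 to x raises the order of the remainder by one, so the integrated series keeps a well-defined error term.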



