I have two functions, say f1(η) and f2(η), as follows:
f1(η) = 1/(a η^2) + 1/(b η) + O(1) ... (1)
f2(η) = c + d η + O(η^2) ... (2)
where η = x − x0, and a, b, c, d, and x0 are constants. Eqs. (1) and (2) are the Taylor series expansions of f1(η) and f2(η) about x = x0, respectively. I need to integrate f1(η) and f2(η) with respect to x over (−1, 1). The integration is straightforward for all the terms except the O(1) term in (1) and the O(η^2) term in (2). How do I proceed to integrate the O() terms? If anyone can guide me on this, it will be extremely helpful. Many thanks for the help. Regards, N.
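To show what I mean by the straightforward part, here is a minimal SymPy sketch that integrates only the explicit terms of (1) and (2), with the O() remainders dropped. The value x0 = 2 is just a placeholder I picked so that the singularity at x = x0 lies outside (−1, 1):

```python
import sympy as sp

# a, b, c, d are the constants from Eqs. (1) and (2)
x, a, b, c, d = sp.symbols('x a b c d', positive=True)
x0 = 2          # placeholder expansion point, chosen outside (-1, 1)
eta = x - x0    # eta = x - x0

# Explicit terms only; the O(1) and O(eta^2) remainders are omitted here
f1 = 1/(a*eta**2) + 1/(b*eta)
f2 = c + d*eta

I1 = sp.integrate(f1, (x, -1, 1))   # equals 2/(3a) - ln(3)/b for x0 = 2
I2 = sp.integrate(f2, (x, -1, 1))   # equals 2c - 4d for x0 = 2
```

The open question remains what to do with the remainder terms, since O(1) and O(η^2) are not concrete functions that can be handed to an integrator.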