Views expressed in these public forums are not endorsed by
Drexel University or The Math Forum.



delta method for inverse implicit function
Posted: Aug 24, 2012 9:48 AM


I want to apply the delta method to find the variance of the roots of an implicit function, but I am confused about it. Let f be such a function:
f(x,a) = y,  E[f(x,a)] = E[y] = 0,  VAR[f(x,a)] = h(a),
where x is a random variable, a is a parameter, and f is an implicit function of a. I want to find the variance of the roots of f(x,a) with respect to a (the variance of the values of a where f(x,a) = 0).
If f were not implicit, I guess the variance approximation could be found as below:
f(x,a) = y,  f^(-1)(x,y) = g(x,y) = a,  VAR(f^(-1)(x,y)) ≈ [(dg/dy)^2] * h(a),
and then substitute 0, the expected value of y, for y. Am I right so far?
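To check my understanding of the explicit case, here is a small numerical sketch I tried (a made-up example, not from any text): I take a hypothetical explicit inverse a = g(y) = exp(y) with y normal with mean 0 and variance sigma^2, so h(a) = sigma^2, and compare the delta-method value (dg/dy)^2 evaluated at y = 0 times h(a) against a Monte Carlo estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.05                      # assumed small spread of y around E[y] = 0
y = rng.normal(0.0, sigma, 1_000_000)

# hypothetical explicit inverse a = g(x, y); here it depends on y only
g = np.exp
dg_dy = np.exp                    # dg/dy, to be evaluated at y = E[y] = 0

var_delta = dg_dy(0.0) ** 2 * sigma ** 2   # delta method: (dg/dy)^2 * h(a)
var_mc = g(y).var()                        # Monte Carlo variance of a = g(y)
print(var_delta, var_mc)
```

For small sigma the two agree closely, which seems to confirm the explicit-case formula.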
However, if f is implicit, what should I do? Assume df/da is not implicit. Can I use the inverse function theorem? How can I apply it in the delta method? I am very confused. If you can help me, I will be very glad. Thanks a lot.
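Here is how I currently imagine the implicit case might work, on a made-up example (this is my guess, not a worked solution): if a(x) is the root of f(x,a) = 0, the implicit function theorem gives da/dx = -(df/dx)/(df/da) at the root, and the delta method then gives VAR(a) ≈ (da/dx)^2 * VAR(x). The following sketch checks that against Monte Carlo, using Newton's method to solve for the root since f has no closed-form inverse in a:

```python
import numpy as np

rng = np.random.default_rng(1)
x0, sx = 2.0, 0.02                     # assumed mean and small sd of x
x = rng.normal(x0, sx, 200_000)

# made-up implicit example: f(x, a) = a + exp(a) - x has no closed-form root in a
f = lambda a, x: a + np.exp(a) - x
fa = lambda a: 1.0 + np.exp(a)         # df/da
fx = -1.0                              # df/dx (constant for this f)

# root a0 of f(x0, a) = 0 by Newton's method
a0 = 0.0
for _ in range(50):
    a0 -= f(a0, x0) / fa(a0)

# implicit function theorem: da/dx = -(df/dx)/(df/da) at the root
da_dx = -fx / fa(a0)
var_delta = da_dx ** 2 * sx ** 2       # delta method for the root's variance

# Monte Carlo check: solve the root for each sampled x (vectorized Newton)
a = np.zeros_like(x)
for _ in range(50):
    a -= f(a, x) / fa(a)
var_mc = a.var()
print(var_delta, var_mc)
```

If I instead write the randomness through y = f(x,a), the same step would read da/dy = 1/(df/da), giving VAR(a) ≈ h(a)/(df/da)^2 — but I am not sure this is the right way to set it up, which is what I am asking about.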
Best wishes.



