
Topic: delta method for inverse implicit function
Replies: 0  

oercim@yahoo.com

Posts: 237
Registered: 5/2/05
delta method for inverse implicit function
Posted: Aug 24, 2012 9:48 AM

I want to apply the delta method to find the variance of the roots of an implicit function, but I am very confused about it. Let f be such a function:

f(x, a) = y
E[f(x, a)] = E[y] = 0
VAR[f(x, a)] = h(a)

where x is a random variable, a is a parameter, and f is an implicit function of a. I want to find the variance of the roots of f(x, a) = 0 with respect to a, that is, the variance of the values of a at which f(x, a) = 0.

If f were not implicit, I guess the variance approximation could be found as below:

f(x, a) = y
f^(-1)(x, y) = g(x, y) = a
VAR(f^(-1)(x, y)) = [(dg/dy)^2] * h(a)

and then substitute y = 0, the expected value of y, for y. Am I right so far?
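
For concreteness, here is a small numerical check I wrote for the explicit case. The particular f, the normal distribution for x, and all the constants are just made-up examples for illustration, not part of my actual problem:

import numpy as np

# Toy explicit example: f(x, a) = exp(a) - x, with x ~ Normal(mu, sigma^2)
# and mu = exp(a0), so that y = f(x, a0) has E[y] = 0 and VAR[y] = sigma^2 = h(a0).
# Here f(x, a) = 0 can be solved explicitly: a = g(x) = log(x).

rng = np.random.default_rng(0)
a0, sigma = 1.0, 0.05
mu = np.exp(a0)

# Delta method: inverting exp(a) - x = y gives a = log(x + y),
# so dg/dy = 1/(x + y) = exp(-a0) at y = 0, x = mu, and
# VAR(a) ~ (dg/dy)^2 * h(a0) = exp(-2*a0) * sigma^2.
var_delta = np.exp(-2 * a0) * sigma**2

# Monte Carlo check: draw x, solve f(x, a) = 0 for a, take the variance.
x = rng.normal(mu, sigma, size=200_000)
a_roots = np.log(x)
var_mc = a_roots.var()

print(var_delta, var_mc)  # the two agree closely for small sigma

So in this toy case the delta-method approximation seems to match the simulation.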

However, if f is implicit, what should I do? Assume df/da is not implicit (it is available in closed form). Can I use the inverse function theorem? How can I apply it together with the delta method? I am very confused. If you can help me, I will be very glad. Thanks a lot.
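
My guess is that the implicit/inverse function theorem gives da/dy = 1/(df/da), so that VAR(a) ~ h(a)/(df/da)^2 at the root, but I am not sure this is correct. Here is the numerical check I tried, again with an f and constants I made up purely for illustration:

import numpy as np

# Toy implicit example: f(x, a) = a + sin(a) - x cannot be solved for a in
# closed form, but df/da = 1 + cos(a) is explicit, as in my assumption above.
# With x ~ Normal(mu, sigma^2) and mu = a0 + sin(a0), y = f(x, a0) has
# E[y] = 0 and h(a0) = VAR[y] = sigma^2.

rng = np.random.default_rng(1)
a0, sigma = 0.7, 0.02
mu = a0 + np.sin(a0)

# Guessed formula: da/dy = 1/(df/da), so VAR(a) ~ h(a0) / (df/da)^2 at a = a0.
dfda = 1 + np.cos(a0)
var_delta = sigma**2 / dfda**2

# Monte Carlo check: solve f(x, a) = 0 for each draw of x by Newton's method.
x = rng.normal(mu, sigma, size=100_000)
a = np.full_like(x, a0)             # start each iteration from the true root
for _ in range(50):                 # plain Newton steps, vectorized over x
    a -= (a + np.sin(a) - x) / (1 + np.cos(a))
var_mc = a.var()

print(var_delta, var_mc)  # these agree in my runs, but is the reasoning right?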

Best wishes.
