The Math Forum


Math Forum » Discussions » sci.math.* » sci.stat.math


Topic: Basic GLM Question
Replies: 1   Last Post: Apr 9, 2013 9:03 AM


Johann Hibschman

Posts: 5
Registered: 4/16/07
Re: Basic GLM Question
Posted: Apr 9, 2013 9:03 AM

SChapman <> writes:

> Why is it that in a Generalised Linear Model we don't predict
> individual response variable values Yi, only the conditional
> expectation E[Yi]? In a general linear model (simple linear
> regression), however, all textbooks give a formula for predicting
> Yi as well as E[Yi].

I'm jumping into this late, but hopefully not too late. Putting on my
"stupid practitioner" hat: this is just because the general linear model
has a simple additive relationship with its errors:

Yi = b0 + b1 X1 + ... + eps

where eps is just a draw from the normal distribution, eps ~ N(0, s^2).
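A minimal numeric sketch of this point (coefficients, sample size, and
noise level are made up for illustration): simulate the additive model,
and the residuals Yi - E[Yi] are literally the normal draws, so
predicting an individual Yi just means adding N(0, s^2) noise to the
fitted mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true parameters, chosen only for this sketch
b0, b1, sigma = 1.0, 2.0, 0.5

x = rng.uniform(0.0, 1.0, size=1000)

# Individual responses: the conditional mean plus a normal draw
eps = rng.normal(0.0, sigma, size=x.size)
y = b0 + b1 * x + eps

# E[Yi | Xi] is just the linear predictor
mean_y = b0 + b1 * x

# The residuals Yi - E[Yi] are exactly the eps draws, so their
# spread recovers sigma
resid = y - mean_y
print(resid.std())
```

The printed standard deviation lands near the sigma = 0.5 used to
generate the data, which is all "predicting Yi as well as E[Yi]" amounts
to here.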

For the generalized linear model, this is not true. You can say that
the definition of the link function is that

E[Yi] = g^{-1}(X b)

but because of the nonlinearities in the link function, this is *not*
the same as saying

Yi = g^{-1}(X b) + eps.

Well, in some sense it is, of course, but if you write it this way,
eps is a draw from some oddball non-normal distribution.

For example, if you look at logistic regression, it is not true that

Yi = invlogit(X b) + eps, eps ~ N(0, s^2) [not true]

This would allow for negative Yi, as written.
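To make the logistic case concrete, here is a small simulation (the
coefficients and sample size are invented for illustration, and
`invlogit` is defined locally): E[Yi] is a probability strictly between
0 and 1, but each individual Yi is exactly 0 or 1, so the "error"
Yi - E[Yi] can only take two values for a given Xi and cannot be a
normal draw.

```python
import numpy as np

def invlogit(z):
    """Inverse logit (logistic) function: maps the real line to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Hypothetical coefficients for this sketch
b0, b1 = -1.0, 2.0

x = rng.uniform(-2.0, 2.0, size=5000)

# The conditional mean E[Yi | Xi]: a probability in (0, 1)
p = invlogit(b0 + b1 * x)

# The individual responses: Bernoulli draws, always exactly 0 or 1
y = rng.binomial(1, p)

# For a given Xi, the "residual" Yi - E[Yi] is either -p or 1 - p:
# a two-point distribution, nothing like N(0, s^2)
resid = y - p
print(y.min(), y.max())
```

No amount of normal noise added to invlogit(X b) can reproduce this:
the additive-normal version would produce negative and fractional Yi,
which a 0/1 outcome never is.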

Does that answer your question?




© The Math Forum at NCTM 1994-2018. All Rights Reserved.