


Re: Basic GLM Question
Posted:
Apr 9, 2013 9:03 AM


SChapman <sumanthcp@googlemail.com> writes:
> Why is that in Generalised Linear Model we don't predict individual > response variable values Yi? We only predict the conditional > expectation E[Yi]. > > In a general linear model (simple linear regression) however all text > books give formula for predicting Yi as well as E[Yi].
I'm jumping into this late, but hopefully not too late. Putting on my "stupid practitioner" hat, this is just because the general linear model has an easy linear relationship with its errors:
Yi = b0 + b1 X1 + ... + eps
where eps is just a draw from the normal distribution, eps ~ N(0, s^2).
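To make that concrete, here is a minimal sketch (the coefficient names b0, b1 and noise level s are illustrative, not from the original post): because the errors are additive normal, a prediction of an individual Yi is just the fitted mean E[Yi] plus a normal draw.

```python
import numpy as np

# Simulate the general linear model Yi = b0 + b1*Xi + eps, eps ~ N(0, s^2).
rng = np.random.default_rng(0)
b0, b1, s = 1.0, 2.0, 0.5          # illustrative "true" values
x = rng.uniform(0.0, 1.0, 200)
y = b0 + b1 * x + rng.normal(0.0, s, 200)

# Ordinary least-squares fit of (b0, b1).
X = np.column_stack([np.ones_like(x), x])
bhat, *_ = np.linalg.lstsq(X, y, rcond=None)

x_new = 0.5
mean_pred = bhat[0] + bhat[1] * x_new    # estimate of E[Yi] at x_new
draw_pred = mean_pred + rng.normal(0.0, s)  # one simulated individual Yi
```

The textbook prediction interval for Yi comes from exactly this structure: mean plus an independent N(0, s^2) term.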
For the generalized linear model, this is not true. You can say that the definition of the link function is that
E[Yi] = g^{-1}(X b)
but because of the nonlinearities in the link function, this is *not* the same as saying
Yi = g^{-1}(X b) + eps.
Well, in some sense it is, of course, but if you write it this way, eps is a draw from some oddball nonnormal distribution.
For example, if you look at logistic regression, it is not true that
Yi = invlogit(X b) + eps, eps ~ N(0, s^2) [not true]
This would allow for negative Yi, as written.
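A short simulation makes the point (the coefficients here are illustrative): in logistic regression Yi is a Bernoulli draw with mean invlogit(X b), so the "residual" Yi - E[Yi] can only take the two values -p and 1 - p for each observation, nothing like a N(0, s^2) draw.

```python
import numpy as np

def invlogit(z):
    """Inverse logit (logistic) function: maps the real line to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
b0, b1 = -0.5, 2.0                 # illustrative coefficients
x = rng.uniform(-2.0, 2.0, 1000)
p = invlogit(b0 + b1 * x)          # E[Yi | Xi] -- what the GLM predicts
y = rng.binomial(1, p)             # the actual Yi: 0/1 draws, not p + normal noise

eps = y - p                        # each "error" is either -p_i or 1 - p_i
```

Every Yi is exactly 0 or 1, so writing Yi = invlogit(X b) + eps with normal eps would have to produce negative or fractional Yi, which never happens.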
Does that answer your question?
Regards, Johann



