Math Forum » Discussions » sci.math.* » sci.math

Topic: distribution of regression coefficients
Replies: 9   Last Post: Nov 12, 2010 11:58 AM

Paul

Posts: 162
Registered: 12/7/09
Re: distribution of regression coefficients
Posted: Nov 11, 2010 10:29 AM

On Nov 11, 5:45 am, "Rod" <rodrodrod...@hotmail.com> wrote:
> "Ray Koopman" <koop...@sfu.ca> wrote in message
> news:cc3b56b7-ce12-4f8c-8a9a-c5f2592f6e8b@n24g2000prj.googlegroups.com...

> > On Nov 10, 4:07 am, "Rod" <rodrodrod...@hotmail.com> wrote:
> >> On Nov 10, 2:20 am, Ray Koopman <koop...@sfu.ca> wrote:
> >>> On Nov 10, 12:44 am, "Rod" <rodrodrod...@hotmail.com> wrote:
>
> >>>> In regression y = a + b*x
>
> >>>> I know how to compute the covariance matrix for a and b.
> >>>> I also know that a and b are normally distributed,
> >>>> but what is the joint distribution of a and b?
> >>>> It's tempting to guess bivariate normal
> >>>> but I don't see how to show that.

>
> >>> y = X beta + e
>
> >>> W = (X'X)^-1 X'
>
> >>> b = Wy
> >>>   = beta + We

>
> >>> If e is multivariate normal then so is We, and hence b.
>
> >> ditto a, but what of the joint distribution given that a and b are
> >> correlated?

>
> > Sorry, I should have been more explicit (and used only low-ascii
> > characters). In what I wrote,  X  is a given  n  by  p  matrix
> > of predictors, where  n  is the # of cases and  p  is the # of
> > predictors,  beta  is a p-vector of unknown coefficients, and  e  is
> > a random n-vector. If one of the columns of  X  is a dummy predictor
> > whose value is  1  for every case then the corresponding element in
> > beta  is the intercept (your  a ). So the intercept is "just another
> > coefficient".

>
> > Whatever the distribution of  e  may be, if its mean vector and
> > covariance matrix are  m  and  S  then the mean vector and covariance
> > matrix of  b  are  beta + Wm  and WSW'. (Note: we usually assume
> > m = [0,...,0]'.)

>
> > If  e  is multivariate normal then  b  is also multivariate normal.
>
> I rather hastily assumed my b was the same as yours, sorry.
> OK I get it that if the e are normal then b is just a linear combination and
> hence also normal.
> Either I am not understanding what you are saying (likely), or you haven't
> yet answered my question fully.


It's the former.

> To keep it simple, let's keep the e normal and independent of each other.
> Also let me return to my y=a+bx notation.
> I am after the joint density P(a,b), which, because a and b are
> correlated, is different from the product of the two marginal
> distributions of a and b separately.
> I would put money on P being bivariate normal, but for the life of me
> I can't see how to work that out.


As Ray said, b = \beta + We, where W is computed from the X matrix.
The theoretical variance-covariance matrix of b is
E[(b - \beta)(b - \beta)'] = E[We e'W'] = W E[ee'] W'.
Treat X, and therefore W, as constant with respect to the expectation.
Since the e are assumed i.i.d. with zero mean and variance sigma^2,
E[ee'] should be obvious (left to the reader as an exercise).
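The derivation above can be checked numerically. The following is a minimal NumPy sketch, not from the thread; the design (x on [0, 10], n = 30, sigma = 2) and the simulation count are my own choices. It forms W = (X'X)^{-1} X', computes the theoretical covariance sigma^2 (X'X)^{-1}, and compares it against the empirical covariance of the simulated (a_hat, b_hat) pairs.

```python
import numpy as np

# Model: y = a + b*x + e, with e ~ N(0, sigma^2) i.i.d.
# The design matrix X has a column of ones (the intercept) and a column
# of x, so the intercept a is "just another coefficient", as Ray notes.
rng = np.random.default_rng(0)
n, sigma = 30, 2.0
a_true, b_true = 1.0, 0.5
x = np.linspace(0.0, 10.0, n)
X = np.column_stack([np.ones(n), x])

# W = (X'X)^{-1} X', so the OLS estimate is b_hat = W y = beta + W e.
W = np.linalg.inv(X.T @ X) @ X.T

# Theoretical covariance: W (sigma^2 I) W' = sigma^2 (X'X)^{-1}.
cov_theory = sigma**2 * np.linalg.inv(X.T @ X)

# Monte Carlo check: the empirical covariance of (a_hat, b_hat)
# should approach the theoretical one as the number of draws grows.
n_sims = 20000
est = np.empty((n_sims, 2))
for i in range(n_sims):
    y = a_true + b_true * x + rng.normal(0.0, sigma, n)
    est[i] = W @ y
cov_empirical = np.cov(est, rowvar=False)

print(cov_theory)
print(cov_empirical)   # close to cov_theory
```

With 20000 replications the empirical matrix agrees with sigma^2 (X'X)^{-1} to roughly 1% per entry, including the (negative) off-diagonal covariance between the intercept and slope estimates.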

/Paul
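Rod's hunch that the pair is bivariate normal can also be probed numerically via the Cramér-Wold idea: a random vector is multivariate normal iff every fixed linear combination of its components is univariate normal. A sketch under the same assumptions as above (the direction t and all simulation parameters are my own choices, not from the thread):

```python
import numpy as np

# Simulate the OLS pair (a_hat, b_hat) for y = a + b*x + e, e ~ N(0, sigma^2),
# then check that an arbitrary linear combination of the pair looks normal
# (Cramer-Wold: joint normality <=> every linear combination is normal).
rng = np.random.default_rng(1)
n, sigma = 30, 2.0
x = np.linspace(0.0, 10.0, n)
X = np.column_stack([np.ones(n), x])
W = np.linalg.inv(X.T @ X) @ X.T        # (X'X)^{-1} X'

n_sims = 20000
est = np.array([W @ (1.0 + 0.5 * x + rng.normal(0.0, sigma, n))
                for _ in range(n_sims)])

t = np.array([0.7, -1.3])               # an arbitrary direction
z = est @ t
z = (z - z.mean()) / z.std()            # standardize the combination

# For a normal sample, skewness ~ 0 and excess kurtosis ~ 0.
skew = np.mean(z**3)
ex_kurt = np.mean(z**4) - 3.0
print(skew, ex_kurt)
```

This is only a consistency check, not a proof; the proof is Ray's one-liner that b = beta + We is a linear transform of the multivariate normal e, and linear transforms of multivariate normals are multivariate normal.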





© Drexel University 1994-2014. All Rights Reserved.
The Math Forum is a research and educational enterprise of the Drexel University School of Education.