I agree with you and with R; in my second message I tried to explain that R shows me the right results.
My difficulty is understanding how multicollinearity affects the regression analysis, and how it relates both to a computational problem (an ill-conditioned or singular X'X matrix) and to a statistical problem (a small change in the predictor data can lead to very different estimates in the model). I know the definitions of eigenvalue, singular matrix and condition number, but I'm trying to understand their implications on the statistical side. Again, thanks for the answer!
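To make my question concrete, here is a small sketch (reconstructing the data from this thread) showing both sides at once; the 1e-6 / 1e-4 perturbations are just illustrative values I picked:

```r
# Reconstruct the data from the thread: V3 = 1.5 * V2, so the design
# matrix columns are exactly collinear and X'X is singular.
x.df <- data.frame(V1 = c(1, 1, 1, 1),
                   V2 = c(2, 4, 6, 8),
                   V3 = c(3, 6, 9, 12),
                   y  = c(3, 6, 9, 12))

X   <- cbind(1, x.df$V2, x.df$V3)   # design matrix with intercept
XtX <- crossprod(X)                 # X'X

## Computational side: X'X has a (numerically) zero eigenvalue, so the
## condition number is effectively infinite and inversion must fail.
eigen(XtX)$values                   # smallest eigenvalue is ~0
kappa(XtX, exact = TRUE)            # enormous condition number
## solve(XtX)                       # error: computationally singular

## Statistical side: break the exact collinearity with a tiny change in
## one V3 value, plus a tiny change in one y value.  The system becomes
## solvable, but the coefficient estimates explode:
x.df2 <- x.df
x.df2$V3[4] <- x.df2$V3[4] + 1e-6
x.df2$y[4]  <- x.df2$y[4]  + 1e-4
coef(lm(y ~ V2 + V3, data = x.df2))
## a 1e-4 change in one y value pushes V3's coefficient to roughly 100
## (and V2's to roughly -148.5)
```

Is this a fair picture of the connection between the near-zero eigenvalue and the instability of the estimates?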
On 7 Nov, 02:26, David Winsemius <doe_s...@comcast.n0T> wrote:
> Dear DP;
>
> You _should_ get an error when you try to invert a singular matrix. R is
> behaving correctly.
>
> When I run your data through the function lm() the output contains
> the line:
> "Coefficients: (1 not defined because of singularities)".
> Again R is giving appropriate warnings.
>
> > str(x.df)
> 'data.frame':   4 obs. of  4 variables:
>  $ V1: num  1 1 1 1
>  $ V2: num  2 4 6 8
>  $ V3: num  3 6 9 12
>  $ y : num  3 6 9 12
>
> > xdf.mdl <- lm(y ~ V2 + V3, data = x.df)
> > summary(xdf.mdl)
>
> Call:
> lm(formula = y ~ V2 + V3, data = x.df)
>
> Residuals:
>          1          2          3          4
>  3.680e-16 -6.134e-16  1.227e-16  1.227e-16
>
> Coefficients: (1 not defined because of singularities)
>              Estimate Std. Error   t value Pr(>|t|)
> (Intercept) 1.110e-15  6.374e-16 1.742e+00    0.224
> V2          1.500e+00  1.164e-16 1.289e+16   <2e-16 ***
> V3                 NA         NA        NA       NA
>
> I do not understand what difficulties you are having because you are not
> producing the output that you feel is incorrect. It appears you may need
> to further study the meanings of "singular", "eigenvalue", and
> "condition number".
>
> --
> David Winsemius