On Nov 21, 8:13 pm, djh <halitsk...@att.net> wrote:
> In a different thread, Ray Koopman explained that if one
> suspects these regressions to be dependent on the IV "u":
>
> c on u
> c on e
> c on (e,u)
>
> then under the usual initial assumption that the dependence
> is linear, these three regressions should be modified to:
>
> c on (u, u^2) instead of c on u
> c on (e, u, u*e) instead of c on e
> c on (e, u, u*e, u^2) instead of c on (e,u)
In this post I want to talk about only your second case, where the d.v. is a bilinear function of two i.v.s:
y = a0 + a1*x1 + a2*x2 + a3*x1*x2.
(As before, I use the usual generic variable names so as not to get caught up in any peculiarities of your particular variables.)
The first thing to notice is that the model is symmetric in x1 and x2. You can think of this as a linear regression of y on x1:
y = A0 + A1*x1, where A0 and A1 are linear functions of x2.
A0 = a00 + a02*x2, A1 = a10 + a12*x2, giving
y = (a00 + a02*x2) + (a10 + a12*x2)*x1
= a00 + a10*x1 + a02*x2 + a12*x1*x2
= a0 + a1 *x1 + a2 *x2 + a3 *x1*x2.
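The algebra above can be checked numerically. A minimal Python sketch (not from the original post; the coefficient values a00, a02, a10, a12 are made up for illustration) showing that the nested linear-in-x1 form and the flat bilinear form give identical predictions:

```python
# Hypothetical coefficient values, chosen only to illustrate the identity.
a00, a02, a10, a12 = 1.0, 0.5, 2.0, -0.3

def y_nested(x1, x2):
    A0 = a00 + a02 * x2   # intercept A0 is a linear function of x2
    A1 = a10 + a12 * x2   # slope A1 is a linear function of x2
    return A0 + A1 * x1

def y_bilinear(x1, x2):
    # Expanded form: a0 = a00, a1 = a10, a2 = a02, a3 = a12
    return a00 + a10 * x1 + a02 * x2 + a12 * x1 * x2

# The two parameterizations agree at every (x1, x2).
for x1, x2 in [(0.0, 0.0), (1.0, 2.0), (-1.5, 3.0)]:
    assert abs(y_nested(x1, x2) - y_bilinear(x1, x2)) < 1e-12
```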
Or you can interchange x1 and x2, thinking of the linear regression of y on x2, with the coefficients being linear functions of x1. It is a conceptual distinction with no mathematical difference.
There are two average slopes. Call them Av1 and Av2:
dy/dx1 = a1 + a3*x2 ==> Av1 = a1 + a3*mean_x2
dy/dx2 = a2 + a3*x1 ==> Av2 = a2 + a3*mean_x1
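In Python (a sketch I'm adding, not from the original post; the coefficient and data values are invented for illustration), the two average slopes are just the slope expressions evaluated at the mean of the other variable:

```python
import numpy as np

# Hypothetical fitted coefficients and data, for illustration only.
a1, a2, a3 = 2.0, -1.0, 0.5
x1 = np.array([0.0, 1.0, 2.0, 3.0])
x2 = np.array([1.0, 1.0, 2.0, 4.0])

Av1 = a1 + a3 * x2.mean()   # average slope dy/dx1 = a1 + a3*mean_x2
Av2 = a2 + a3 * x1.mean()   # average slope dy/dx2 = a2 + a3*mean_x1
```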
To estimate the coefficients, give your regression program three predictors: x1, x2, and x3 = x1*x2. Then a0 is the fitted intercept, and a1, a2, a3 are the fitted coefficients of x1, x2, and x3, respectively.
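As a concrete sketch of that fitting step (my own illustration, using NumPy's ordinary least squares rather than any particular regression package; the true coefficients and noise level are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# Simulated data from the bilinear model with made-up true coefficients.
y = 1.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x1 * x2 + 0.1 * rng.normal(size=n)

# Design matrix: intercept, x1, x2, and the product predictor x3 = x1*x2.
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a0, a1, a2, a3 = coef   # estimates of the four bilinear coefficients
```

With this much data and little noise, the estimates land close to the true values (1.0, 2.0, -1.0, 0.5).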