I won't pretend to understand the whole problem, but I think I can add a warning about one part, which may or may not be relevant.
On Tue, 25 Jan 2011 07:45:23 -0800 (PST), Mike Lacy <email@example.com> wrote:
> In the context of my question about ways to express P[A | (B or C)]:
>
> On Jan 21, 9:18 am, Henry <s...@btinternet.com> wrote:
> > On Jan 21, 3:44 pm, "danhey...@yahoo.com" <danhey...@yahoo.com> wrote:
> > > I am now confused about what your problem is. I'm afraid I can't help.
>
> Sorry to be slow getting back to the conversation, and I don't blame
> you for your confusion; I now find myself confused by what I said :-}.
>
> Thanks to all for the responses. Let me try a somewhat different
> presentation of the question and see if it makes any better sense:
>
> If one wants to do a simulation of the p-value for a test that a
> bivariate regression slope is 0 for the regression of y on x, the
> conventional permutation approach would be to shuffle x w.r.t. y some
> large number of repetitions, counting the number of times that the
> estimated slope >= 0. So, we are empirically estimating P(A | B),
> where A is "observed slope >= 0" and B is "population slope is 0."
> B here is not a random variable, but rather is imposed as a condition
> of the simulation.
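[As an aside: the permutation scheme described in that quoted paragraph can be sketched in a few lines. This is only an illustration of the general idea, not anyone's actual code from the thread; the data, sample size, and repetition count are all made up.]

```python
import numpy as np

rng = np.random.default_rng(0)

def slope(x, y):
    # OLS slope of y on x: cov(x, y) / var(x)
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# Made-up data for illustration only.
n = 50
x = rng.normal(size=n)
y = 0.3 * x + rng.normal(size=n)

obs = slope(x, y)

# Shuffle x w.r.t. y many times.  Shuffling destroys any x-y
# association, which is how the condition B ("population slope is 0")
# is *imposed* rather than sampled -- exactly the point made above.
reps = 5000
perm = np.array([slope(rng.permutation(x), y) for _ in range(reps)])

# One-sided p-value: fraction of shuffled slopes at least as large
# as the observed slope.
p = np.mean(perm >= obs)
```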
This part -
> Now, I want to extend a simulation like this to
> the product of two regression coefficients, where there is now more
> than one way in which the null hypothesis might be true.
Is "the product" an intentional way to combine the two coefficients and hypotheses, or is it arbitrary? It is not the same test as, say, the sum of the absolute values. Nor is it the same as the ordinary ways of combining two hypotheses (e.g., combining their p-values), which would ignore the actual values of the coefficients. And there are various ways to consider two hypotheses simultaneously, depending on how much you want to reward extreme versus non-extreme outcomes.
Also - if you use the actual values of the ordinary (unstandardized) regression coefficients, you either assume that the predictors' variances are equal, or you agree to weight the two coefficients unequally.
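A quick numeric illustration of that weighting point (made-up data, not from the thread): rescaling one predictor - say, from meters to centimeters - divides its raw coefficient by 100, and hence divides the product-of-coefficients statistic by 100, while a statistic built from standardized coefficients is unaffected.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up data: y depends on two predictors.
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)

def coefs(X, y):
    # OLS coefficients via least squares (illustration only).
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b

X = np.column_stack([x1, x2])
b1, b2 = coefs(X, y)

# Rescale x2 (e.g. meters -> centimeters): its coefficient
# shrinks by a factor of 100, and so does the product.
Xs = np.column_stack([x1, 100 * x2])
c1, c2 = coefs(Xs, y)

raw_product = b1 * b2            # depends on the units of x2
raw_product_rescaled = c1 * c2   # equals raw_product / 100

# Standardized slopes are unit-free, so their product is unchanged
# by the rescaling -- this is the "equal variances" weighting.
def std_coefs(X, y):
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    yz = (y - y.mean()) / y.std(ddof=1)
    b, *_ = np.linalg.lstsq(Z, yz, rcond=None)
    return b

s = std_coefs(X, y)
ss = std_coefs(Xs, y)
```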
-- Rich Ulrich [rest of post]
> So, if event A = "product of the two sample regression slopes = 0,"
> and B = "the population value of the first of these slopes is 0," and
> C = "the population value of the second of these slopes is 0," I would
> summarize the question of interest as P[A | (B or C)]. It is possible
> to impose B = 0 and conduct a simulation, or C = 0, or B = C = 0. But
> these are all, to my view, imposed conditions of the simulation, and
> not random variables. Hence, I had the idea to simulate the three
> probabilities of interest P(A | B), P(A | C), and P(A | (B and C)),
> and try to combine them afterward, using some algebraic manipulation.
> I arrived, as did other folks, at expressions involving P(B) and P(C),
> but since B and C are imposed conditions rather than random variables,
> I can't see this making sense.
>
> I hope this is clearer; my apologies if it isn't. And don't feel
> compelled to respond; I thought I would at least try to give
> responders a sense of what I had in mind.
>
> Regards,
> Mike Lacy