On Sep 29, 3:56 pm, "Luis A. Afonso" <lic...@hotmail.com> wrote:
> When a problem is ill posed it is difficult to get what is really being asked for.
> Suppose I generate data from a NORMAL MODEL N(mu, var). Considering any finite set of N values, we know that, generally speaking, the sample mean, m, does not coincide with mu.
> _______ mu = m + bias
> This bias tends to ZERO as N grows to infinity.
> The same holds for the variance and its unbiased estimator, the sample variance, svar, which is the sum of the squared deviations divided by N-1.
> ________ svar = var + bias'
I don't know which question you were answering here, but I will disagree with you again. What you are describing is not a bias; it is a random error. If N values are randomly sampled from the normal model you specified, the calculated sample mean contains random error, not bias, because the error is tied to the particular sample you happened to draw, not to a systematic mistake in your experiment. Bias would mean the estimator is wrong on average over repeated samples; the sample mean (and the sample variance with the N-1 divisor) deviates from the true parameter in any one sample, but averages out correctly across samples.
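To make the distinction concrete, here is a quick simulation sketch of my own (not from either post; the parameters mu=5, sigma=2, n=30 are arbitrary choices). It draws many independent samples from the same normal model: any single sample mean misses mu by a random amount, but the average deviation across samples is essentially zero, which is exactly what "unbiased" means. The N-1 sample variance behaves the same way.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 30, 20000  # arbitrary illustrative parameters

# Draw `reps` independent samples of size n and compute each sample mean.
sample_means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

# Each individual deviation (m - mu) is nonzero and random...
deviations = sample_means - mu
# ...but their average across repeated samples is essentially zero:
# the estimator has random error, not bias.
avg_deviation = deviations.mean()
print(f"average deviation of sample mean: {avg_deviation:.4f}")

# The unbiased sample variance (divisor n-1, ddof=1) shows the same thing:
# its average over repeated samples is close to the true variance.
sample_vars = rng.normal(mu, sigma, size=(reps, n)).var(axis=1, ddof=1)
print(f"average sample variance: {sample_vars.mean():.4f} (true var = {sigma**2})")
```

If there were a systematic mistake in the experiment (say, a miscalibrated instrument adding a constant offset), the average deviation would NOT shrink toward zero no matter how many samples you took; that is what bias looks like.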