So, I've tried comparing the Hessian returned by fminunc against two alternatives. First, I multiplied the Jacobian of the residuals by its transpose (in a chi^2 minimization problem I expect J'*J to approximate the Hessian near the minimum -- the Gauss-Newton approximation). Taking the element-wise ratio of this Hessian to the fminunc one gives a matrix of ones, agreeing to 0.1% in the worst case and usually much better. And that's very fine. (I've tried this with the same fitting function but on independent datasets.)
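To make the comparison concrete, here is a minimal sketch (not my actual fitting code) of why J'*J agrees with the full Hessian near the minimum. The model y = a*exp(b*x), the noise level, and all variable names are hypothetical; the exact Hessian of chi^2 = sum r_i^2 is 2*(J'*J + sum_i r_i * Hess(r_i)), and the second term is small when the residuals are small:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical toy chi^2 fit: y = a*exp(b*x) + small noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(-1.5 * x) + 0.01 * rng.standard_normal(x.size)

def resid(p):            # residual vector r_i = model_i - y_i
    a, b = p
    return a * np.exp(b * x) - y

def jac(p):              # analytic Jacobian J[i, j] = d r_i / d p_j
    a, b = p
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])

sol = least_squares(resid, x0=[1.0, -1.0], jac=jac)
p, r, J = sol.x, resid(sol.x), jac(sol.x)

# Gauss-Newton approximation to the Hessian of chi^2 = sum r_i^2:
H_gn = 2.0 * J.T @ J

# Exact Hessian adds the residual-weighted curvature of the model.
# For this model, Hess(r_i) = [[0, x*e^{bx}], [x*e^{bx}, a*x^2*e^{bx}]].
a, b = p
e = np.exp(b * x)
S = np.array([[0.0,                 np.sum(r * x * e)],
              [np.sum(r * x * e),   np.sum(r * a * x**2 * e)]])
H_exact = H_gn + 2.0 * S

print(H_exact / H_gn)   # element-wise ratio: close to a matrix of ones
```

Note the overall factor of 2: whether the ratio comes out as ones or twos depends on whether chi^2 is defined as sum r_i^2 or sum r_i^2 / 2, so the normalization is worth checking when comparing against fminunc's output.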
Then, suspecting that fminunc uses the same approximation, I also tried to estimate the Hessian without that approximation, using finite differences on the objective directly. If I do so and compute the element-wise ratio of this Hessian to either of the other two, I get values spread around 1, anywhere from -20% to +50%, which is not fine! Now I wonder which method is correct, since I know that computing second derivatives with finite differences is not trivial.
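For reference, this is the kind of central-difference Hessian I mean (a generic sketch, not my actual code; the function `fd_hessian` and the toy quadratic are illustrative). The critical ingredient is the step h: too large gives truncation error, too small gives roundoff error, and a poorly scaled h can easily produce errors of tens of percent like the ones above:

```python
import numpy as np

# Central-difference estimate of the Hessian of a scalar function f at point p.
# The i == j case reduces to the usual second difference with step 2h.
def fd_hessian(f, p, h=1e-4):
    n = p.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            pp = p.copy(); pp[i] += h; pp[j] += h   # f(p + h e_i + h e_j)
            pm = p.copy(); pm[i] += h; pm[j] -= h   # f(p + h e_i - h e_j)
            mp = p.copy(); mp[i] -= h; mp[j] += h   # f(p - h e_i + h e_j)
            mm = p.copy(); mm[i] -= h; mm[j] -= h   # f(p - h e_i - h e_j)
            H[i, j] = (f(pp) - f(pm) - f(mp) + f(mm)) / (4.0 * h * h)
    return H

# Sanity check on a toy quadratic with a known Hessian (hypothetical example):
A = np.array([[3.0, 1.0], [1.0, 2.0]])
chi2 = lambda p: p @ A @ p           # exact Hessian is 2*A everywhere
H = fd_hessian(chi2, np.zeros(2))
print(H / (2.0 * A))                 # element-wise ratio: matrix of ones
```

On a quadratic the formula is essentially exact, so a spread like (-20%, +50%) on a real chi^2 usually points at the step size not being scaled to each parameter's magnitude, or at the objective being noisy at the scale of h, rather than at the formula itself.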