On 10-10-28 09:38 PM, firstname.lastname@example.org wrote:
> elgen <email@example.com> wrote:
>> I have a question on the least-squares optimization with a complex
>> residual function. The residual function is r(z_1, z_2), in which z_1
>> and z_2 are complex variables.
> [...]
>> In my case r(z_1, z_2) is a complex function. If I use the Euclidean
>> norm (conjugated inner product), the cost function becomes
>>
>> \sum_i conj(r)r
>
> So your resid is real now? OK. Change your mind, that's alright. :)
>
>> I am stuck on how to calculate the gradient of this cost function as
>> conj(r) is not an analytic function and the gradient needs to take the
>> derivative with respect to z_1 and z_2.
>
> Ahhh. At worst you can treat the resid as 2 SoS -- the real parts of r and
> the imag parts of r.
>
> For more general resid functions maybe think in terms of Euclid form.
>
I understand that by "residual" you mean \sum_i conj(r)r in my case.
How would I proceed to calculate its gradient? Would you mind being more specific? Also, what does "SoS" stand for?
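To make sure I follow the splitting idea, here is a toy sketch of how I read it: stack Re(r) and Im(r) into one real residual vector, since conj(r)r = Re(r)^2 + Im(r)^2, and treat the real and imaginary parts of z_1, z_2 as four real parameters. The toy residual, starting point, and the use of scipy's least_squares are my own example, not from your mail:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical toy problem: recover complex z1, z2 from
#   z1 + z2 = a,   z1 * z2 = b       (so r is complex-valued)
a = 3.0 + 1.0j
b = 2.0 + 2.0j

def resid_real(x):
    # Pack the two complex unknowns as four real parameters.
    z1 = x[0] + 1j * x[1]
    z2 = x[2] + 1j * x[3]
    r = np.array([z1 + z2 - a, z1 * z2 - b])
    # "2 SoS": stack Re(r) and Im(r).  The stacked residual is real,
    # so ordinary real-valued least squares (and its gradient w.r.t.
    # the four real parameters) applies; conj(r) is never differentiated.
    return np.concatenate([r.real, r.imag])

# Asymmetric start (z1 = 1, z2 = 1j) to avoid a rank-deficient Jacobian.
sol = least_squares(resid_real, x0=np.array([1.0, 0.0, 0.0, 1.0]))
z1 = sol.x[0] + 1j * sol.x[1]
z2 = sol.x[2] + 1j * sol.x[3]
```

Is this the kind of reformulation you had in mind, so that the gradient is just the usual one of a real sum of squares?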