Date: Dec 23, 2012 4:42 PM
Author: RGVickson@shaw.ca
Subject: Re: find minimum of a function with abs and squares analytically

On Sunday, December 23, 2012 9:08:56 AM UTC-8, richard...@gmail.com wrote:
> Hi,
>
> I want to find the analytical minimum 'x_opt=argmin(x)' of the following function:
>
> f(x) = alpha * |c + x| + beta * x ^ 2
>
> where x is a real number (x is_element_of R), c is a real constant (c is_element_of R), alpha and beta are positive real constants (alpha, beta is_element_of R+), |.| is the absolute value function and ^ is the power function.
>
> Looks simple, but the absolute value function makes it somewhat tricky. As already mentioned, I want to find the solution to this minimization problem analytically, not numerically.
>
> I managed to split the optimization into the three cases x < -c, x = -c, x > -c, and to solve each case separately analytically (by setting the first derivative to zero).
>
> For the three cases I now have the solutions x_opt = alpha/(2*beta) [for x < -c], x_opt = -c [for x = -c], and x_opt = -alpha/(2*beta) [for x > -c]. But how do I 'combine' these solutions to get the overall solution 'x_opt'?
>
> So I would need a function 'phi' which delivers 'x_opt' for given c, alpha and beta, i.e. 'phi(c, alpha, beta) = argmin_x f(x)'. What does phi look like?
>
> Thanks in advance for any advice.


Optimizing in each part separately *by setting the derivative to zero* may lead nowhere. If alpha and beta are both > 0, the optimum in a given region may be at the endpoint x = -c, where the function does not have a derivative at all. So you should modify the procedure to say: the optimum in each separate region is either at a point where the derivative = 0 (provided that point actually lies inside the region), or at the endpoint. The right-hand endpoint x = -c (for the region {x <= -c}) is the minimum of that region if f'(-c-0) <= 0. The left-hand endpoint x = -c (for the region {x >= -c}) is the minimum of that region if f'(-c+0) >= 0.
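
To make that procedure concrete, here is a minimal Python sketch (my own illustration, not part of the original question): in each region, keep the stationary point of the smooth piece only if it lies inside that region, always keep the kink x = -c as a candidate, and return the candidate with the smallest f-value. The brute-force grid comparison at the end is just a sanity check with made-up parameter values.

import numpy as np

def argmin_f(c, alpha, beta):
    # f(x) = alpha*|c + x| + beta*x**2, with alpha > 0, beta > 0.
    # Region x > -c: f'(x) =  alpha + 2*beta*x, stationary at x = -alpha/(2*beta).
    # Region x < -c: f'(x) = -alpha + 2*beta*x, stationary at x = +alpha/(2*beta).
    # The kink x = -c is always a candidate (f is not differentiable there).
    f = lambda x: alpha * abs(c + x) + beta * x**2
    candidates = [-c]
    x_right = -alpha / (2.0 * beta)
    if x_right > -c:       # stationary point lies inside {x > -c}
        candidates.append(x_right)
    x_left = alpha / (2.0 * beta)
    if x_left < -c:        # stationary point lies inside {x < -c}
        candidates.append(x_left)
    return min(candidates, key=f)

# Sanity check against a brute-force grid minimum (illustrative values).
c, alpha, beta = 0.3, 1.0, 2.0
xs = np.linspace(-5, 5, 2000001)
fs = alpha * np.abs(c + xs) + beta * xs**2
print(argmin_f(c, alpha, beta), xs[np.argmin(fs)])

As a consistency check, for alpha, beta > 0 this case analysis collapses to clipping -c to the interval [-alpha/(2*beta), +alpha/(2*beta)], which is the single 'combined' formula asked for above.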