
Topic: advise on non-linear optimization codes
Replies: 6   Last Post: Dec 28, 2011 3:26 AM

Eli Osherovich
Re: advise on non-linear optimization codes
Posted: Dec 27, 2011 2:16 AM

On 12/26/2011 06:33 PM, Peter Spellucci wrote:

> (from the second posting: f is nonconvex)
> If you don't also have an operator which implements
> ''apply A(transpose) to x'', then you are unhappy from the beginning:
> this points us to ''derivative-free codes''.
> Nonconvexity isn't a problem from the algorithmic point of view;
> it restricts us to local solutions (and, as always, we ''hope'' to get
> the global one, maybe using some additional information, maybe
> by resorting to stochastic restarts or whatever).
> But there are more points to consider: you said nothing about
> the algorithmic description of R1 and R2.
> If these are boxes, things become simple; otherwise, with a
> description by convex functions as lower and concave functions
> as upper bounds, more work has to be done.
> In short: if you are forced to use derivative-free codes, there
> remains the combination of Powell's NEWUOA (for box-constrained
> nonlinear minimization) with the augmented Lagrangian approach a la
> LANCELOT/GALAHAD. There is a 2010 paper by
> Diniz-Ehrhardt, Martinez and Pedroso
> (via optimization-online),
> ''Derivative-free methods for nonlinear programming with
> general lower level constraints''
> (lower level means box here).
> NEWUOA works up to some hundred variables (I used it with n=300, for example),
> but beyond n=1000 it is not a choice because of the special form of
> its interpolation scheme.
> There is also ''DFO'' by Toint, Conn and Scheinberg, but the authors themselves
> state that it should not be used beyond dimension 100 and, worse, you
> will need an additional NLO code which is not free.
> https://projects.coin-or.org/Dfo
> For some more see
> http://plato.asu.edu/guide.html
> hth
> peter
>

Peter,

I indeed provided too few details in the original post.
Here I will try to expand on it.

I do have the A' multiplication operator and have no problem computing
the gradient. At the moment I use the L-BFGS method to solve my
problem; however, I feel that the constraints, which are
implemented as simple penalty functions, can be addressed in a better
way. Before I describe the constraints, it is worth mentioning
that I can also implement the Hessian-vector product, if that may be
useful in the optimization routine.
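To make the current setup concrete, here is a minimal sketch of L-BFGS with a quadratic penalty for x >= 0, using SciPy. The dense matrix A, the target b, and the penalty weight mu are all made-up stand-ins (in my problem A is only available as an operator), so this only illustrates the structure, not the actual code:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))          # stand-in for the real operator
b = A @ np.abs(rng.standard_normal(5))    # target with a nonnegative pre-image
mu = 100.0                                # penalty weight (fixed here; grown in practice)

def fun_and_grad(x):
    """0.5||Ax - b||^2 plus a quadratic penalty on negative entries of x."""
    r = A @ x - b
    v = np.minimum(x, 0.0)                # violation of x >= 0 (zero where feasible)
    f = 0.5 * (r @ r) + 0.5 * mu * (v @ v)
    g = A.T @ r + mu * v                  # gradient uses A' exactly once
    return f, g

res = minimize(fun_and_grad, np.zeros(5), jac=True, method="L-BFGS-B")
```

Since the gradient needs only one application of A' per evaluation, the same structure carries over to the operator-only setting.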

The constraints on x are, in general, of two types:

1) x = 0 for some known indices (this does not mean that the rest is
non-zero)
2) x >= 0. This constraint is valid in certain situations, but in general
x is complex.

At the moment, I simply eliminate the variables at the locations where x
is known to be zero.
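That elimination is just index bookkeeping: optimize only over the free coordinates and lift back to the full space when applying the operator. A sketch with a toy matrix and made-up indices:

```python
import numpy as np

n = 6
zero_idx = np.array([1, 4])                      # indices known to be zero (example)
free_idx = np.setdiff1d(np.arange(n), zero_idx)  # coordinates actually optimized

A = np.arange(18, dtype=float).reshape(3, 6)     # toy stand-in for the operator
A_free = A[:, free_idx]                          # reduced operator on free variables

def embed(x_free):
    """Lift the reduced vector back to the full space, zeros at fixed indices."""
    x = np.zeros(n, dtype=x_free.dtype)
    x[free_idx] = x_free
    return x

x_free = np.ones(len(free_idx))
x_full = embed(x_free)
```

With an operator-only A, one would keep `embed` (and the corresponding restriction of A' output to `free_idx`) instead of slicing columns.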

Constraints on Ax (which is complex) are as follows:

1) a <= angle(Ax) <= b
2) |Ax| <= r
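One plausible way to turn these two conditions into smooth penalty terms (assuming the feasible phase arc [a, b] does not wrap across the branch cut at ±π, so np.angle can be used directly; y stands in for Ax and all bounds below are made-up):

```python
import numpy as np

y = np.array([2.0 + 0.0j, 0.5 + 0.5j, -1.0 + 0.0j])  # stand-in for Ax
a, b_ang, r = -np.pi / 4, np.pi / 4, 1.5             # hypothetical bounds

phase = np.angle(y)   # in (-pi, pi]; valid while [a, b] avoids the branch cut
mag = np.abs(y)

# Hinge violations: zero where the constraint holds, linear in the violation.
phase_viol = np.maximum(a - phase, 0.0) + np.maximum(phase - b_ang, 0.0)
mag_viol = np.maximum(mag - r, 0.0)

# Quadratic penalty: once differentiable, suitable for L-BFGS-type methods.
penalty = np.sum(phase_viol**2) + np.sum(mag_viol**2)
```

The chain rule then routes the gradient of this penalty with respect to x through A', which matches operator-only access to A.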
