


Topic: Optimization problem
Replies: 1   Last Post: Dec 5, 1996 10:47 AM

Peter Spellucci   Posts: 630   Registered: 12/7/04
Re: Optimization problem
Posted: Dec 5, 1996 10:47 AM

In article <57gjs0\$dad@eng-ser1.erg.cuhk.hk>, mcau@ee.cuhk.hk (Au_Yeung_Man_Ching) writes:
|> Hi,
|>
|> I would like to ask some questions:
|>
|> (1) Given the following problem,
|>
|> min J(q,S)
|> q,S
|>
|> in doing the above problem, we do the
|> above problem like this:
|>
|> min min J(q,S)
|> q S
|>
|> i.e. solving the problem sequentially as two
|> nested minimization problems.
|> With a fixed q first, minimize J(q,S) w.r.t.
|> S to obtain S*. Then, minimize J(q,S*) w.r.t.
|> q. Finally, get the optimal point(local/global??)
|> (q*,S*).
|>
|> CAN WE DO THE PROBLEM AS DESCRIBED ABOVE??
|> IF NOT, how can we do it?
|>
if J is convex in the joint variable (q,S), this works and has long
been known as blockwise coordinate descent.
the classical book by Blum & Oettli, 'Mathematical Optimization' (or a similar title),
published by Springer, is a good source.
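To illustrate the blockwise coordinate descent mentioned above, here is a minimal sketch on a hypothetical jointly convex objective J(q,S) = (q-1)^2 + (S-2)^2 + 0.5*q*S (the original poster's J is not specified; this toy problem and its closed-form block minimizers are my own choices). Because the Hessian [[2, 0.5], [0.5, 2]] is positive definite, alternating exact minimization over each block converges to the global minimizer.

```python
# Toy jointly convex objective (hypothetical example):
#   J(q, S) = (q - 1)^2 + (S - 2)^2 + 0.5*q*S

def argmin_q(S):
    # dJ/dq = 2(q - 1) + 0.5*S = 0  =>  q = 1 - 0.25*S
    return 1.0 - 0.25 * S

def argmin_S(q):
    # dJ/dS = 2(S - 2) + 0.5*q = 0  =>  S = 2 - 0.25*q
    return 2.0 - 0.25 * q

q, S = 0.0, 0.0                 # arbitrary starting point
for _ in range(100):
    S_new = argmin_S(q)         # minimize J over S with q fixed
    q_new = argmin_q(S_new)     # minimize J over q with S fixed
    converged = max(abs(q_new - q), abs(S_new - S)) < 1e-12
    q, S = q_new, S_new
    if converged:
        break

print(q, S)  # converges to the joint minimizer (8/15, 28/15)
```

For a nonconvex J the same loop may stall at a point that is only a minimum along each block separately, which is why convexity in the joint variable matters.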
Otherwise you might try to solve grad_{q,S} J = 0 by solving (analytically
or numerically) the subsystem for S = S(q), plugging this into the remaining
equations, and solving those for q. (But I am afraid that this will not work
well globally, even if your problem is quite well behaved but nonconvex.)
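The elimination approach can be sketched on a hypothetical objective J(q,S) = (q-1)^2 + (S-2)^2 + 0.5*q*S (my own illustrative choice, not the poster's actual J): solve dJ/dS = 0 analytically for S(q), substitute into dJ/dq = 0, and solve the resulting scalar equation numerically, here by plain bisection.

```python
# Hypothetical objective: J(q, S) = (q - 1)^2 + (S - 2)^2 + 0.5*q*S

def S_of_q(q):
    # Solve the subsystem dJ/dS = 2(S - 2) + 0.5*q = 0 for S
    return 2.0 - 0.25 * q

def reduced_grad(q):
    # dJ/dq with S eliminated: 2(q - 1) + 0.5*S(q) = 1.875*q - 1
    return 2.0 * (q - 1.0) + 0.5 * S_of_q(q)

# Bisection on a bracket where reduced_grad changes sign
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if reduced_grad(lo) * reduced_grad(mid) <= 0.0:
        hi = mid            # root lies in [lo, mid]
    else:
        lo = mid            # root lies in [mid, hi]

q = 0.5 * (lo + hi)
S = S_of_q(q)
print(q, S)  # stationary point (8/15, 28/15)
```

For a nonconvex J the subsystem dJ/dS = 0 may have several branches S(q), and a root of the reduced gradient need not be a global minimizer; that is the global difficulty the reply warns about.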
hope this helps
peter