The Math Forum



Math Forum » Discussions » sci.math.* » sci.math.num-analysis


Topic: Optimization problem
Replies: 1   Last Post: Dec 5, 1996 10:47 AM


Peter Spellucci

Posts: 630
Registered: 12/7/04
Re: Optimization problem
Posted: Dec 5, 1996 10:47 AM

In article <57gjs0$>, (Au_Yeung_Man_Ching) writes:
|> Hi,
|> I would like to ask a question:
|> (1) Given the problem
|> min J(q,S)
|> q,S
|> can we solve it like this:
|> min min J(q,S)
|> q   S
|> i.e. sequentially, as two nested minimization
|> problems? With q fixed first, minimize J(q,S)
|> w.r.t. S to obtain S*. Then minimize J(q,S*)
|> w.r.t. q. Finally, we get an optimal point
|> (local? global?) (q*,S*).
|> IF NOT, how should we proceed?
If J is convex in the joint variable (q,S), this works; it has long been
known as blockwise coordinate descent.
The classical book by Blum & Oettli, 'mathematical optimization (or similar title)',
published by Springer, is a good source.
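A minimal sketch of blockwise coordinate descent, on an assumed jointly convex quadratic (this example problem is illustrative, not from the post). Each inner minimization here has a closed form, so each sweep just applies the two exact update formulas in turn:

```python
# Illustrative jointly convex objective (assumed, not from the post):
#   J(q, s) = q^2 + s^2 + q*s - 3*q - 4*s
# Hessian [[2, 1], [1, 2]] is positive definite, so J is jointly convex
# and block coordinate descent converges to the unique minimizer (2/3, 5/3).

def J(q, s):
    return q**2 + s**2 + q * s - 3.0 * q - 4.0 * s

q, s = 0.0, 0.0
for _ in range(50):
    # Exact minimization over s with q fixed: solve dJ/ds = 2s + q - 4 = 0.
    s = (4.0 - q) / 2.0
    # Exact minimization over q with s fixed: solve dJ/dq = 2q + s - 3 = 0.
    q = (3.0 - s) / 2.0

print(q, s)  # converges to (2/3, 5/3)
```

Each full sweep contracts the error by a constant factor, so the iterates converge linearly to the joint minimizer; without joint convexity the same iteration can stall at a non-optimal point.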
Otherwise you might try to solve grad_{q,S} J = 0 by solving (analytically
or numerically) a subsystem for S = S(q), plugging this into the remaining
equations, and solving those for q. (But I am afraid that this will not work
well globally, even if your problem is quite well behaved but nonconvex.)
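A sketch of the elimination idea on an assumed nonconvex example (the objective, bracket, and helper names are all hypothetical): grad_S J = 0 is solved analytically for S(q), and the remaining equation in q is solved numerically by bisection.

```python
# Assumed nonconvex example (not from the post):
#   J(q, s) = q^4 - 2*q^2 + (s - q)^2
# dJ/ds = 2*(s - q) = 0  gives  S(q) = q  analytically.
# Substituting leaves the reduced equation in q alone:
#   dJ/dq = 4*q^3 - 4*q = 0,  with roots q in {-1, 0, 1}.

def reduced_grad(q):
    return 4.0 * q**3 - 4.0 * q

def bisect(f, a, b, tol=1e-12):
    # Simple bisection; assumes f(a) and f(b) bracket a root.
    fa = f(a)
    for _ in range(200):
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0.0:
            b = m
        else:
            a, fa = m, fm
        if b - a < tol:
            break
    return 0.5 * (a + b)

q_star = bisect(reduced_grad, 0.5, 2.0)  # root near q = 1
s_star = q_star                          # back-substitute S(q)
```

Note the global caveat from the post: the bracket [0.5, 2.0] picks out only one stationary point; starting the search elsewhere finds q = -1 or the saddle at q = 0, so nothing here certifies a global minimum.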
hope this helps



© The Math Forum at NCTM 1994-2018. All Rights Reserved.