This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.
Re: High-dimensional Minimization without analytical derivatives
- From: James Theiler <jt at lanl dot gov>
- To: John Lamb <J dot D dot Lamb at btinternet dot com>
- Cc: gsl-discuss at sources dot redhat dot com
- Date: Sun, 5 Sep 2004 15:07:54 -0600 (MDT)
- Subject: Re: High-dimensional Minimization without analytical derivatives
On Sat, 4 Sep 2004, John Lamb wrote:
] Calling simulated annealing SA is also potentially problematic because
] there is a Stochastic Approximation algorithm due to Kiefer and
] Wolfowitz that is commonly known as SA. It can be used on nonstochastic
] problems by adding noise and has guaranteed convergence to a global
] optimum, albeit the convergence is often too slow for SA to be practical.
Just a minor point, but SA (stochastic approximation) does not
guarantee convergence to a global minimum. There are theorems
about convergence of SA (simulated annealing) to a global minimum,
but that convergence is arguably too slow to be practical.
My own opinion[*] is that the only practical way to guarantee
convergence to the global minimum is to make sure you are minimizing a
convex function -- that may not be saying a lot, since for a convex
function, there is only one local minimum and it is the global
minimum. And it certainly doesn't help if the function you need to
minimize doesn't happen to be convex. Sorry if this is getting far
afield of discussions about how to use GSL.
jt
[*] whenever you combine "practical" and "guarantee" in a sentence,
I think you are inevitably in the realm of opinion.
--
James Theiler Space and Remote Sensing Sciences
MS-B244, ISR-2, LANL Los Alamos National Laboratory
Los Alamos, NM 87545 http://nis-www.lanl.gov/~jt