This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.



Re: minimization using the Brent algorithm (brent.c)


Stated like this, I mostly agree with you, except perhaps on vocabulary. What is called the Brent algorithm or the golden section search is exactly what is (currently) implemented in GSL. But as Michael and you pointed out, in practical situations we need a way to start the algorithm. The rationale for not including this in GSL (well, it once was part of my first implementation of multidimensional minimization) is, to my mind, that it is highly problem dependent. Nevertheless, this discussion seems to show that a general (maybe suboptimal) solution is needed.
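[Editorial sketch, not part of the original message: a minimal example of how the bracketing requirement shows up in GSL's one-dimensional minimiser interface. The test function f(x) = cos(x) + 1 and the bracket values are made up for the illustration; the point is only that the caller must already supply a middle point x_guess with f(x_guess) below the values at both endpoints, which is exactly the "way to start the algorithm" discussed above.]

#include <stdio.h>
#include <math.h>
#include <gsl/gsl_errno.h>
#include <gsl/gsl_min.h>

static double fn1(double x, void *params)
{
  (void) params;                 /* unused */
  return cos(x) + 1.0;           /* minimum at x = pi */
}

int main(void)
{
  gsl_function F;
  F.function = &fn1;
  F.params = NULL;

  /* Bracketing triple: f(x_guess) must be below f(x_lower) and f(x_upper). */
  double x_lower = 0.0, x_guess = 2.0, x_upper = 6.0;

  gsl_min_fminimizer *s =
    gsl_min_fminimizer_alloc(gsl_min_fminimizer_brent);
  gsl_min_fminimizer_set(s, &F, x_guess, x_lower, x_upper);

  int status, iter = 0;
  do
    {
      iter++;
      status = gsl_min_fminimizer_iterate(s);
      x_lower = gsl_min_fminimizer_x_lower(s);
      x_upper = gsl_min_fminimizer_x_upper(s);
      status = gsl_min_test_interval(x_lower, x_upper, 1e-4, 0.0);
    }
  while (status == GSL_CONTINUE && iter < 100);

  printf("minimum at x = %.6f after %d iterations\n",
         gsl_min_fminimizer_x_minimum(s), iter);

  gsl_min_fminimizer_free(s);
  return status;
}

If no such x_guess is known, gsl_min_fminimizer_set rejects the call, so some bracketing step has to happen before Brent can even start.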

By the way, here is a simple example of why this is problem dependent: take multidimensional minimization. In general you have to perform a line search at each step, and since you want something fully automated, in some cases the code itself has to provide the now famous middle point! For instance, when you are working with conjugate gradient algorithms, you need a correct (not very precise, but still correct) estimate of the minimum in the line search. It is a good idea to rely on the Brent algorithm for this line search, but since you have already evaluated the gradient at the starting point, you can use this information to find the middle point more efficiently than in the general case (in which you cannot assume any knowledge of the derivative). Moreover, in some particular applications (for instance neural networks) and under particular circumstances, you can tune your algorithm even further, for instance by reusing results from previous line searches.
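[Editorial sketch, not part of the original message and not part of GSL; the function name and parameters are invented. It shows one standard way to turn the directional derivative phi'(0) = g.d, which a conjugate gradient step has already computed, into an initial trial step for the line search phi(alpha) = f(x + alpha*d): a quadratic model through phi(0) with slope phi'(0) whose minimum value is a guessed target f_target has its minimiser at alpha = 2*(f_target - phi(0))/phi'(0). That trial point can then serve as the middle point handed to Brent.]

double
initial_line_search_step(double f0,        /* phi(0) = f(x)                 */
                         double slope0,    /* phi'(0) = g . d, should be < 0 */
                         double f_target,  /* guessed attainable value       */
                         double alpha_max) /* safeguard upper bound          */
{
  if (slope0 >= 0.0)            /* d is not a descent direction */
    return alpha_max;

  /* Minimiser of the quadratic model whose minimum value is f_target. */
  double alpha = 2.0 * (f_target - f0) / slope0;

  if (alpha <= 0.0 || alpha > alpha_max)
    alpha = alpha_max;          /* fall back to the safeguard */

  return alpha;
}

In the general derivative-free case no such shortcut is available, which is why a generic bracketing routine has to be more conservative.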

Fabrice Rossi

