This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.



multifit/Levenberg-Marquardt


Hi,

 I have been using GSL for my work on image deconvolution.  I need to
 use the Levenberg-Marquardt algorithm for non-linear minimization.
 However, the problem I am solving involves a large data set as well
 as a large number of parameters.
 
 If I use the GSL implementation, I will have to allocate the
 Jacobian, which is of size N x P, where N is the data size and P the
 number of parameters.  For me, N = 1024x1024 or more and P is a few
 hundred.  Hence, holding the entire Jacobian in memory is
 impractical.  It is also most efficient for me to allocate and manage
 the data array in the user code.
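
 To put rough numbers on it: with N = 1024x1024 (about 1.05e6) data
 points, P = 300 (say) parameters, and 8 bytes per double, the
 Jacobian alone is roughly 1.05e6 * 300 * 8 bytes, i.e. about 2.5 GB,
 before counting any other workspace the solver allocates.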
 
 Is there a GSL implementation with lower memory requirements?  For
 example, one which needs only the full derivatives (dChisq/dParam
 summed over all the data) rather than an array (the Jacobian) holding
 the derivatives evaluated at each point; the size of the array in the
 latter case is N x P.  I feel that in its current form the GSL
 implementation is unusable for large problems, which is
 disappointing.
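
 To make the idea concrete, here is a rough sketch in plain C of the
 kind of accumulation I have in mind (this is not the GSL interface,
 and the exponential model is only a toy example): the J^T J matrix
 (P x P) and the J^T r vector (length P) are built up one data point
 at a time, so storage stays O(P^2) instead of O(N*P).

    /* Sketch: accumulate the Levenberg-Marquardt normal equations
     * without ever storing the full N x P Jacobian.  Toy model
     * f(t) = A * exp(-k * t); not the GSL API. */
    #include <stdio.h>
    #include <string.h>
    #include <math.h>

    enum { P = 2 };                            /* parameters: A, k */

    static double model_f(double t, const double p[P])
    {
        return p[0] * exp(-p[1] * t);
    }

    static void model_df(double t, const double p[P], double grad[P])
    {
        grad[0] = exp(-p[1] * t);              /* df/dA */
        grad[1] = -p[0] * t * exp(-p[1] * t);  /* df/dk */
    }

    static void accumulate(size_t N, const double *t, const double *y,
                           const double p[P],
                           double JTJ[P][P], double JTr[P])
    {
        memset(JTJ, 0, P * P * sizeof(double));
        memset(JTr, 0, P * sizeof(double));

        for (size_t i = 0; i < N; i++) {
            double grad[P];
            double r = y[i] - model_f(t[i], p);  /* residual at point i */
            model_df(t[i], p, grad);             /* row i of the Jacobian */

            for (int j = 0; j < P; j++) {
                JTr[j] += grad[j] * r;
                for (int k = 0; k < P; k++)
                    JTJ[j][k] += grad[j] * grad[k];
            }
            /* the Jacobian row is discarded here, so memory stays O(P^2) */
        }
    }

    int main(void)
    {
        double t[5] = { 0, 1, 2, 3, 4 };
        double y[5] = { 5.0, 3.0, 1.9, 1.1, 0.7 };
        double p[P] = { 5.0, 0.5 };
        double JTJ[P][P], JTr[P];

        accumulate(5, t, y, p, JTJ, JTr);
        printf("J^T r = (%g, %g)\n", JTr[0], JTr[1]);
        return 0;
    }

 Each Jacobian row is used as soon as it is computed and then thrown
 away, which is the memory behaviour I would like from the library.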
 
Regards,
sanjay

