This is the mail archive of the
gsl-discuss@sources.redhat.com
mailing list for the GSL project.
Re: Problem with Singular Value Decomposition Algorithm
- To: "Jim Love" <Jim dot Love at asml dot com>
- Subject: Re: Problem with Singular Value Decomposition Algorithm
- From: Brian Gough <bjg at network-theory dot co dot uk>
- Date: Thu, 13 Sep 2001 17:21:02 +0100 (BST)
- Cc: <gsl-discuss at sources dot redhat dot com>
- References: <sba09ba1.005@wiltonhub.svg.com>
Columns 2 and 3 have the opposite sign, but this is the arbitrary
minus-sign factor referred to earlier. The results satisfy m =
u*diag(s)*v' and u'*u = I, v'*v = I numerically --- so the
decomposition looks ok. Let me know if there's something I'm missing
here.
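[The sign ambiguity can be checked numerically. Here is a small sketch
in Python/NumPy (not the GSL code under discussion) using the test
matrix A quoted below: flipping the sign of a column of U together with
the corresponding row of V' leaves u*diag(s)*v' unchanged, so both sign
conventions are valid SVDs.]

```python
import numpy as np

# The 4x3 test matrix A from the message quoted below.
A = np.array([
    [ 1.0,  1.0,  0.975],
    [ 1.0, -1.0,  0.975],
    [-1.0, -1.0, -0.925],
    [-1.0,  1.0, -1.025],
])

# Thin SVD: U is 4x3, s has 3 entries, Vt is 3x3.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The factors reconstruct A and are orthonormal.
assert np.allclose(U @ np.diag(s) @ Vt, A)
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(3))

# Flip the sign of columns 2 and 3 of U and the matching rows of Vt.
# The product U*diag(s)*V' is unchanged, so the decomposition with the
# "opposite" signs is equally correct.
F = np.diag([1.0, -1.0, -1.0])
assert np.allclose((U @ F) @ np.diag(s) @ (F @ Vt), A)
print("both sign conventions reconstruct A")
```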
regards
Brian Gough
Jim Love writes:
> This code fixes the order problem of the S vector and the other matrix, but there is still a sign problem. Using this matrix for A:
>
> 1.000000 1.000000 0.975000
> 1.000000 -1.000000 0.975000
> -1.000000 -1.000000 -0.925000
> -1.000000 1.000000 -1.025000
>
> The modified code produces:
>
> s:
> 2.793961 2.000000 0.035791
>
> This is correct!
>
> For V:
>
> -0.715538 -0.025633 0.698103
> 0.018347 -0.999671 -0.017900
> -0.698332 -0.000000 -0.715774
>
> This is NOT correct!
>
> The correct answer for V is:
>
> -0.7155 0.0256 -0.6981
> 0.0183 0.9997 0.0179
> -0.6983 -0.0000 0.7158
>
> U is also wrong: the program outputs:
>
> -0.493230 -0.512652 -0.493875
> -0.506363 0.487019 0.506368
> 0.480733 0.512652 -0.506047
> 0.518861 -0.487019 0.493554
>
> The correct U is:
>
> -0.4932 0.5127 0.4939
> -0.5064 -0.4870 -0.5064
> 0.4807 -0.5127 0.5060
> 0.5189 0.4870 -0.4936
>
> Note that the last column is missing from both solutions for U.
>