This is the mail archive of the gsl-discuss@sources.redhat.com mailing list for the GSL project.



Re: (HH^t)^{-1}




On Wed, 10 Oct 2001, Michael Meyer wrote:

> I saw your post regarding matrix inverses on
> gsl_discuss.
>
> Sampling from a multinormal distribution (given the
> mean and covariance matrix) usually does not involve
> computation of inverses.
>
> As long as you can draw from a standard normal
> distribution, all one needs is a Cholesky factorization
> of the covariance matrix. This is a simple algorithm
> one can write oneself in about 10 lines (see the sketch
> after this quoted message).
>
> I do not fully understand how (HH^t)^{-1} is related
> to the mean of the distribution. This mean should be a
> vector not a matrix.
> Please provide details of your problem.
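
(The Cholesky approach described above can be sketched with GSL roughly
as follows. This is a minimal illustration, not code from this thread;
the 2x2 dimensions and the numerical values of mu and Sigma are
placeholders. It draws X ~ N(mu, Sigma) as X = mu + L z, where
Sigma = L L^t and z is a vector of independent standard normals.)

#include <stdio.h>
#include <gsl/gsl_rng.h>
#include <gsl/gsl_randist.h>
#include <gsl/gsl_vector.h>
#include <gsl/gsl_matrix.h>
#include <gsl/gsl_linalg.h>
#include <gsl/gsl_blas.h>

int
main (void)
{
  gsl_rng_env_setup ();
  gsl_rng *r = gsl_rng_alloc (gsl_rng_default);

  gsl_vector *mu = gsl_vector_alloc (2);
  gsl_matrix *Sigma = gsl_matrix_alloc (2, 2);

  gsl_vector_set (mu, 0, 1.0);          /* example mean (placeholder) */
  gsl_vector_set (mu, 1, -0.5);
  gsl_matrix_set (Sigma, 0, 0, 2.0);    /* example covariance (placeholder) */
  gsl_matrix_set (Sigma, 0, 1, 0.3);
  gsl_matrix_set (Sigma, 1, 0, 0.3);
  gsl_matrix_set (Sigma, 1, 1, 1.0);

  /* Overwrite Sigma with its Cholesky factor L (lower triangle). */
  gsl_linalg_cholesky_decomp (Sigma);

  /* z <- standard normal draws, then z <- L z, then z <- z + mu. */
  gsl_vector *z = gsl_vector_alloc (2);
  gsl_vector_set (z, 0, gsl_ran_gaussian (r, 1.0));
  gsl_vector_set (z, 1, gsl_ran_gaussian (r, 1.0));
  gsl_blas_dtrmv (CblasLower, CblasNoTrans, CblasNonUnit, Sigma, z);
  gsl_vector_add (z, mu);

  /* z now holds one draw from N(mu, Sigma). */
  printf ("%g %g\n", gsl_vector_get (z, 0), gsl_vector_get (z, 1));

  gsl_vector_free (z);
  gsl_vector_free (mu);
  gsl_matrix_free (Sigma);
  gsl_rng_free (r);
  return 0;
}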

Well, to be precise...

Given a vector h = (h_1, ..., h_T) of length T, we define a matrix H of
dimension T x 2 whose k-th row is (1, h_{k-1}), where h_0 = 0. Then I want
to simulate from a bivariate normal distribution with

mean \mu = (H^t H)^{-1} H^t h and variance (H^t H)^{-1}.

This can be written as

X = H^t Z + \mu, where Z is standard normal and X has the required
bivariate normal distribution. I don't see any way of avoiding calculation
of the inverse here.
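
(For concreteness, here is one way the quantities above could be computed
with GSL. It is a sketch under my own naming, not code from this thread.
Since H^t H is only 2x2, \mu comes from a Cholesky solve and
Sigma = (H^t H)^{-1} can be written out explicitly, so no general
inversion routine is needed.)

#include <gsl/gsl_vector.h>
#include <gsl/gsl_matrix.h>
#include <gsl/gsl_blas.h>
#include <gsl/gsl_linalg.h>

/* hv[0..T-1] holds (h_1, ..., h_T).  On return, mu (length 2) holds
   (H^t H)^{-1} H^t h and Sigma (2x2) holds (H^t H)^{-1}.  Both are
   allocated by the caller. */
void
posterior_moments (const double *hv, size_t T, gsl_vector *mu, gsl_matrix *Sigma)
{
  gsl_matrix *H = gsl_matrix_alloc (T, 2);
  gsl_vector *h = gsl_vector_alloc (T);
  gsl_matrix *A = gsl_matrix_alloc (2, 2);   /* H^t H */
  gsl_vector *b = gsl_vector_alloc (2);      /* H^t h */
  size_t k;

  for (k = 0; k < T; k++)
    {
      gsl_vector_set (h, k, hv[k]);
      gsl_matrix_set (H, k, 0, 1.0);
      gsl_matrix_set (H, k, 1, k == 0 ? 0.0 : hv[k - 1]);  /* h_0 = 0 */
    }

  gsl_blas_dgemm (CblasTrans, CblasNoTrans, 1.0, H, H, 0.0, A);
  gsl_blas_dgemv (CblasTrans, 1.0, H, h, 0.0, b);

  /* Sigma = A^{-1}, written out explicitly for the 2x2 symmetric case. */
  {
    double a = gsl_matrix_get (A, 0, 0), c = gsl_matrix_get (A, 0, 1);
    double d = gsl_matrix_get (A, 1, 1);
    double det = a * d - c * c;
    gsl_matrix_set (Sigma, 0, 0,  d / det);
    gsl_matrix_set (Sigma, 0, 1, -c / det);
    gsl_matrix_set (Sigma, 1, 0, -c / det);
    gsl_matrix_set (Sigma, 1, 1,  a / det);
  }

  /* mu solves (H^t H) mu = H^t h; use the Cholesky factor of A. */
  gsl_linalg_cholesky_decomp (A);
  gsl_linalg_cholesky_solve (A, b, mu);

  gsl_matrix_free (H);
  gsl_vector_free (h);
  gsl_matrix_free (A);
  gsl_vector_free (b);
}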

I need to write a function which takes the value Z (the standard normal)
and returns X.
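
(Such a function might look like the following if one follows the
Cholesky route from the quoted message, i.e. X = \mu + L Z with
L L^t = (H^t H)^{-1}, rather than the H^t Z form above. The names are
illustrative; L is assumed to hold the Cholesky factor of Sigma, e.g.
from gsl_linalg_cholesky_decomp.)

#include <gsl/gsl_vector.h>
#include <gsl/gsl_matrix.h>
#include <gsl/gsl_blas.h>

/* Z (length 2, standard normal draws) is overwritten with
   X = mu + L Z; L holds the Cholesky factor of Sigma = (H^t H)^{-1}
   in its lower triangle. */
void
z_to_x (gsl_vector *Z, const gsl_matrix *L, const gsl_vector *mu)
{
  gsl_blas_dtrmv (CblasLower, CblasNoTrans, CblasNonUnit, L, Z);
  gsl_vector_add (Z, mu);
}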

If you've any suggestions, let me know.

                                  Sincerely, Faheem Mitha.

