# Why is SVD slow compared with eig?

Discussion in 'MATLAB' started by Hiu Chung Law, Sep 17, 2004.

1. ### Hiu Chung Law (Guest)

Any idea why SVD is slow compared with an eigenvalue
decomposition? The code below does essentially the same thing
both ways (up to the sign of the eigenvectors/singular vectors),
but svd is considerably slower. My guess is that
svd is probably more accurate than getting the same result
indirectly through eig. Still, the almost 4x difference in speed
seems a bit too much.
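A hypothetical reconstruction of this kind of comparison in NumPy (the matrix size and the variable names `s` and `s2` are assumptions, not the original code):

```python
# Hypothetical sketch: full SVD of X versus an eigendecomposition of the
# Gram matrix X'*X, whose eigenvalues are the squared singular values.
import numpy as np
import time

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 200))   # made-up size: tall and skinny

t0 = time.perf_counter()
U, s, Vt = np.linalg.svd(X, full_matrices=False)   # the svd route
t_svd = time.perf_counter() - t0

t0 = time.perf_counter()
w, V2 = np.linalg.eigh(X.T @ X)                    # the eig route
t_eig = time.perf_counter() - t0
s2 = np.sqrt(np.maximum(w[::-1], 0.0))             # descending, like svd

print(np.max(np.abs(s - s2)))  # tiny, on the order of the 5.2e-13 below
```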

ans =
5.2277e-13
% A small number indicating that s and s2 are effectively the same.
-------------------------------------------------------------------------------------
MATLAB Version 7.0.0.19901 (R14)
Operating System: SunOS 5.9 Generic_117171-05 sun4u
Java VM Version: Java is not enabled
-------------------------------------------------------------------------------------
MATLAB Version 7.0 (R14)
Image Processing Toolbox Version 4.2 (R14)
Neural Network Toolbox Version 4.0.3 (R14)
Optimization Toolbox Version 3.0 (R14)
Spline Toolbox Version 3.2.1 (R14)
Statistics Toolbox Version 5.0 (R14)

Hiu Chung Law, Sep 17, 2004

2. ### John D'Errico (Guest)

Yes, EIG can be both faster and less accurate than SVD. SO?

Different tools have different strengths. Hammers can both
pound in nails as well as shell hardboiled eggs. When I am
asked to shell eggs, my wife prefers that I not use the
hammer, even when I assure her how much faster it will go.

John D'Errico

--

The best material model of a cat is another, or
preferably the same, cat.
A. Rosenblueth, Philosophy of Science, 1945

John D'Errico, Sep 17, 2004

3. ### Peter Spellucci (Guest)

you are forgetting a few things:
svd computes both the left and the right singular vectors of X, while eig
computes only the eigenvectors of X'*X (the right singular vectors). The left
singular vectors live in the much larger dimension (the number of rows of X),
so they cost a lot more work.
Secondly, svd works on the bidiagonal form and must first transform X to it;
this is also much more work than transforming X'*X to tridiagonal form.
And finally, in reducing the bidiagonal form to a diagonal one, svd must
"chase" the nonzero bulge element out, which again is roughly double the work
compared with the QR iteration on the tridiagonal matrix.
Hence, if the number of rows of X were even larger than in your case, the
difference in time would grow even more.
hth
peter
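Peter's first point, that the large left factor dominates the cost, is easy to check empirically; a NumPy sketch with made-up sizes:

```python
import numpy as np
import time

rng = np.random.default_rng(0)
X = rng.standard_normal((3000, 300))  # many more rows than columns

t0 = time.perf_counter()
s_only = np.linalg.svd(X, compute_uv=False)      # singular values only
t_vals = time.perf_counter() - t0

t0 = time.perf_counter()
U, s, Vt = np.linalg.svd(X, full_matrices=True)  # also builds the big U
t_full = time.perf_counter() - t0

print(U.shape)          # (3000, 3000)
print(t_full > t_vals)  # forming the big U costs far more than the values
```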

Peter Spellucci, Sep 17, 2004

4. ### Hiu Chung Law (Guest)

There is something I don't understand here, though. Since the left singular
vectors can be obtained from the right singular vectors (by orthogonalizing
the matrix X*V, which is the same as U*S, assuming X = U*S*V'), why do we
need to compute the left singular vectors directly? Is it because that would
be numerically more stable?

I came up with the original question when trying to follow some MATLAB code
from the web. The authors want to perform an SVD at a certain point. Instead
of going through the SVD route, they go through the eig route. At first I
thought this was a bad idea. However, when I measured the execution time of
those statements, I realized that what the authors did is in fact a better
idea -- at least in terms of efficiency. That's why I am puzzled.

Maybe I should borrow Golub and Van Loan's Matrix Computations from the
library again... there is one section on the SVD... Or is there a better
reference on the SVD?

Hiu Chung Law, Sep 18, 2004

5. ### Peter Spellucci (Guest)

you explicitly requested U in your command, hence it is computed.

Yes, from

A = U*S*V'

you can form \tilde{U} = A*V and then normalize its columns to length one to
get U, if you need only the "economy size" factors (and if you still have the
original A, which, for example, is not necessarily the case with the LAPACK
code). This should lose hardly any numerical stability in this case.

Yes: Golub and Van Loan is the right source to read.
hth
Peter
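Peter's recipe can be sketched in NumPy (illustrative only; the size is made up, and eigh stands in for eig on the symmetric matrix A'*A):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((500, 50))  # made-up size, tall and well-conditioned

# Right singular vectors and singular values from the Gram matrix A'*A.
w, V = np.linalg.eigh(A.T @ A)          # eigenvalues in ascending order
order = np.argsort(w)[::-1]             # reorder to match svd's convention
s = np.sqrt(np.maximum(w[order], 0.0))  # clip tiny negative roundoff
V = V[:, order]

# Peter's step: form A*V, then normalize each column to length one.
# Column k of A*V equals s[k]*u_k, so dividing by s[k] does exactly that.
U = (A @ V) / s

print(np.max(np.abs(U.T @ U - np.eye(50))))  # near machine precision
```

This only works when A is still available and no singular value is tiny; dividing by an `s[k]` near zero amplifies roundoff, which is one reason a library routine computes U directly.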

Peter Spellucci, Sep 20, 2004

6. ### Jesper Göransson (Guest)

Try the mean matrix X = [1 eps; 0 1] and you will see the difference.
Remember the one rule in numerical linear algebra: Never ever use the
normal equations.
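Jesper's 2x2 example is marginal in double precision, but the loss is unmistakable for a classic Läuchli-type matrix (a NumPy sketch; this matrix choice is mine, not from the thread):

```python
import numpy as np

# Laeuchli-style matrix: smallest singular value is delta = 1e-8, which is
# below sqrt(machine epsilon) relative to the largest singular value.
delta = 1e-8
A = np.array([[1.0,   1.0],
              [delta, 0.0],
              [0.0,   delta]])

s_svd = np.linalg.svd(A, compute_uv=False)

# Normal-equations route: 1 + delta**2 rounds to 1, so A'*A is computed as
# [[1, 1], [1, 1]] and the small eigenvalue vanishes entirely.
w = np.linalg.eigvalsh(A.T @ A)
s_eig = np.sqrt(np.maximum(w[::-1], 0.0))

print(s_svd)  # the small singular value ~1e-8 is recovered
print(s_eig)  # here it is lost to roundoff
```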

Cheers,
Jesper

Jesper Göransson, Sep 20, 2004

7. ### Jesper Göransson (Guest)

Ah, forget it.

Jesper Göransson, Sep 20, 2004