By Kardi Teknomo, PhD.


Singular Value Decomposition (SVD)

Singular value decomposition (SVD) is a factorization of a rectangular matrix A into three matrices, A = U S V^T. The two matrices U and V are orthogonal matrices (U^T U = I, V^T V = I) while S is a diagonal matrix. The factorization means that we can multiply the three matrices to get back the original matrix A. The transpose matrix is obtained through A^T = V S^T U^T.
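The factorization above can be verified numerically. The sketch below uses NumPy's `np.linalg.svd` on a small rectangular matrix of my own choosing (any example matrix works the same way):

```python
import numpy as np

# A hypothetical 3x2 rectangular matrix, chosen only for illustration
A = np.array([[2.0, 4.0],
              [1.0, 3.0],
              [0.0, 0.0]])

# full_matrices=True returns U (3x3), the singular values, and V^T (2x2)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the 3x2 diagonal matrix S from the vector of singular values
S = np.zeros(A.shape)
S[:len(s), :len(s)] = np.diag(s)

# Multiplying the three factors recovers the original matrix
print(np.allclose(U @ S @ Vt, A))          # True
# U and V are orthogonal: U^T U = I and V V^T = I
print(np.allclose(U.T @ U, np.eye(3)))     # True
print(np.allclose(Vt @ Vt.T, np.eye(2)))   # True
```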

Since both orthogonal and diagonal matrices have many nice properties, SVD is one of the most powerful matrix decompositions, used in many applications such as least squares (regression), feature selection (PCA, MDS), spectral clustering, image restoration, 3D computer vision (fundamental matrix estimation), equilibrium of Markov chains, and many others.

Matrices U and V are not unique; their columns come from the concatenation of eigenvectors of the symmetric matrices A A^T and A^T A respectively. Since the eigenvectors of a symmetric matrix are orthogonal (and linearly independent), they can be used as basis vectors (a coordinate system) to span a multidimensional space. The absolute value of the determinant of an orthogonal matrix is one, thus the matrix always has an inverse. Furthermore, each column (and each row) of an orthogonal matrix has unit norm.

The diagonal matrix S contains the square roots of the eigenvalues of the symmetric matrix A^T A. The diagonal elements are non-negative numbers and they are called singular values. Because they come from a symmetric matrix, the eigenvalues (and the eigenvectors) are all real numbers (no complex numbers).
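The relation between singular values and the eigenvalues of A^T A can be checked directly. A minimal sketch, using an illustrative matrix of my own choosing:

```python
import numpy as np

# An arbitrary rectangular example matrix
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

s = np.linalg.svd(A, compute_uv=False)    # singular values, in descending order
lam = np.linalg.eigvalsh(A.T @ A)[::-1]   # eigenvalues of A^T A, made descending

# Each singular value is the square root of an eigenvalue of A^T A
print(np.allclose(s, np.sqrt(lam)))       # True
```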

Numerical computation of the SVD is stable in terms of round-off error. When some of the singular values are nearly zero, we can truncate them to zero, which improves numerical stability.


Example: find the SVD of a rectangular matrix A.


First, we multiply the matrix A by its transpose to produce the symmetric matrices A^T A and A A^T.

Then we find the eigenvalues and eigenvectors of the symmetric matrices. For matrix A^T A, we compute the eigenvalues lambda_1 >= lambda_2 and the corresponding eigenvectors v_1 and v_2. Concatenating the (normalized) eigenvectors produces matrix V. The diagonal matrix S can be obtained from the square roots of the eigenvalues, sigma_i = sqrt(lambda_i).

For matrix A A^T, we compute the eigenvalues and eigenvectors in the same way. When A has more rows than columns, the extra eigenvalues are zero, as expected, because the non-zero eigenvalues of A A^T are exactly the same as the eigenvalues of A^T A. Concatenating the eigenvectors produces matrix U. Since the singular value decomposition factors A = U S V^T, the diagonal matrix can also be obtained from S = U^T A V.

Remember, the eigenvectors are the many solutions of a homogeneous equation. They are not unique: each is determined only up to a scalar multiple. Thus, you can multiply an eigenvector by -1 and still get a correct decomposition, as long as the signs of the corresponding columns of U and V are kept consistent with each other.
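The two-step procedure above can be sketched in NumPy on an illustrative matrix of my own choosing. To keep the signs of U consistent with V (the caveat just mentioned), the columns of U are computed as u_i = A v_i / sigma_i rather than by a separate eigendecomposition of A A^T:

```python
import numpy as np

# An arbitrary full-column-rank 3x2 example matrix
A = np.array([[1.0, 2.0],
              [2.0, 2.0],
              [2.0, 1.0]])

# Step 1: eigen-decompose the small symmetric matrix A^T A
lam, V = np.linalg.eigh(A.T @ A)
order = np.argsort(lam)[::-1]          # sort eigenpairs by decreasing eigenvalue
lam, V = lam[order], V[:, order]

sigma = np.sqrt(np.clip(lam, 0.0, None))   # singular values

# Step 2: columns of U from u_i = A v_i / sigma_i (valid for nonzero sigma_i)
U = A @ V / sigma

# Check the (thin) factorization A = U S V^T
S = np.diag(sigma)
print(np.allclose(U @ S @ V.T, A))     # True
```

Deriving U from V this way avoids the sign mismatch that can occur when U and V are eigendecomposed independently.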


The interactive program below produces the factorization of a rectangular matrix using singular value decomposition (SVD). You can also truncate the results by setting the lower singular values to zero; this feature is useful for feature selection (such as PCA and MDS). The Random Example button generates a random rectangular matrix. Try experimenting with your own input matrix.
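The truncation feature can be imitated in NumPy. The sketch below (with a random matrix, mirroring the Random Example button) zeroes all but the k largest singular values; the result is the best rank-k approximation of A in the Frobenius norm:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))        # a random rectangular matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values; set the rest to zero
k = 2
s_trunc = np.where(np.arange(len(s)) < k, s, 0.0)
A_k = U @ np.diag(s_trunc) @ Vt

# The approximation error equals the sqrt of the sum of the discarded sigma_i^2
err = np.linalg.norm(A - A_k, 'fro')
print(np.isclose(err, np.sqrt(np.sum(s[k:] ** 2))))   # True
```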


Yes, this program is a free educational program! Please don't forget to tell your friends and teachers about it.


In one stroke, singular value decomposition (SVD) can reveal many things:

  • Singular values give valuable information about whether a square matrix A is singular. A square matrix A is non-singular (i.e. has an inverse) if and only if all its singular values are different from zero.
  • If the square matrix A is non-singular, the inverse matrix can be obtained by A^-1 = V S^-1 U^T.
  • The number of non-zero singular values is equal to the rank of the matrix, for any rectangular matrix. In fact, SVD is a robust technique for computing matrix rank, even for ill-conditioned matrices.
  • The ratio between the largest and the smallest singular value is called the condition number; it measures the degree of singularity and reveals ill-conditioned matrices.
  • SVD can produce one of the matrix norms, called the Frobenius norm, as the square root of the sum of the squared singular values, ||A||_F = sqrt(sum sigma_i^2). Equivalently, the Frobenius norm is the square root of the sum of the squared elements of the matrix.
  • SVD can also produce the generalized inverse (pseudo-inverse) of any rectangular matrix. In fact, this generalized inverse is the Moore-Penrose inverse A^+ = V S^+ U^T. Matrix S^+ is obtained from S by replacing each non-zero singular value with its reciprocal and setting the nearly-zero values to zero.
  • SVD also approximates the solution of a non-homogeneous linear system A x = b such that the norm ||A x - b|| is minimum. This is the basis of least squares, orthogonal projection and regression analysis.
  • SVD also solves the homogeneous linear system A x = 0 by taking the column of V that represents the eigenvector corresponding to the zero eigenvalue of the symmetric matrix A^T A.
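Several of the properties listed above can be demonstrated from a single SVD. A minimal sketch, using an illustrative overdetermined system of my own choosing:

```python
import numpy as np

# An arbitrary 3x2 example system A x = b (overdetermined)
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rank = number of non-zero singular values (within a tolerance)
rank = int(np.sum(s > 1e-10))

# Condition number = largest / smallest singular value
cond = s[0] / s[-1]

# Moore-Penrose pseudo-inverse A+ = V S+ U^T, where S+ has the
# reciprocals of the non-zero singular values on its diagonal
s_inv = np.where(s > 1e-10, 1.0 / s, 0.0)
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

# x = A+ b minimizes ||A x - b|| (the least squares solution)
x = A_pinv @ b
print(rank)                                                   # 2
print(np.isclose(cond, np.linalg.cond(A)))                    # True
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))   # True
```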

See also: Matrix Eigen Value & Eigen Vector for Symmetric Matrix, Similarity and Matrix Diagonalization, Symmetric Matrix, Spectral Decomposition



This tutorial is copyrighted.

The preferred reference for this tutorial is:

Teknomo, Kardi (2011) Linear Algebra tutorial. https://people.revoledu.com/kardi/tutorial/LinearAlgebra/