LU Decomposition

LU decomposition is a method of decomposing a matrix \(A\) into the product of a lower triangular matrix \(L\) and an upper triangular matrix \(U\). This decomposition is useful in solving systems of linear equations, inverting matrices, and computing determinants.
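
For instance, once \(A = LU\) is known, a system \(Ax = b\) reduces to two triangular solves. The sketch below is a minimal illustration (assuming NumPy and SciPy are installed; the matrix and right-hand side are made-up data): it factors \(A\) once with scipy.linalg.lu_factor and reuses the factorization via scipy.linalg.lu_solve.

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[2.0, 1.0, 1.0],
                  [4.0, 3.0, 3.0],
                  [8.0, 7.0, 9.0]])
    b = np.array([4.0, 10.0, 24.0])

    # Factor once (with partial pivoting), then solve by forward and
    # back substitution; the factorization can be reused for new b.
    lu, piv = lu_factor(A)
    x = lu_solve((lu, piv), b)
    print(np.allclose(A @ x, b))  # True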

Theorem:

Any \(n \times n\) matrix \(A\) has an LU decomposition if and only if:

\[ \text{rank}(A[\{1, \ldots, k\}, \{1, \ldots, k\}]) + k \geq \text{rank}(A[\{1, \ldots, k\}, \{1, \ldots, n\}]) + \text{rank}(A[\{1, \ldots, n\}, \{1, \ldots, k\}]) \quad \text{for all } k = 1, \ldots, n. \]

This theorem gives a necessary and sufficient condition for the existence of an LU decomposition. Here \(A[I, J]\) denotes the submatrix of \(A\) formed by the rows indexed by \(I\) and the columns indexed by \(J\), so the left-hand side involves the leading principal \(k \times k\) submatrix of \(A\). The inequality must hold for every \(k\): informally, no leading principal submatrix may be too rank-deficient relative to the corresponding block rows and columns.
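
The condition is easy to test numerically, since \(A[\{1, \ldots, k\}, \{1, \ldots, k\}]\) is the NumPy slice A[:k, :k]. Below is a minimal sketch assuming NumPy; has_lu_decomposition is a hypothetical helper name, and the \(2 \times 2\) matrices are made-up test cases.

    import numpy as np

    def has_lu_decomposition(A):
        # Check the rank condition for each k = 1, ..., n.
        n = A.shape[0]
        for k in range(1, n + 1):
            lhs = np.linalg.matrix_rank(A[:k, :k]) + k
            rhs = (np.linalg.matrix_rank(A[:k, :])
                   + np.linalg.matrix_rank(A[:, :k]))
            if lhs < rhs:
                return False
        return True

    # The zero in the top-left pivot position cannot be repaired
    # without a row exchange, so no LU decomposition exists:
    print(has_lu_decomposition(np.array([[0.0, 1.0], [1.0, 0.0]])))  # False
    print(has_lu_decomposition(np.array([[2.0, 1.0], [1.0, 3.0]])))  # True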

Theorem:

Any \(n \times n\) matrix \(A\) can be written as:

\[A = U_1 L U_2\]

Here, \(U_1\) and \(U_2\) are upper triangular matrices, and \(L\) is a lower triangular matrix.

Unlike the plain \(A = LU\) factorization, this decomposition exists for every square matrix, including those that violate the rank condition above; the price is a second upper triangular factor.

Theorem:

Any \(n \times n\) matrix \(A\) can be written as:

\[A = L_1 U L_2\]

In this form, \(L_1\) and \(L_2\) are lower triangular matrices, and \(U\) is an upper triangular matrix.

This is the mirror image of the previous factorization: every square matrix can also be written with two lower triangular factors surrounding a single upper triangular one.

Theorem:

Any \(n \times n\) matrix \(A\) can be written as:

\[A = P L U\]

where \(P\) is a permutation matrix.

This theorem is particularly important in practical applications of LU decomposition, as it accounts for the possibility of row exchanges to ensure numerical stability. The permutation matrix \(P\) reorders the rows of \(A\) to allow for a successful decomposition into \(L\) and \(U\).
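
SciPy's scipy.linalg.lu returns exactly these three factors. In the sketch below (assuming NumPy and SciPy; the matrix is a made-up example), the zero pivot in the top-left corner rules out a plain \(A = LU\), but the permutation repairs it.

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[0.0, 1.0],
                  [1.0, 2.0]])
    P, L, U = lu(A)                   # factor A as P @ L @ U
    print(np.allclose(A, P @ L @ U))  # True
    print(P)                          # swaps the two rows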

Cholesky decomposition

The Cholesky decomposition is a method for decomposing a Hermitian positive-definite matrix \(A\) into the product of a lower triangular matrix \(L\) and its conjugate transpose \(L^*\). It is particularly useful in numerical analysis for solving systems of linear equations and inverting matrices; when it applies, it is roughly twice as efficient as LU decomposition.

A real symmetric matrix is a special case of a Hermitian matrix; for a real symmetric positive-definite \(A\), the factor \(L\) is real and the decomposition reads \(A = L L^{\mathrm{T}}\).

Theorem: Cholesky Decomposition

For any Hermitian positive-definite matrix \(A\), there exists a unique lower triangular matrix \(L\) such that:

\[A = L \times L^*\]

where \(L\) is a lower triangular matrix with real and positive diagonal entries, and \(L^*\) denotes the conjugate transpose of \(L\).

This theorem ensures that the Cholesky decomposition is unique for any given Hermitian positive-definite matrix.
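
As a minimal sketch (assuming NumPy; the matrix is illustrative), np.linalg.cholesky returns this factor \(L\) directly:

    import numpy as np

    A = np.array([[4.0, 2.0],
                  [2.0, 3.0]])       # real symmetric positive-definite
    L = np.linalg.cholesky(A)        # lower triangular, positive diagonal
    print(L)
    print(np.allclose(A, L @ L.T))   # True; use L @ L.conj().T if complex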

Theorem: LDL Decomposition

A closely related variant of the classical Cholesky decomposition is the LDL decomposition, which states that any Hermitian positive-definite matrix \(A\) can be decomposed as:

\[A = L \times D \times L^*\]

where \(L\) is a lower unitriangular matrix (all of its diagonal entries equal 1), and \(D\) is a diagonal matrix with positive real entries.

Because the diagonal of \(L\) is fixed to 1, the diagonal matrix \(D\) absorbs the scaling that the Cholesky factor would otherwise carry; a practical benefit is that the decomposition can be computed without taking square roots.
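
The recurrences are short enough to write out by hand. The sketch below assumes NumPy and a real symmetric positive-definite input; ldl_decompose is a hypothetical helper name. Note that, unlike the Cholesky factorization, no square roots appear.

    import numpy as np

    def ldl_decompose(A):
        # LDL^T of a real symmetric positive-definite matrix:
        # d[j] and column j of L follow from the previous columns.
        n = A.shape[0]
        L = np.eye(n)
        d = np.zeros(n)
        for j in range(n):
            d[j] = A[j, j] - (L[j, :j] ** 2) @ d[:j]
            for i in range(j + 1, n):
                L[i, j] = (A[i, j] - (L[i, :j] * L[j, :j]) @ d[:j]) / d[j]
        return L, d

    A = np.array([[4.0, 2.0],
                  [2.0, 3.0]])
    L, d = ldl_decompose(A)
    print(np.allclose(A, L @ np.diag(d) @ L.T))  # True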

QR decomposition

QR decomposition is a method of decomposing a matrix \(A\) into the product of an orthogonal matrix \(Q\) and an upper triangular matrix \(R\). This decomposition is useful in solving linear systems, least squares fitting, and computing eigenvalues.

Theorem: QR Decomposition

For any \(m \times n\) matrix \(A\), there exists an \(m \times m\) orthogonal matrix \(Q\) and an \(m \times n\) upper triangular matrix \(R\) such that:

\[A = Q \times R\]

where \(Q\) has orthonormal columns (i.e., \(Q^T Q = I\)) and \(R\) is an upper triangular matrix.

This theorem ensures that QR decomposition can be applied to any rectangular matrix.
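
A short sketch with NumPy (made-up data): mode='complete' returns the full \(m \times m\) matrix \(Q\) from the theorem, while the default reduced form is the convenient one for least squares.

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])                 # 3 x 2, full column rank
    Q, R = np.linalg.qr(A, mode='complete')    # Q: 3 x 3, R: 3 x 2
    print(np.allclose(A, Q @ R))               # True
    print(np.allclose(Q.T @ Q, np.eye(3)))     # True

    # Least squares via the reduced factorization: solve R1 x = Q1^T b.
    b = np.array([1.0, 2.0, 2.0])
    Q1, R1 = np.linalg.qr(A)
    x = np.linalg.solve(R1, Q1.T @ b)
    print(x)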

Singular value decomposition (SVD)

The singular value decomposition (SVD) is a factorization of a real or complex matrix into a rotation, followed by a rescaling, and then another rotation. Specifically, the singular value decomposition of an \(m \times n\) complex matrix \(\mathbf{M}\) is a factorization of the form:

\[\mathbf{M} = \mathbf{U\Sigma V}^{*}\]

where \(\mathbf{U}\) is an \(m \times m\) complex unitary matrix, \(\mathbf{\Sigma}\) is an \(m \times n\) rectangular diagonal matrix with non-negative real numbers on the diagonal, \(\mathbf{V}\) is an \(n \times n\) complex unitary matrix, and \(\mathbf{V}^{*}\) is the conjugate transpose of \(\mathbf{V}\). Such a decomposition always exists for any complex matrix.

If \(\mathbf{M}\) is real, then \(\mathbf{U}\) and \(\mathbf{V}\) can be guaranteed to be real orthogonal matrices; in such contexts, the SVD is often denoted:

\[\mathbf{M} = \mathbf{U\Sigma V}^{\mathrm{T}}\]

Theorem: Existence of SVD

For any \(m \times n\) matrix \(\mathbf{M}\), there exist an \(m \times m\) unitary matrix \(\mathbf{U}\), an \(n \times n\) unitary matrix \(\mathbf{V}\), and an \(m \times n\) rectangular diagonal matrix \(\mathbf{\Sigma}\) with non-negative real entries, such that:

\[\mathbf{M} = \mathbf{U\Sigma V}^{*}\]

Here, the diagonal entries of \(\mathbf{\Sigma}\) are the singular values of \(\mathbf{M}\), which are non-negative real numbers. The columns of \(\mathbf{U}\) are the left singular vectors, and the columns of \(\mathbf{V}\) are the right singular vectors of \(\mathbf{M}\).
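
A minimal sketch assuming NumPy (the \(3 \times 2\) matrix is made up): np.linalg.svd returns \(\mathbf{U}\), the singular values, and \(\mathbf{V}^{*}\); the rectangular \(\mathbf{\Sigma}\) has to be assembled explicitly.

    import numpy as np

    M = np.array([[3.0, 1.0],
                  [1.0, 3.0],
                  [0.0, 2.0]])
    U, s, Vh = np.linalg.svd(M)            # U: 3 x 3, s: (2,), Vh: 2 x 2
    Sigma = np.zeros(M.shape)
    Sigma[:2, :2] = np.diag(s)             # embed s in a 3 x 2 Sigma
    print(np.allclose(M, U @ Sigma @ Vh))  # True
    print(s)                               # non-negative, sorted descending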

Properties of SVD

The singular value decomposition has several important properties:

  • Uniqueness: The singular values are uniquely determined, but the singular vectors are not: each pair of left and right singular vectors is fixed only up to a common sign (or unit-phase factor in the complex case), and repeated singular values leave even more freedom.
  • Orthogonality: The matrices \(\mathbf{U}\) and \(\mathbf{V}\) are unitary, meaning their columns are orthonormal vectors.
  • Rank: The number of non-zero singular values is equal to the rank of \(\mathbf{M}\); see the sketch after this list.
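
The rank property is easy to verify numerically, as in this sketch (assuming NumPy; the rank-1 matrix is a made-up example). The tolerance mirrors the default used by np.linalg.matrix_rank.

    import numpy as np

    M = np.array([[1.0, 2.0],
                  [2.0, 4.0],
                  [3.0, 6.0]])            # rank 1: column 2 = 2 * column 1
    s = np.linalg.svd(M, compute_uv=False)
    tol = s[0] * max(M.shape) * np.finfo(float).eps
    print(np.sum(s > tol))                # 1
    print(np.linalg.matrix_rank(M))       # 1, computed the same way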
