This isn't a substitute for your classwork. Be sure to consult your teacher's materials for a full understanding.

Introduction to Matrices

A matrix is a set of numbers arranged in rows and columns to form a rectangular array. It has a size determined by the number of rows and columns the matrix contains.

Example Sizes of Matrices

  • Vector: A matrix with only one row or one column. For example, a row vector \( \mathbf{v} \) has size \( 1 \times n \), and a column vector \( \mathbf{w} \) has size \( m \times 1 \).

    \[ \mathbf{v} = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \]

    \[ \mathbf{w} = \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_m \end{bmatrix} \]

  • Column Matrix: Also known as a column vector, a matrix of size \( m \times 1 \).

    \[ \mathbf{c} = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_m \end{bmatrix} \]

  • Square Matrix: A matrix with the same number of rows and columns, i.e., size \( n \times n \).

    \[ \mathbf{S} = \begin{bmatrix} s_{1,1} & s_{1,2} & \cdots & s_{1,n} \\ s_{2,1} & s_{2,2} & \cdots & s_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ s_{n,1} & s_{n,2} & \cdots & s_{n,n} \end{bmatrix} \]

For a matrix \( A = (a_{i,j}) \), an element in the \( i \)-th row and \( j \)-th column is denoted \( a_{i,j} \).
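As a quick illustration in code (a minimal sketch using NumPy, which the text itself does not assume), the size and the individual entries of a matrix can be inspected directly; note that NumPy indices start at 0, so the entry \( a_{1,2} \) corresponds to `A[0, 1]`.

```python
import numpy as np

# A 2 x 3 matrix: 2 rows, 3 columns.
A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A.shape)   # (2, 3) -> m = 2 rows, n = 3 columns
print(A[0, 1])   # a_{1,2} = 2 (NumPy indices are zero-based)
```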

Operations on Matrices

Addition

The sum of two matrices of the same size is obtained by adding their corresponding elements.

Definition: Given matrices \( A = (a_{i,j}) \) and \( B = (b_{i,j}) \), their sum \( C = A + B \) is given by:

\[ c_{i,j} = a_{i,j} + b_{i,j} \]

Example:

\[ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad B = \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}, \quad A + B = \begin{bmatrix} 1+5 & 2+6 \\ 3+7 & 4+8 \end{bmatrix} = \begin{bmatrix} 6 & 8 \\ 10 & 12 \end{bmatrix} \]
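The same sum, as a minimal NumPy sketch (NumPy is used here only for illustration):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

# Element-wise sum; both matrices must have the same shape.
print(A + B)   # [[ 6  8]
               #  [10 12]]
```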

Multiplication

The product of two matrices \( A \) and \( B \) is defined only if the number of columns in \( A \) equals the number of rows in \( B \).

Definition: Given matrices \( A \) of size \( m \times n \) and \( B \) of size \( n \times p \), their product \( C = AB \) is a matrix of size \( m \times p \) where:

\[ c_{i,j} = \sum_{k=1}^{n} a_{i,k} b_{k,j} \]

Example:

\[ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad B = \begin{bmatrix} 2 & 0 \\ 1 & 3 \end{bmatrix}, \quad AB = \begin{bmatrix} 1 \cdot 2 + 2 \cdot 1 & 1 \cdot 0 + 2 \cdot 3 \\ 3 \cdot 2 + 4 \cdot 1 & 3 \cdot 0 + 4 \cdot 3 \end{bmatrix} = \begin{bmatrix} 4 & 6 \\ 10 & 12 \end{bmatrix} \]
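The same product as a minimal NumPy sketch; the `@` operator performs matrix multiplication:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[2, 0], [1, 3]])

# Matrix product: the number of columns of A must equal the number of rows of B.
print(A @ B)   # [[ 4  6]
               #  [10 12]]
```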

Inversion

The inverse of a matrix \( A \) is denoted \( A^{-1} \), and it satisfies \( AA^{-1} = A^{-1}A = I \), where \( I \) is the identity matrix.

Definition: For a square matrix \( A \), if there exists a matrix \( B \) such that \( AB = BA = I \), then \( B \) is called the inverse of \( A \).

Example:

For a \( 2 \times 2 \) matrix \( A = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \) with \( \text{det}(A) = ad - bc \neq 0 \):

\[ A^{-1} = \frac{1}{\text{det}(A)} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \]

For instance, with \( A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \), \( \text{det}(A) = 1 \cdot 4 - 2 \cdot 3 = -2 \), so

\[ A^{-1} = -\frac{1}{2} \begin{bmatrix} 4 & -2 \\ -3 & 1 \end{bmatrix} = \begin{bmatrix} -2 & 1 \\ 3/2 & -1/2 \end{bmatrix} \]
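The same inverse, computed with NumPy as an illustrative sketch:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])

A_inv = np.linalg.inv(A)      # raises LinAlgError if det(A) == 0
print(A_inv)                  # [[-2.   1. ]
                              #  [ 1.5 -0.5]]
print(np.allclose(A @ A_inv, np.eye(2)))   # True: A A^{-1} = I
```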

Scalar Multiplication

The product of a matrix \( A \) and a scalar \( \alpha \) is a matrix where each element of \( A \) is multiplied by \( \alpha \).

Definition: Given a matrix \( A = (a_{i,j}) \) and a scalar \( \alpha \), the product \( B = \alpha A \) is given by:

\[ b_{i,j} = \alpha \cdot a_{i,j} \]

Example:

\[ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad \alpha = 3, \quad \alpha A = 3 \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} = \begin{bmatrix} 3 & 6 \\ 9 & 12 \end{bmatrix} \]
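A minimal NumPy sketch of scalar multiplication (for illustration only):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])

# Scalar multiplication scales every entry of the matrix.
print(3 * A)   # [[ 3  6]
               #  [ 9 12]]
```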

Transpose of a Matrix

The transpose of a matrix, denoted as \( A^T \), is obtained by swapping the rows and columns of the original matrix \( A \). In other words, the element at the \( i \)-th row and \( j \)-th column of \( A \) becomes the element at the \( j \)-th row and \( i \)-th column of \( A^T \). If \( A \) is an \( m \times n \) matrix, then \( A^T \) will be an \( n \times m \) matrix.
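As an illustrative NumPy sketch, the transpose swaps the shape from \( m \times n \) to \( n \times m \):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2 x 3

print(A.T)        # 3 x 2: rows and columns swapped
print(A.T.shape)  # (3, 2)
```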

Submatrix

A submatrix is a matrix formed by deleting some of the rows and/or columns from a larger matrix.

Definition: Given a matrix \( A \), a submatrix is obtained by removing specific rows and columns.

Example:

\[ A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}, \quad \text{Removing the 1st row and 2nd column:} \quad A' = \begin{bmatrix} 4 & 6 \\ 7 & 9 \end{bmatrix} \]
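The same submatrix can be extracted with NumPy; this is only an illustrative sketch, and `np.delete` uses zero-based indices:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Delete the 1st row (index 0) and the 2nd column (index 1).
sub = np.delete(np.delete(A, 0, axis=0), 1, axis=1)
print(sub)   # [[4 6]
             #  [7 9]]
```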

Types of Matrices

Certain types of matrices appear frequently in mathematical computations and applications. These are the most important:

Diagonal Matrix

A diagonal matrix is a square matrix in which all the elements outside the main diagonal are zero. The elements on the main diagonal can be either zero or non-zero.

Example:

\[ D = \begin{bmatrix} d_{11} & 0 & 0 \\ 0 & d_{22} & 0 \\ 0 & 0 & d_{33} \end{bmatrix} \]

Here, \( d_{ij} = 0 \) for all \( i \neq j \).
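A minimal NumPy sketch for building a diagonal matrix from its diagonal entries (illustration only):

```python
import numpy as np

# Build a diagonal matrix from its main-diagonal entries.
D = np.diag([1.0, 2.0, 3.0])
print(D)

# Applied to a square matrix, np.diag extracts the main diagonal instead.
print(np.diag(D))   # [1. 2. 3.]
```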

Triangular Matrix

A triangular matrix is a special kind of square matrix where all the elements above or below the main diagonal are zero. There are two types of triangular matrices:

  • Upper Triangular Matrix: All elements below the main diagonal are zero.
  • Lower Triangular Matrix: All elements above the main diagonal are zero.

Example:

Upper Triangular Matrix:

\[ U = \begin{bmatrix} u_{11} & u_{12} & u_{13} \\ 0 & u_{22} & u_{23} \\ 0 & 0 & u_{33} \end{bmatrix} \]

Lower Triangular Matrix:

\[ L = \begin{bmatrix} l_{11} & 0 & 0 \\ l_{21} & l_{22} & 0 \\ l_{31} & l_{32} & l_{33} \end{bmatrix} \]
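An illustrative NumPy sketch: `np.triu` and `np.tril` zero out the entries below or above the main diagonal, respectively:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

print(np.triu(A))   # upper triangular: entries below the diagonal set to 0
print(np.tril(A))   # lower triangular: entries above the diagonal set to 0
```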

Symmetric Matrix

A symmetric matrix is a square matrix that is equal to its transpose. In other words, \( A = A^T \), where \( A^T \) is the transpose of \( A \).

Example:

\[ A = \begin{bmatrix} a & b & c \\ b & d & e \\ c & e & f \end{bmatrix} \]

Here, \( a_{ij} = a_{ji} \) for all \( i \) and \( j \).
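A symmetric matrix can be checked numerically by comparing it with its transpose; the following NumPy sketch is only an illustration:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 5],
              [3, 5, 6]])

# A matrix is symmetric when it equals its transpose.
print(np.allclose(A, A.T))   # True
```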

Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors are fundamental concepts in linear algebra, particularly in the study of linear transformations. Given a square matrix \( A \), an eigenvector \( \mathbf{v} \) is a non-zero vector that, when multiplied by \( A \), results in a vector that is a scalar multiple of \( \mathbf{v} \). Mathematically, this is expressed as:

\[ A \mathbf{v} = \lambda \mathbf{v} \]

where \( \lambda \) is a scalar known as the eigenvalue corresponding to the eigenvector \( \mathbf{v} \). The eigenvalues are found by solving the characteristic equation:

\[ \text{det}(A - \lambda I) = 0 \]

where \( I \) is the identity matrix of the same size as \( A \).

Example

Consider the matrix:

\[ A = \begin{bmatrix} 4 & 1 \\ 2 & 3 \end{bmatrix} \]

To find the eigenvalues, solve the characteristic equation:

\[ \text{det}\left( \begin{bmatrix} 4 & 1 \\ 2 & 3 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \right) = 0 \implies \text{det}\left( \begin{bmatrix} 4-\lambda & 1 \\ 2 & 3-\lambda \end{bmatrix} \right) = 0 \]

Expanding the determinant gives \( (4-\lambda)(3-\lambda) - 2 \cdot 1 = \lambda^2 - 7\lambda + 10 = (\lambda - 5)(\lambda - 2) = 0 \), so the eigenvalues are \( \lambda_1 = 5 \) and \( \lambda_2 = 2 \).

The corresponding eigenvectors can be found by substituting each eigenvalue back into the equation \( (A - \lambda I) \mathbf{v} = 0 \) and solving for \( \mathbf{v} \).
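As a numerical cross-check (an illustrative NumPy sketch, not part of the worked example above), `np.linalg.eig` returns the eigenvalues together with the corresponding eigenvectors as columns:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # eigenvalues 5 and 2 (order may vary)
print(eigenvectors)   # one eigenvector per column, matching the eigenvalue order

# Verify A v = lambda v for the first eigenpair.
v = eigenvectors[:, 0]
print(np.allclose(A @ v, eigenvalues[0] * v))   # True
```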