Determinant of a square matrix

Definition

The determinant of a square, n \times n matrix A, denoted \det A, is defined by an algebraic formula involving the coefficients of A. The following formula, known as Laplace’s expansion formula, allows us to compute the determinant recursively:

\det A = \sum\limits_{i=1}^n (-1)^{i+1} A_{i,1} \det(C_i),

where A_{i,1} is the (i,1) entry of A, and C_i is the (n-1) \times (n-1) matrix obtained from A by removing the i-th row and the first column. (The first column does not play a special role here: we obtain the same value by expanding along any other column, with the signs (-1)^{i+j} when expanding along the j-th column.)
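For concreteness, here is a short Python (NumPy) sketch of this recursion, expanding along the first column; the function name det_laplace is ours and the snippet is for illustration only:

import numpy as np

def det_laplace(A):
    # Determinant via Laplace (cofactor) expansion along the first column.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for i in range(n):
        # C_i: the submatrix of A with row i and the first column removed
        C_i = np.delete(np.delete(A, i, axis=0), 0, axis=1)
        # (-1)**i with 0-based indexing matches (-1)^{i+1} with 1-based indexing
        total += (-1) ** i * A[i, 0] * det_laplace(C_i)
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(det_laplace(A), np.linalg.det(A))   # both close to 8.0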

The determinant is the unique function of the entries of A such that

  1. \det I = 1.
  2. A \rightarrow \det A is a linear function of any column (when the others are fixed).
  3. \det A changes sign when two columns are permuted.

There are other expressions of the determinant, including the Leibniz formula:

\det A = \sum\limits_{\sigma \in S_n} {\bf sign}(\sigma) \prod\limits_{i=1}^n A_{\sigma(i), i},

where S_n denotes the set of permutations \sigma of the integers 1, 2, \cdots, n. Here, {\bf sign}(\sigma) denotes the sign of the permutation \sigma, which equals +1 if the number of pairwise exchanges required to transform \sigma(1), \sigma(2), \cdots, \sigma(n) into 1, 2, \cdots, n is even, and -1 if it is odd.
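The Leibniz formula can be checked numerically as well; the following Python sketch (function names are ours) enumerates all n! permutations, so it is only practical for very small n:

import numpy as np
from itertools import permutations

def perm_sign(p):
    # Sign of a permutation p of 0..n-1: +1 if even, -1 if odd.
    p = list(p)
    sign = 1
    for i in range(len(p)):
        while p[i] != i:              # place element i by swapping
            j = p[i]
            p[i], p[j] = p[j], p[i]
            sign = -sign
    return sign

def det_leibniz(A):
    # Determinant via the Leibniz formula (sum over all permutations).
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    return sum(perm_sign(sigma) * np.prod([A[sigma[i], i] for i in range(n)])
               for sigma in permutations(range(n)))

A = np.array([[2.0, 1.0], [1.0, 3.0]])
print(det_leibniz(A), np.linalg.det(A))   # both close to 5.0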

Important result

An important result is that a square matrix is invertible if and only if its determinant is not zero. We use this key result when introducing eigenvalues of symmetric matrices.
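As a quick numerical illustration (using NumPy, not part of the text): a matrix with linearly dependent rows or columns has zero determinant and no inverse, while a nonzero determinant guarantees invertibility.

import numpy as np

A_sing = np.array([[1.0, 2.0],
                   [2.0, 4.0]])           # second row is twice the first
print(np.linalg.det(A_sing))              # ~0.0: not invertible

A_ok = np.array([[1.0, 2.0],
                 [3.0, 4.0]])
print(np.linalg.det(A_ok))                # -2.0: invertible
print(A_ok @ np.linalg.inv(A_ok))         # close to the identity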

Geometry

The determinant of a 3 \times 3 matrix A with columns r_1, r_2, r_3 is, up to a sign, the volume of the parallelepiped defined by the vectors r_1, r_2, r_3 (see the figure; source: Wikipedia). Hence the determinant is a measure of scale that quantifies how the linear map associated with A, x \rightarrow Ax, changes volumes.

In general, the absolute value of the determinant of an n \times n matrix is the volume of the parallelepiped

\{Ax: 0\leq x_i \leq 1, \quad i = 1, \dots, n\}.

This is consistent with the fact that when A is not invertible, its columns define a parallelepiped of zero volume.
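This volume interpretation can be checked numerically; the sketch below (which assumes SciPy is available, in addition to NumPy) compares the absolute determinant of a 3 \times 3 matrix with the volume of the image of the unit cube:

import numpy as np
from itertools import product
from scipy.spatial import ConvexHull

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])

corners = np.array(list(product([0.0, 1.0], repeat=3)))   # 8 corners of [0,1]^3
vertices = corners @ A.T                                   # images Ax of the corners

print(abs(np.linalg.det(A)))          # 6.0
print(ConvexHull(vertices).volume)    # ~6.0, the volume of the parallelepiped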

Determinant and inverse

The determinant can be used to compute the inverse of a square, full-rank (that is, invertible) matrix A: the inverse B=A^{-1} has elements given by

B_{ij} = \frac{(-1)^{i+j}}{\det A} \det (\tilde{A}_{ji}),

where \tilde{A}_{ji} is the matrix obtained from A by removing its j-th row and i-th column. For example, the determinant of a 2 \times 2 matrix

A = \left(\begin{array}{ll} a & b \\ c & d \end{array}\right)

is given by

\det A = ad-bc.

It is indeed, up to a sign, the area of the parallelogram defined by the columns of A, namely (a,c) and (b,d). The inverse is given by

A^{-1} = \frac{1}{ad-bc} \left(\begin{array}{ll} d & -b \\ -c & a \end{array}\right)
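Here is a short Python sketch of the general cofactor formula above (the function name inverse_via_cofactors is ours), checked against NumPy's built-in inverse:

import numpy as np

def inverse_via_cofactors(A):
    # B[i, j] = (-1)**(i+j) * det(A with row j and column i removed) / det(A)
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    detA = np.linalg.det(A)
    B = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, j, axis=0), i, axis=1)
            B[i, j] = (-1) ** (i + j) * np.linalg.det(minor) / detA
    return B

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(inverse_via_cofactors(A))
print(np.linalg.inv(A))            # the two results should match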

Some properties

Determinant of triangular matrices

If a square matrix is triangular, then its determinant is simply the product of its diagonal coefficients. This follows directly from Laplace’s expansion formula above.
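A quick numerical check of this property, for illustration only:

import numpy as np

T = np.array([[2.0, 0.0, 0.0],
              [5.0, 3.0, 0.0],
              [1.0, -1.0, 4.0]])           # lower triangular
print(np.linalg.det(T))                    # ~24.0
print(np.prod(np.diag(T)))                 # 24.0, the product of the diagonal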

Determinant of transpose

The determinant of a square matrix and that of its transpose are equal.
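Numerically, for a random square matrix (illustration only):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))   # True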

Determinant of a product of matrices

For two square matrices A, B of the same size (invertible or not), we have

\det AB = \det A \cdot \det B.

In particular, when A is invertible:

\det A^{-1} = \frac{1}{\det A}.
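Both identities are easy to verify numerically (illustration only):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # True
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A)))    # True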

This also implies that for an orthogonal matrix U, that is, an n \times n matrix with U^TU = I, we have

1 = \det (U^TU) = (\det U^T) \det U = (\det U)^2,

hence \det U = \pm 1.
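As an illustration, an orthogonal matrix can be generated from a QR factorization and its determinant checked to be \pm 1:

import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # the Q factor is orthogonal

print(np.allclose(U.T @ U, np.eye(4)))             # True: U^T U = I
print(np.linalg.det(U))                            # +1.0 or -1.0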

Determinant of block matrices

As a generalization of the above result, we have, for blocks A, C, D of compatible sizes (with A and D square):

\det \left(\begin{array}{c|c} A & 0 \\ \hline C & D \end{array}\right) = \det D \cdot \det A.

A more general formula, valid when D is invertible, is

\det \left(\begin{array}{c|c} A & B \\ \hline C & D \end{array}\right) = \det D \cdot \det (A-BD^{-1}C).
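Both block formulas can be verified numerically; the sketch below builds the block matrices with NumPy (illustration only, with randomly chosen blocks):

import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 3))
C = rng.standard_normal((3, 2))
D = rng.standard_normal((3, 3))

# Block lower-triangular case: det = det(D) * det(A)
M0 = np.block([[A, np.zeros((2, 3))], [C, D]])
print(np.isclose(np.linalg.det(M0), np.linalg.det(D) * np.linalg.det(A)))      # True

# General case (D invertible): det = det(D) * det(A - B D^{-1} C)
M = np.block([[A, B], [C, D]])
schur = A - B @ np.linalg.inv(D) @ C
print(np.isclose(np.linalg.det(M), np.linalg.det(D) * np.linalg.det(schur)))   # True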

License

Hyper-Textbook: Optimization Models and Applications Copyright © by L. El Ghaoui. All Rights Reserved.
