17
17.1. Matrix products
Let $f: \mathbb{R}^m \rightarrow \mathbb{R}^k$ and $g: \mathbb{R}^n \rightarrow \mathbb{R}^m$ be two differentiable maps, and let $h: \mathbb{R}^n \rightarrow \mathbb{R}^k$ be the composite map $h=f \circ g$, with values $h(x)=f(g(x))$ for $x \in \mathbb{R}^n$. Show that the derivatives of $h$ can be expressed via a matrix-matrix product, as $J_h(x)=J_f(g(x)) \cdot J_g(x)$, where the Jacobian matrix of $h$ at $x$ is defined as the matrix $J_h(x)$ with $(i, j)$ element $\partial h_i(x) / \partial x_j$.
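As a quick numerical sanity check (not a proof of the chain rule), one can compare a finite-difference Jacobian of $h$ with the product $J_f(g(x)) J_g(x)$ on concrete maps. The maps $f$ and $g$ below, the test point, and the step size are arbitrary illustrative choices, not part of the exercise.

```python
import numpy as np

# Illustrative (hypothetical) maps: g : R^2 -> R^3 and f : R^3 -> R^2.
def g(x):
    return np.array([x[0]**2, x[0] * x[1], np.sin(x[1])])

def f(y):
    return np.array([y[0] + y[1] * y[2], np.exp(y[0]) - y[2]])

def jacobian_fd(func, x, eps=1e-6):
    """Forward finite-difference approximation of the Jacobian of func at x."""
    fx = func(x)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        J[:, j] = (func(x + step) - fx) / eps
    return J

x = np.array([0.7, -0.3])
h = lambda z: f(g(z))

J_h = jacobian_fd(h, x)                              # J_h(x)
J_prod = jacobian_fd(f, g(x)) @ jacobian_fd(g, x)    # J_f(g(x)) J_g(x)
print(np.allclose(J_h, J_prod, atol=1e-4))           # True, up to discretization error
```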
17.2. Special matrices
A matrix $P \in \mathbb{R}^{n \times n}$ is a permutation matrix if it is obtained by permuting the columns of the $n \times n$ identity matrix.
a. For an $n \times n$ matrix $A$, consider the products $P A$ and $A P$. Describe in simple terms how these matrices relate to the original matrix $A$.
b. Show that $P$ is orthogonal.
c. Show that $P^{-1}=P^\top$.
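A minimal numerical illustration of parts a-c, using an arbitrary random permutation and a random matrix (not part of the exercise statement):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
perm = rng.permutation(n)      # a random permutation of {0, ..., n-1}
P = np.eye(n)[:, perm]         # permutation matrix: columns of the identity, reordered
A = rng.standard_normal((n, n))
inv = np.argsort(perm)         # the inverse permutation

# a. A P reorders the columns of A, while P A reorders its rows.
print(np.allclose(A @ P, A[:, perm]))
print(np.allclose(P @ A, A[inv, :]))

# b, c. The columns of P are orthonormal, so P is orthogonal and P^{-1} = P^T.
print(np.allclose(P.T @ P, np.eye(n)))
print(np.allclose(np.linalg.inv(P), P.T))
```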
17.3. Linear maps, dynamical systems
1. Let $f: \mathbb{R}^n \rightarrow \mathbb{R}^m$ be a linear map. Show how to compute the (unique) matrix $A$ such that $f(x)=A x$ for every $x \in \mathbb{R}^n$, in terms of the values of $f$ at appropriate vectors, which you will determine.
2. Consider a discrete-time linear dynamical system (for background, see here) with state $x \in \mathbb{R}^n$, input vector $u \in \mathbb{R}^p$, and output vector $y \in \mathbb{R}^k$, that is described by the linear equations
\[ x(t+1) = A x(t) + B u(t), \quad y(t) = C x(t) \]
with $A \in \mathbb{R}^{n \times n}, \; B \in \mathbb{R}^{n \times p}$, and $C \in \mathbb{R}^{k \times n}$ given matrices.
a. Assuming that the system has initial condition $x(0)=0$, express the output vector at time $T$ as a linear function of $u(0), \ldots, u(T-1)$; that is, determine a matrix $H$ such that $y(T)=H \bar{u}(T)$, where $\bar{u}(T):=(u(0), \ldots, u(T-1))$ is a vector containing all the inputs up to and including time $T-1$.
b. What is the interpretation of the range of $H$?
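The sketch below is not a solution write-up, only a numerical check of both parts under illustrative, arbitrarily chosen dimensions: for part 1 it recovers the matrix of a linear map from its values at the standard basis vectors, and for part 2a it assembles a candidate $H$ by unrolling the recursion and compares $H\bar{u}(T)$ with a direct simulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Part 1: the j-th column of the matrix of a linear map f is f(e_j).
M = rng.standard_normal((3, 5))                     # hidden matrix defining f
f = lambda x: M @ x
A_rec = np.column_stack([f(e) for e in np.eye(5)])  # evaluate f at e_1, ..., e_n
print(np.allclose(A_rec, M))

# Part 2a: with x(0) = 0, unrolling x(t+1) = A x(t) + B u(t) gives
# x(T) = sum_{t=0}^{T-1} A^{T-1-t} B u(t), hence H = [C A^{T-1} B, ..., C A B, C B].
n, p, k, T = 4, 2, 3, 6
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, p))
C = rng.standard_normal((k, n))
H = np.hstack([C @ np.linalg.matrix_power(A, T - 1 - t) @ B for t in range(T)])

# Check against a direct simulation of the recursion with random inputs.
U = rng.standard_normal((T, p))                     # rows are u(0), ..., u(T-1)
x = np.zeros(n)
for t in range(T):
    x = A @ x + B @ U[t]
y_T = C @ x
print(np.allclose(H @ U.reshape(-1), y_T))          # y(T) = H * (u(0), ..., u(T-1))
```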
17.4. Matrix inverses, norms
1. Show that a square matrix $A$ is invertible if and only if its determinant is non-zero. You can use the fact that the determinant of a product is the product of the determinants, together with the QR decomposition of $A$.
2. Let $A \in \mathbb{R}^{m \times n}, B \in \mathbb{R}^{n \times p}$, and let $C:=A B \in \mathbb{R}^{m \times p}$. Show that $\|C\| \leq\|A\| \cdot\|B\|$, where $\|\cdot\|$ denotes the largest singular value norm of its matrix argument.
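As before, a short numerical sanity check with arbitrary random matrices rather than a proof: the first block mirrors the QR-based hint for the determinant, and the second checks submultiplicativity of the largest-singular-value norm.

```python
import numpy as np

rng = np.random.default_rng(2)

# Part 1 hint: if A = Q R with Q orthogonal and R upper triangular, then
# |det(A)| = |det(R)| = |prod(diag(R))|, so det(A) != 0 iff all R_ii != 0.
A = rng.standard_normal((5, 5))
Q, R = np.linalg.qr(A)
print(np.isclose(abs(np.linalg.det(A)), abs(np.prod(np.diag(R)))))

# Part 2: the largest-singular-value (spectral) norm is submultiplicative.
spec = lambda M: np.linalg.norm(M, 2)     # largest singular value
A = rng.standard_normal((4, 6))
B = rng.standard_normal((6, 3))
print(spec(A @ B) <= spec(A) * spec(B) + 1e-12)
```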