
MATRIX RANK

The dimension of the column space of a matrix is called the column rank. The dimension of the row space of a matrix is called the row rank. One of the fundamental properties of linear maps is that the column rank and row rank are always equal, and so they are often referred to simply as the rank. We first give several images of the column and row spaces of a matrix and show that as one drops rank so does the other.
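As a quick numerical companion to those pictures, the following sketch (not from the original text; the sizes and random seed are arbitrary choices) uses NumPy's matrix_rank on \(A\) and on \(A^T\) to estimate the column rank and row rank, and shows the two dropping together as the rank of the test matrix is lowered.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 7
for k in (4, 3, 2, 1):
    # An m x n matrix of rank (at most) k, built as a product of a random
    # m x k factor and a random k x n factor.
    A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
    col_rank = np.linalg.matrix_rank(A)    # dimension of the column space
    row_rank = np.linalg.matrix_rank(A.T)  # dimension of the row space
    print(k, col_rank, row_rank)           # the two ranks drop together
```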

The proof that the column rank and row rank are equal is dense and subtle, but short. We give it here in compact form.

Column Rank = Row Rank

Proof:

For a matrix \(A \in \mathbb{R}^{m \times n}\), let \(c\) denote the column rank and \(r\) the row rank. We proceed by showing that necessarily \(r \leq c\) and \(r \geq c\), and thus \(r = c\).

Row Rank \(\leq\) Column Rank:

Since the column rank is \(c\), write a basis for the column space of \(A\) as the columns of a matrix \(C \in \mathbb{R}^{m \times c}\). \(A\) can then be written as $$ A = CV = \begin{bmatrix} | & & | \\ C_1 & \cdots & C_c \\ | & & | \end{bmatrix} \begin{bmatrix} | & & | \\ V_1 & \cdots & V_n \\ | & & | \end{bmatrix} $$ where the columns of \(V \in \mathbb{R}^{c \times n}\) are the coordinates of each column of \(A\) with respect to \(C\). Whenever one defines a matrix in terms of its columns one also defines its rows; the key to the proof is simply shifting perspectives on \(C\) and \(V\), writing them in terms of their rows instead of their columns. The above equation then becomes $$ A = CV = \begin{bmatrix} - & \bar{C}_1^T & - \\ & \vdots & \\ - & \bar{C}_m^T & - \end{bmatrix} \begin{bmatrix} - & \bar{V}_1^T & - \\ & \vdots & \\ - & \bar{V}_c^T & - \end{bmatrix} $$ From this row-row geometry perspective we can now think of each row of the product as a linear combination of the rows of \(V\), with coefficients given by the corresponding row of \(C\). Each row of \(A\) is therefore in the row space of \(V\), so the row space of \(A\) is contained in the row space of \(V\), whose dimension is at most \(c\). Hence the row rank of \(A\) satisfies \(r \leq c\).
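A minimal numerical sketch of this direction (the random choices of \(C\) and \(V\) below are assumptions for illustration): build \(A = CV\) and check that stacking the rows of \(A\) under the rows of \(V\) does not increase the rank, i.e. every row of \(A\) already lies in the row space of \(V\).

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, c = 6, 8, 3
C = rng.standard_normal((m, c))  # columns: a basis for the column space of A
V = rng.standard_normal((c, n))  # columns: coordinates of A's columns w.r.t. C
A = C @ V                        # column rank of A is c (with probability 1)

# Every row of A lies in the row space of V, so adding the rows of A to
# the rows of V does not increase the rank, and row_rank(A) <= c.
print(np.linalg.matrix_rank(A.T) <= c)                # True
print(np.linalg.matrix_rank(np.vstack([V, A]))
      == np.linalg.matrix_rank(V))                    # True
```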

Row Rank \(\geq\) Column Rank:

The second direction follows immediately by applying the above reasoning to \(A^T\), but we reproduce it here for symmetry and practice.

Given that the row rank is \(r\), write \(A\) as $$ A = WR = \begin{bmatrix} - & \bar{W}_1^T & - \\ & \vdots & \\ - & \bar{W}_m^T & - \end{bmatrix} \begin{bmatrix} - & \bar{R}_1^T & - \\ & \vdots & \\ - & \bar{R}_r^T & - \end{bmatrix} $$ where the rows of \(R \in \mathbb{R}^{r \times n}\) are a basis for the row space and the rows of \(W \in \mathbb{R}^{m \times r}\) are the coordinates of the rows of \(A\) with respect to the rows of \(R\). Switching to viewing \(W\) and \(R\) in terms of their columns $$ A = WR = \begin{bmatrix} | & & | \\ W_1 & \cdots & W_r \\ | & & | \end{bmatrix} \begin{bmatrix} | & & | \\ R_1 & \cdots & R_n \\ | & & | \end{bmatrix} $$ we see that each column of \(A\) is a linear combination of the columns of \(W\), and thus the dimension of the column space is at most \(r\), i.e. \(r \geq c\).
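The mirrored numerical sketch for this direction (again with randomly chosen factors as an illustrative assumption): build \(A = WR\) and check that adjoining the columns of \(A\) to the columns of \(W\) does not increase the rank, so every column of \(A\) already lies in the column space of \(W\).

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, r = 6, 8, 3
W = rng.standard_normal((m, r))  # rows: coordinates of A's rows w.r.t. R
R = rng.standard_normal((r, n))  # rows: a basis for the row space of A
A = W @ R                        # row rank of A is r (with probability 1)

# Every column of A lies in the column space of W, so adjoining the
# columns of A to the columns of W does not increase the rank,
# and col_rank(A) <= r.
print(np.linalg.matrix_rank(A) <= r)                  # True
print(np.linalg.matrix_rank(np.hstack([W, A]))
      == np.linalg.matrix_rank(W))                    # True
```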

Rank-Nullity Theorem

The second major rank result is the rank-nullity theorem: the rank of a matrix plus the dimension of its nullspace (its nullity) equals the dimension of the domain. For \(A \in \mathbb{R}^{m \times n}\), $$ \text{rank}(A) + \text{null}(A) = n $$
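A quick numerical check of the statement (a sketch; the test matrix is an arbitrary choice and SciPy's null_space, which returns an orthonormal basis of the nullspace, is used to count the nullity):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(3)
m, n, k = 5, 8, 3
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))  # rank k

rank = np.linalg.matrix_rank(A)
nullity = null_space(A).shape[1]  # number of basis vectors for the nullspace
print(rank, nullity, rank + nullity == n)  # 3 5 True
```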

Given the row geometry of a matrix and its nullspace, this result is actually quite intuitive. The rank of a matrix is the maximal number of linearly independent rows; the nullspace is the subspace orthogonal to these rows. Together their dimensions naturally add up to the dimension of the overall space, \(n\). We first give several pictures that show this result for different matrices.
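A small sketch of this orthogonality picture (the random test matrix and the use of SciPy's orth and null_space are illustrative assumptions): an orthonormal basis for the row space of \(A\) together with an orthonormal basis for its nullspace is mutually orthogonal and spans all of \(\mathbb{R}^n\).

```python
import numpy as np
from scipy.linalg import orth, null_space

rng = np.random.default_rng(4)
m, n, k = 5, 8, 3
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))  # rank k

R = orth(A.T)      # n x k: orthonormal basis for the row space of A
N = null_space(A)  # n x (n-k): orthonormal basis for the nullspace of A

print(np.allclose(R.T @ N, 0))                        # row space orthogonal to nullspace
print(np.linalg.matrix_rank(np.hstack([R, N])) == n)  # together they span R^n
```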

The proof is most easily done by considering the column geometry of \(A\) and follows immediately from the nullspace basis construction discussed previously. As before, any matrix \(A \in \mathbb{R}^{m \times n}\) of rank \(k\) has exactly \(k\) linearly independent columns. Assuming (without loss of generality) that these are the first \(k\) columns, \(A\) can be decomposed as $$ A = \Big[ A' \ A'' \Big] = \Big[ A' \ A'B \Big] =A'\Big[ I \ B \Big] $$ where \(A' \in \mathbb{R}^{m \times k}\) has linearly independent columns and \(A'' \in \mathbb{R}^{m \times (n-k)}\) has columns that are linear combinations of those of \(A'\), so that \(A'' = A'B\) where the columns of \(B\) give the coefficients of the columns of \(A''\) with respect to \(A'\). The matrix $$ N = \begin{bmatrix} B \\ -I \end{bmatrix} \in \mathbb{R}^{n \times (n-k)} $$ then forms a basis for the nullspace of \(A\) (as previously shown) and thus the nullity of \(A\) is \(n-k\). We then immediately have that $$ n = k + (n-k) = \text{rank}(A) + \text{null}(A) $$
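The construction can be checked numerically; the sketch below (with arbitrarily chosen sizes and random factors) builds \(A = A'\,[\,I \ \ B\,]\) directly, forms \(N\), and verifies that \(AN = 0\) and that \(N\) has \(n-k\) independent columns.

```python
import numpy as np

rng = np.random.default_rng(5)
m, n, k = 6, 9, 4
A_prime = rng.standard_normal((m, k))    # k linearly independent columns
B = rng.standard_normal((k, n - k))      # coordinates of A'' w.r.t. A'
A = np.hstack([A_prime, A_prime @ B])    # A = A' [ I  B ]

N = np.vstack([B, -np.eye(n - k)])       # candidate nullspace basis
print(np.allclose(A @ N, 0))                    # True: columns of N are in null(A)
print(np.linalg.matrix_rank(N) == n - k)        # True: the n-k columns are independent
print(np.linalg.matrix_rank(A) + (n - k) == n)  # rank + nullity = n
```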