.dcmath - UNDER CONSTRUCTION

Row-Reduction: General Case

In general, for a matrix \( A \in \mathbb{R}^{m \times n} \) with rank \(k \) we can find an invertible \(E \in \mathbb{R}^{m \times m} \) that is a composition of elementary matrices \( E = E_\ell \cdots E_1 \) such that $$ EA = \begin{bmatrix} I & B \\ 0 & 0 \end{bmatrix} $$ with \( I \in \mathbb{R}^{k \times k} \) and \( B \in \mathbb{R}^{k \times (n-k)} \) (assuming, after reordering the columns of \(A\) if necessary, that the pivots land in the first \(k\) columns). It will be helpful to decompose \( E \) and also \( E^{-1} \) as $$ E = \begin{bmatrix} - & E' & - \\ - & E'' & - \end{bmatrix}, \qquad E^{-1} = \begin{bmatrix} | & | \\ F' & F'' \\ | & | \end{bmatrix} $$ where \( E' \in \mathbb{R}^{k \times m} \), \(E'' \in \mathbb{R}^{(m-k) \times m} \), \( F' \in \mathbb{R}^{m \times k} \), and \(F'' \in \mathbb{R}^{m \times (m-k)} \). Note that since the columns (or rows) of \(E'\), \(E''\), \(F'\), \(F''\) are all columns (or rows) of invertible matrices, they must be linearly independent. We note also that $$ I = EE^{-1} = \begin{bmatrix} - & E' & - \\ - & E'' & - \end{bmatrix} \begin{bmatrix} | & | \\ F' & F'' \\ | & | \end{bmatrix} = \begin{bmatrix} E'F' & E'F'' \\ E''F' & E'' F'' \end{bmatrix} = \begin{bmatrix} I & 0 \\ 0 & I \end{bmatrix} $$ In particular, the off-diagonal blocks give \( E'F'' = 0 \) and \( E''F' = 0 \): the rows of \(E''\) are orthogonal to the columns of \(F'\), and likewise for \(E'\) and \(F''\). We also have that \( I = E^{-1}E = F'E'+F''E'' \). Using the above decomposition the row-reduction operations become $$ EA = \begin{bmatrix} - & E' & - \\ - & E'' & - \end{bmatrix} \begin{bmatrix} | & | \\ A' & A'' \\ | & | \end{bmatrix} = \begin{bmatrix} E'A' & E'A'' \\ E''A' & E''A'' \end{bmatrix} = \begin{bmatrix} I & B \\ 0 & 0 \end{bmatrix} $$ It can also be quite useful to write the above equation as a decomposition of \(A\) $$ A = E^{-1} \begin{bmatrix} I & B \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} | & | \\ F' & F'' \\ | & | \end{bmatrix} \begin{bmatrix} I & B \\ 0 & 0 \end{bmatrix} = F' \begin{bmatrix} I & B \end{bmatrix} $$ It is clear from this that the columns of \( F' \) must span the range of \(A\). Since the columns are linearly independent, they also form a basis.
We also have that the rows of \(E''\) form a basis for the nullspace of \(A^T\) by a similar linear independence argument, rank-nullity (applied to the co-domain), and the fact that \(E''F' = 0 \) gives \( E''A = E''F'\begin{bmatrix}I & B \end{bmatrix} = 0 \). Note that the rows of \(\begin{bmatrix}I & B \end{bmatrix}\) are also linearly independent (since the first subblock is the identity) and thus the columns of \(\begin{bmatrix}I & B \end{bmatrix}^T\) form a basis for the range of \(A^T\). Finally, by arguments given in the discussion on nullspaces, the columns of \(\begin{bmatrix} B \\ -I \end{bmatrix}\) form a basis for the nullspace of \(A\). We can summarize these insights in a list of bases for the four fundamental subspaces related to \(A\). $$ \substack{Range \\ of \ A}: \ \ F', \quad \substack{Nullspace \\ of \ A^T}: \ \ {E''}^T \quad \substack{Range \\ of \ A^T}: \begin{bmatrix} I \\ B^T \end{bmatrix}, \quad \substack{Nullspace \\ of \ A }: \begin{bmatrix} B \\ -I \end{bmatrix} $$
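These bases can be sanity-checked numerically. The following Python/NumPy sketch uses a hypothetical rank-2 matrix chosen so that the pivots land in the first \(k\) columns and a single elementary operation completes the reduction; the matrix and the variable names (`E1`, `E2`, `F1`, `F2` for \(E'\), \(E''\), \(F'\), \(F''\)) are illustrative assumptions, not part of the discussion above.

```python
import numpy as np

# Hypothetical rank-2 example: third row = first row + second row.
A = np.array([[1, 0, 2, 3],
              [0, 1, 4, 5],
              [1, 1, 6, 8]])
m, n, k = 3, 4, 2

# One row operation reduces A: subtract rows 1 and 2 from row 3.
E = np.array([[ 1,  0, 0],
              [ 0,  1, 0],
              [-1, -1, 1]])
EA = E @ A
assert np.array_equal(EA[:k, :k], np.eye(k))            # EA = [[I, B], [0, 0]]
assert np.array_equal(EA[k:, :], np.zeros((m - k, n)))
B = EA[:k, k:]

F = np.linalg.inv(E)
F1, F2 = F[:, :k], F[:, k:]     # F', F'' (column blocks of E^{-1})
E1, E2 = E[:k, :], E[k:, :]     # E', E'' (row blocks of E)

assert np.allclose(A, F1 @ EA[:k, :])    # A = F'[I B], so F' spans range(A)
assert np.allclose(E2 @ A, 0)            # rows of E'' lie in null(A^T)
null_A = np.vstack([B, -np.eye(n - k)])
assert np.allclose(A @ null_A, 0)        # columns of [B; -I] lie in null(A)
```

For a general matrix the same checks go through once the transform \(E\) is accumulated by row-reducing \([A \mid I]\).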

Column-Reduction: General Case

We can make a similar argument to the above for column reduction in the general case. For a matrix \( A \in \mathbb{R}^{m \times n} \) with rank \(k \) we can find an invertible \(E \in \mathbb{R}^{n \times n} \) that is a composition of elementary matrices \( E = E_1 \cdots E_\ell \) such that $$ AE = \begin{bmatrix} I & 0 \\ C & 0 \end{bmatrix} $$ with \( I \in \mathbb{R}^{k \times k} \) and \( C \in \mathbb{R}^{(m-k) \times k} \) (assuming, after reordering the rows of \(A\) if necessary, that the pivots land in the first \(k\) rows). Note that this composition of elementary matrices \(E\) will in general be different from the one in the row-reduction case. Again it will be helpful to decompose \( E \) and also \( E^{-1} \) as $$ E = \begin{bmatrix} | & | \\ E' & E'' \\ | & | \end{bmatrix}, \qquad E^{-1} = \begin{bmatrix} - & F' & - \\ - & F'' & - \\ \end{bmatrix} $$ where \( E' \in \mathbb{R}^{n \times k} \), \(E'' \in \mathbb{R}^{n \times (n-k)} \), \( F' \in \mathbb{R}^{k \times n} \), and \(F'' \in \mathbb{R}^{(n-k) \times n} \). Note that since the columns (or rows) of \(E'\), \(E''\), \(F'\), \(F''\) are all columns (or rows) of invertible matrices, they must be linearly independent. We note also that $$ I = E^{-1}E = \begin{bmatrix} - & F' & - \\ - & F'' & - \end{bmatrix} \begin{bmatrix} | & | \\ E' & E'' \\ | & | \end{bmatrix} = \begin{bmatrix} F'E' & F'E'' \\ F''E' & F''E'' \end{bmatrix} = \begin{bmatrix} I & 0 \\ 0 & I \end{bmatrix} $$ In particular, the off-diagonal blocks give \( F'E'' = 0 \) and \( F''E' = 0 \): the rows of \(F'\) are orthogonal to the columns of \(E''\), and likewise for \(F''\) and \(E'\). We also have that \( I = EE^{-1} = E'F'+E''F'' \).
Using the above decomposition the column-reduction operations become $$ AE = \begin{bmatrix} - & A' & - \\ - & A'' & - \end{bmatrix} \begin{bmatrix} | & | \\ E' & E'' \\ | & | \end{bmatrix} = \begin{bmatrix} A'E' & A'E'' \\ A''E' & A''E'' \end{bmatrix} = \begin{bmatrix} I & 0 \\ C & 0 \end{bmatrix} $$ It can also be quite useful to write the above equation as a decomposition of \(A\) $$ A = \begin{bmatrix} I & 0 \\ C & 0 \end{bmatrix}E^{-1}= \begin{bmatrix} I & 0 \\ C & 0 \end{bmatrix} \begin{bmatrix} - & F' & - \\ - & F'' & - \end{bmatrix} = \begin{bmatrix} I \\ C \end{bmatrix}F' $$ It is clear from this that the rows of \( F' \) must span the range of \(A^T\). Since the rows are linearly independent, they also form a basis. We also have that the columns of \(E''\) form a basis for the nullspace of \(A\) by a similar linear independence argument, rank-nullity (applied to the domain), and the fact that \(F'E'' = 0 \) gives \( AE'' = \begin{bmatrix} I \\ C \end{bmatrix}F'E'' = 0 \). Note that the columns of \(\begin{bmatrix}I \\ C \end{bmatrix}\) are also linearly independent (since the first subblock is the identity) and thus form a basis for the range of \(A\). Finally, by arguments given in the discussion on nullspaces, the columns of \(\begin{bmatrix}C^T \\ -I \end{bmatrix} \) form a basis for the nullspace of \(A^T\). We can summarize these insights in a list of bases for the four fundamental subspaces related to \(A\). $$ \substack{Range \\ of \ A}: \begin{bmatrix} I \\ C \end{bmatrix}, \quad \substack{Nullspace \\ of \ A^T }: \begin{bmatrix} C^T \\ -I \end{bmatrix} \quad \substack{Range \\ of \ A^T}: \ \ {F'}^T, \quad \substack{Nullspace \\ of \ A}: \ \ E'' $$
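The column-reduction identities can be checked the same way. The Python/NumPy sketch below uses a hypothetical rank-2 matrix whose third column is the sum of the first two, so a single elementary column operation completes the reduction; the matrix and the variable names (`E1`, `E2`, `F1`, `F2` for \(E'\), \(E''\), \(F'\), \(F''\)) are illustrative assumptions.

```python
import numpy as np

# Hypothetical rank-2 example: third column = first column + second column.
A = np.array([[1, 0, 1],
              [0, 1, 1],
              [2, 4, 6],
              [3, 5, 8]])
m, n, k = 4, 3, 2

# One column operation reduces A: subtract columns 1 and 2 from column 3.
E = np.array([[1, 0, -1],
              [0, 1, -1],
              [0, 0,  1]])
AE = A @ E
assert np.array_equal(AE[:k, :k], np.eye(k))            # AE = [[I, 0], [C, 0]]
assert np.array_equal(AE[:, k:], np.zeros((m, n - k)))
C = AE[k:, :k]

F = np.linalg.inv(E)
F1, F2 = F[:k, :], F[k:, :]     # F', F'' (row blocks of E^{-1})
E1, E2 = E[:, :k], E[:, k:]     # E', E'' (column blocks of E)

assert np.allclose(A, np.vstack([np.eye(k), C]) @ F1)   # A = [I; C] F'
assert np.allclose(A @ E2, 0)                           # columns of E'' lie in null(A)
null_AT = np.vstack([C.T, -np.eye(m - k)])
assert np.allclose(A.T @ null_AT, 0)    # columns of [C^T; -I] lie in null(A^T)
```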

GAUSSIAN ELIMINATION

Gaussian elimination (or row/column reduction) mechanizes the process of computing a matrix inverse and thus solving systems of equations. The idea is a staple of early linear algebra courses and extends the familiar technique of solving systems of equations by taking linear combinations of them. As previously discussed in the section on elementary matrices, elementary row (or column) operations can be encoded as multiplication by an elementary matrix. Here we present row-reduction in detail and mention column-reduction only briefly; however, all the results could be presented for column-reduction instead.
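As a concrete sketch of the mechanics, the following Python function (a hypothetical helper, not something defined in this text) performs Gauss-Jordan elimination with partial pivoting on the augmented matrix \([ \ A \ | \ I \ ]\); when \(A\) is square and invertible, the right block it returns is \(A^{-1}\).

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I].

    A minimal sketch with partial pivoting; assumes A is square
    and invertible.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    aug = np.hstack([A, np.eye(n)])
    for j in range(n):
        # Partial pivoting: swap up the largest remaining entry in column j.
        p = j + np.argmax(np.abs(aug[j:, j]))
        aug[[j, p]] = aug[[p, j]]
        aug[j] /= aug[j, j]            # scale the pivot row to make the pivot 1
        for i in range(n):
            if i != j:                 # eliminate column j from every other row
                aug[i] -= aug[i, j] * aug[j]
    return aug[:, n:]

# The matrix from the worked example below.
A = [[2, 1, 1], [1, 2, 1], [1, 1, 2]]
Ainv = gauss_jordan_inverse(A)
assert np.allclose(Ainv, np.array([[3, -1, -1], [-1, 3, -1], [-1, -1, 3]]) / 4)
```

The loop applies exactly the three kinds of elementary row operations (swap, scale, add a multiple of one row to another), accumulated on the identity block.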

Row-Reduction: Example

$$ \Big[ \ \ A \ \ \Big| \ \ I \ \ \Big] = \begin{bmatrix} \left. \begin{matrix} 2 & 1 & 1 & \\ 1 & 2 & 1 & \\ 1 & 1 & 2 & \\ \end{matrix} \right| & \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} $$ $$ Step 1: \quad \underbrace{ \begin{bmatrix} 1/2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} }_{E_1} \times \begin{bmatrix} \left. \begin{matrix} 2 & 1 & 1 & \\ 1 & 2 & 1 & \\ 1 & 1 & 2 & \\ \end{matrix} \right| & \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \left. \begin{matrix} 1 & 1/2 & 1/2 & \\ 1 & 2 & 1 & \\ 1 & 1 & 2 & \\ \end{matrix} \right| & \begin{matrix} 1/2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} $$ $$ Step 2,3: \quad \underbrace{ \begin{bmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ -1 & 0 & 1 \end{bmatrix} }_{E_{23}} \times \begin{bmatrix} \left. \begin{matrix} 1 & 1/2 & 1/2 & \\ 1 & 2 & 1 & \\ 1 & 1 & 2 & \\ \end{matrix} \right| & \begin{matrix} 1/2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \left. \begin{matrix} 1 & 1/2 & 1/2 & \\ 0 & 3/2 & 1/2 & \\ 0 & 1/2 & 3/2 & \\ \end{matrix} \right| & \begin{matrix} 1/2 & 0 & 0 \\ -1/2 & 1 & 0 \\ -1/2 & 0 & 1 \end{matrix} \end{bmatrix} $$ $$ Step 4: \quad \underbrace{ \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2/3 & 0 \\ 0 & 0 & 1 \end{bmatrix} }_{E_{4}} \times \begin{bmatrix} \left. \begin{matrix} 1 & 1/2 & 1/2 & \\ 0 & 3/2 & 1/2 & \\ 0 & 1/2 & 3/2 & \\ \end{matrix} \right| & \begin{matrix} 1/2 & 0 & 0 \\ -1/2 & 1 & 0 \\ -1/2 & 0 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \left. \begin{matrix} 1 & 1/2 & 1/2 & \\ 0 & 1 & 1/3 & \\ 0 & 1/2 & 3/2 & \\ \end{matrix} \right| & \begin{matrix} 1/2 & 0 & 0 \\ -1/3 & 2/3 & 0 \\ -1/2 & 0 & 1 \end{matrix} \end{bmatrix} $$ $$ Step 5,6: \quad \underbrace{ \begin{bmatrix} 1 & -1/2 & 0 \\ 0 & 1 & 0 \\ 0 & -1/2 & 1 \end{bmatrix} }_{E_{5,6}} \times \begin{bmatrix} \left. 
\begin{matrix} 1 & 1/2 & 1/2 & \\ 0 & 1 & 1/3 & \\ 0 & 1/2 & 3/2 & \\ \end{matrix} \right| & \begin{matrix} 1/2 & 0 & 0 \\ -1/3 & 2/3 & 0 \\ -1/2 & 0 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \left. \begin{matrix} 1 & 0 & 1/3 & \\ 0 & 1 & 1/3 & \\ 0 & 0 & 4/3 & \\ \end{matrix} \right| & \begin{matrix} 2/3 & -1/3 & 0 \\ -1/3 & 2/3 & 0 \\ -1/3 & -1/3 & 1 \end{matrix} \end{bmatrix} $$ $$ Step 7: \quad \underbrace{ \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 3/4 \end{bmatrix} }_{E_{7}} \times \begin{bmatrix} \left. \begin{matrix} 1 & 0 & 1/3 & \\ 0 & 1 & 1/3 & \\ 0 & 0 & 4/3 & \\ \end{matrix} \right| & \begin{matrix} 2/3 & -1/3 & 0 \\ -1/3 & 2/3 & 0 \\ -1/3 & -1/3 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \left. \begin{matrix} 1 & 0 & 1/3 & \\ 0 & 1 & 1/3 & \\ 0 & 0 & 1 & \\ \end{matrix} \right| & \begin{matrix} 2/3 & -1/3 & 0 \\ -1/3 & 2/3 & 0 \\ -1/4 & -1/4 & 3/4 \end{matrix} \end{bmatrix} $$ $$ Step 8,9: \quad \underbrace{ \begin{bmatrix} 1 & 0 & -1/3 \\ 0 & 1 & -1/3 \\ 0 & 0 & 1 \end{bmatrix} }_{E_{8,9}} \times \begin{bmatrix} \left. \begin{matrix} 1 & 0 & 1/3 & \\ 0 & 1 & 1/3 & \\ 0 & 0 & 1 & \\ \end{matrix} \right| & \begin{matrix} 2/3 & -1/3 & 0 \\ -1/3 & 2/3 & 0 \\ -1/4 & -1/4 & 3/4 \end{matrix} \end{bmatrix} = \begin{bmatrix} \left. \begin{matrix} 1 & 0 & 0 & \\ 0 & 1 & 0 & \\ 0 & 0 & 1 & \\ \end{matrix} \right| & \begin{matrix} 3/4 & -1/4 & -1/4 \\ -1/4 & 3/4 & -1/4 \\ -1/4 & -1/4 & 3/4 \end{matrix} \end{bmatrix} $$
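We can double-check the example numerically: composing the elementary matrices from the steps above (applied left to right, so composed right to left) reproduces \(A^{-1}\). A short NumPy verification, with variable names mirroring the step labels:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

E1  = np.diag([1/2, 1, 1])                            # Step 1
E23 = np.array([[1, 0, 0], [-1, 1, 0], [-1, 0, 1]])   # Steps 2,3
E4  = np.diag([1, 2/3, 1])                            # Step 4
E56 = np.array([[1, -1/2, 0], [0, 1, 0], [0, -1/2, 1]])  # Steps 5,6
E7  = np.diag([1, 1, 3/4])                            # Step 7
E89 = np.array([[1, 0, -1/3], [0, 1, -1/3], [0, 0, 1]])  # Steps 8,9

# E1 is applied first, so it sits rightmost in the product.
E = E89 @ E7 @ E56 @ E4 @ E23 @ E1
assert np.allclose(E @ A, np.eye(3))        # EA = I
assert np.allclose(E, np.linalg.inv(A))     # so E = A^{-1}
expected = np.array([[3, -1, -1], [-1, 3, -1], [-1, -1, 3]]) / 4
assert np.allclose(E, expected)             # matches the right block above
```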

Elementary Row Operations (Column Geometry)
Column-Reduction: Example

$$ \left[ \begin{matrix} A \\ --- \\ I \\ \end{matrix} \right] = \begin{bmatrix} \begin{matrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \\ \end{matrix} \\ --- \\ \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} $$ $$ Step 1: \quad \quad \begin{bmatrix} \begin{matrix} 1 & 1 & 1 \\ 1/2 & 2 & 1 \\ 1/2 & 1 & 2 \\ \end{matrix} \\ --- \\ \begin{matrix} 1/2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \begin{matrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \\ \end{matrix} \\ --- \\ \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} \times \underbrace{ \begin{bmatrix} 1/2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} }_{E_1} $$ $$ Step 2,3: \quad \quad \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 1/2 & 3/2 & 1/2 \\ 1/2 & 1/2 & 3/2 \\ \end{matrix} \\ --- \\ \begin{matrix} 1/2 & -1/2 & -1/2 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \begin{matrix} 1 & 1 & 1 \\ 1/2 & 2 & 1 \\ 1/2 & 1 & 2 \\ \end{matrix} \\ --- \\ \begin{matrix} 1/2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} \times \begin{bmatrix} 1 & -1 & -1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} $$ $$ Step 4: \quad \quad \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 1/2 & 1 & 1/2 \\ 1/2 & 1/3 & 3/2 \\ \end{matrix} \\ --- \\ \begin{matrix} 1/2 & -1/3 & -1/2 \\ 0 & 2/3 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 1/2 & 3/2 & 1/2 \\ 1/2 & 1/2 & 3/2 \\ \end{matrix} \\ --- \\ \begin{matrix} 1/2 & -1/2 & -1/2 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} \times \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2/3 & 0 \\ 0 & 0 & 1 \end{bmatrix} $$ $$ Step 5,6: \quad \quad \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 1/3 & 1/3 & 4/3 \\ \end{matrix} \\ --- \\ \begin{matrix} 2/3 & -1/3 & -1/3 \\ -1/3 & 2/3 & -1/3 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} = \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 1/2 & 1 & 1/2 \\ 1/2 & 1/3 & 3/2 \\ \end{matrix} \\ --- \\ 
\begin{matrix} 1/2 & -1/3 & -1/2 \\ 0 & 2/3 & 0 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} \times \begin{bmatrix} 1 & 0 & 0 \\ -1/2 & 1 & -1/2 \\ 0 & 0 & 1 \end{bmatrix} $$ $$ Step 7: \quad \quad \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 1/3 & 1/3 & 1 \\ \end{matrix} \\ --- \\ \begin{matrix} 2/3 & -1/3 & -1/4 \\ -1/3 & 2/3 & -1/4 \\ 0 & 0 & 3/4 \end{matrix} \end{bmatrix} = \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 1/3 & 1/3 & 4/3 \\ \end{matrix} \\ --- \\ \begin{matrix} 2/3 & -1/3 & -1/3 \\ -1/3 & 2/3 & -1/3 \\ 0 & 0 & 1 \end{matrix} \end{bmatrix} \times \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 3/4 \end{bmatrix} $$ $$ Step 8,9: \quad \quad \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ \end{matrix} \\ --- \\ \begin{matrix} 3/4 & -1/4 & -1/4 \\ -1/4 & 3/4 & -1/4 \\ -1/4 & -1/4 & 3/4 \end{matrix} \end{bmatrix} = \begin{bmatrix} \begin{matrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 1/3 & 1/3 & 1 \\ \end{matrix} \\ --- \\ \begin{matrix} 2/3 & -1/3 & -1/4 \\ -1/3 & 2/3 & -1/4 \\ 0 & 0 & 3/4 \end{matrix} \end{bmatrix} \times \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ -1/3 & -1/3 & 1 \end{bmatrix} $$
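As with row reduction, we can verify numerically that the elementary matrices from the steps above compose, on the right and in application order, to \(A^{-1}\):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])

E1  = np.diag([1/2, 1, 1])                            # Step 1
E23 = np.array([[1, -1, -1], [0, 1, 0], [0, 0, 1]])   # Steps 2,3
E4  = np.diag([1, 2/3, 1])                            # Step 4
E56 = np.array([[1, 0, 0], [-1/2, 1, -1/2], [0, 0, 1]])  # Steps 5,6
E7  = np.diag([1, 1, 3/4])                            # Step 7
E89 = np.array([[1, 0, 0], [0, 1, 0], [-1/3, -1/3, 1]])  # Steps 8,9

# Right-multiplication composes left to right: A E1 E23 ... E89 = I.
E = E1 @ E23 @ E4 @ E56 @ E7 @ E89
assert np.allclose(A @ E, np.eye(3))        # AE = I
assert np.allclose(E, np.linalg.inv(A))     # so E = A^{-1}
```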

Elementary Column Operations (Column Geometry)