We will soon expand on general bases and inverses in much more detail, but we first make a brief digression to talk about a particularly well-behaved type of basis called an orthonormal basis. We will denote these as the columns of \(R \in \mathbb{R}^{n \times n}\).
Orthonormal Columns and Rows

An orthonormal basis has basis vectors that are mutually orthogonal and all of length 1. If we store these as the columns of a matrix \(R \in \mathbb{R}^{n \times n}\), we will show shortly that the rows of \(R\) are orthonormal as well.
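For a concrete example (a standard one, not drawn from the text itself), consider the \(2 \times 2\) rotation matrix $$ R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} $$ Each column has length \(\sqrt{\cos^2\theta + \sin^2\theta} = 1\), and the columns satisfy \(R_1^TR_2 = -\cos\theta\sin\theta + \sin\theta\cos\theta = 0\), so they form an orthonormal basis of \(\mathbb{R}^2\).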
Inverse \(=\) Transpose

The inverse of an orthonormal transform is simply its transpose. We can check this quickly by considering the row-column geometry of \(R^TR\). $$ R^TR = \begin{bmatrix} - & R_1^T & - \\ & \vdots & \\ - & R_n^T & - \\ \end{bmatrix} \begin{bmatrix} | & & | \\ R_1 & \cdots & R_n \\ | & & | \end{bmatrix} = \begin{bmatrix} R_1^TR_1 & \cdots & R_1^TR_n \\ \vdots & & \vdots \\ R_n^TR_1 & \cdots & R_n^TR_n \end{bmatrix} = \begin{bmatrix} 1 & \cdots & 0 \\ \vdots & & \vdots \\ 0 & \cdots & 1 \end{bmatrix} $$ This last equality is simply the definition of orthonormality of the columns of \(R\). In general, inverses are difficult to compute, but for orthonormal transforms computing the inverse is as simple as rearranging the values of \(R\).
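As a quick numerical sanity check, here is a minimal NumPy sketch of this fact; the QR-based construction and all variable names are our own choices for illustration:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)

# The Q factor of a QR decomposition of a random matrix has orthonormal columns.
R, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Columns are orthonormal: R^T R = I.
assert np.allclose(R.T @ R, np.eye(n))

# The inverse is just the transpose -- no expensive solve required.
assert np.allclose(np.linalg.inv(R), R.T)
```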
Note that since left and right inverses of a square matrix must be the same (if they both exist), \(R^TR = I\) implies that $$ I = RR^T = \begin{bmatrix} - & \bar{R}_1^T & - \\ & \vdots & \\ - & \bar{R}_n^T & - \\ \end{bmatrix} \begin{bmatrix} | & & | \\ \bar{R}_1 & \cdots & \bar{R}_n \\ | & & | \end{bmatrix} = \begin{bmatrix} \bar{R}_1^T\bar{R}_1 & \cdots & \bar{R}_1^T\bar{R}_n \\ \vdots & & \vdots \\ \bar{R}_n^T\bar{R}_1 & \cdots & \bar{R}_n^T\bar{R}_n \end{bmatrix} = \begin{bmatrix} 1 & \cdots & 0 \\ \vdots & & \vdots \\ 0 & \cdots & 1 \end{bmatrix} $$ where \(\bar{R}_i^T\) denotes the \(i\)-th row of \(R\), and thus the rows of \(R\) must be orthonormal as well.
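Continuing the same sketch, the row condition is one more line:

```python
# Rows are orthonormal as well: R R^T = I.
assert np.allclose(R @ R.T, np.eye(n))
```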
Preserves Inner Products (Isometry)

Perhaps the most important consequence of the inverse condition above is that orthonormal transformations preserve the values of inner products on a space. Explicitly, if \(y=Ry'\) and \(x = Rx'\), then $$ y^Tx = y'^TR^TRx' = y'^Tx' $$ The geometry of a space is fundamentally based on the inner product operator. Norms are given by \((x^Tx)^{\tfrac{1}{2}}\), angles by \(y^Tx = \|x\|_2 \|y\|_2 \cos \theta\), and so on; thus orthonormal transformations do not change metric properties of a space. Because of this we say that an orthonormal transform is an isometric (same-metric) map, or simply an isometry.
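In the same NumPy sketch, preservation of inner products and norms can be checked directly:

```python
# Inner products (and hence norms and angles) are unchanged by R.
x, y = rng.standard_normal(n), rng.standard_normal(n)
assert np.allclose((R @ y) @ (R @ x), y @ x)
assert np.allclose(np.linalg.norm(R @ x), np.linalg.norm(x))
```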
Intuition: Rotations + Reflections

Geometrically, orthonormal transformations represent rotations and reflections. Pure rotations have a determinant of \(+1\); rotations combined with reflections have a determinant of \(-1\). (More on this later.)
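A minimal check of the determinant claim, again continuing the sketch (the angle and matrix names are our own):

```python
theta = 0.7  # an arbitrary angle
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])  # pure rotation
ref = rot[:, ::-1]  # swapping columns introduces a reflection

assert np.isclose(np.linalg.det(rot),  1.0)
assert np.isclose(np.linalg.det(ref), -1.0)
```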
Compositions are Orthonormal

Finally, we can show that the composition of two orthonormal transformations is also orthonormal. Intuitively, two rotations compose to form a single rotation. $$ (RR')^T(RR') = R'^TR^TRR' = R'^TR' = I $$
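And the composition property, in the same sketch:

```python
# A second orthonormal matrix, built the same way.
R2, _ = np.linalg.qr(rng.standard_normal((n, n)))

# The product of two orthonormal matrices is orthonormal.
RR = R @ R2
assert np.allclose(RR.T @ RR, np.eye(n))
```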
We note that since both the columns and rows are orthonormal, it is natural to visualize a rotation matrix as a set of orthogonal unit columns (or rows) on the unit sphere (the unit circle in two dimensions), as illustrated below. Left-multiplying a matrix \(A\) by \(R\) maps each standard basis vector to the corresponding column of \(R\), and in so doing rotates each column of \(A\). We illustrate this in the figure below.
Right-multiplying \(A\) by \(R\) takes linear combinations of the columns of \(A\). Since each column of \(R\) is a unit vector (a point on the unit sphere), the columns of \(AR\) remain on the image of the unit sphere under \(A\), as illustrated below.
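Both observations can be checked numerically, continuing the sketch with a hypothetical test matrix \(A\):

```python
A = rng.standard_normal((n, n))

# Left multiplication rotates each column of A: column norms are preserved.
assert np.allclose(np.linalg.norm(R @ A, axis=0), np.linalg.norm(A, axis=0))

# Right multiplication: column j of AR is A applied to the unit vector R[:, j],
# so it lies on the image of the unit sphere under A.
assert np.allclose((A @ R)[:, 0], A @ R[:, 0])
```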
If we view the inverting and transposing visualizations side by side, we see that they give the same result for an orthonormal matrix \(R\).
Finally, we reproduce several of the inverse visualizations for orthonormal \(R\) to provide extra intuition.