A matrix is always invertible when viewed as a map between its row space (the range of \(A^T\)) and its column space (the range of \(A\)).
RIGHT INVERSES

A matrix \(A \in \mathbb{R}^{m \times n}\) is right invertible if there exists a matrix \(A^r\) such that $$ A A^r = I_{m \times m} $$ First, note that for this to be possible \(A^r\) must be in \(\mathbb{R}^{n \times m}\). Second, note that if we want to solve the equation \(y=Ax\), then taking \(x=A^ry\) clearly gives a solution. This works for any \(y\), so a necessary condition for a right inverse to exist is that the columns of \(A\) span all of \(\mathbb{R}^m\), or equivalently that \(A\) has full row rank.
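As a quick numerical sanity check, here is a minimal NumPy sketch (the dimensions, random data, and names like `A_r` are illustrative assumptions, not from the text), using the particular right inverse \(A^T(AA^T)^{-1}\) that reappears later in the section:

```python
import numpy as np

rng = np.random.default_rng(0)

# A fat matrix (m < n); a generic random matrix has full row rank.
m, n = 3, 5
A = rng.standard_normal((m, n))

# One particular right inverse: A^T (A A^T)^{-1}, valid because A has full row rank.
A_r = A.T @ np.linalg.inv(A @ A.T)

# Check A A^r = I_m.
assert np.allclose(A @ A_r, np.eye(m))

# x = A^r y solves y = A x for any y.
y = rng.standard_normal(m)
x = A_r @ y
assert np.allclose(A @ x, y)
```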
Right inverses are not (in general) unique. Given one right inverse \((A^r)'\), we can construct a new right inverse \(A^r\) by adding elements of the nullspace of \(A\) to each column of \((A^r)'\). Compactly we can write this as $$ A^r = (A^r)' + NZ $$ where \(N \in \mathbb{R}^{n \times (n-m)}\) is a basis for the nullspace of \(A\) and the columns of \(Z \in \mathbb{R}^{(n-m)\times m}\) give the coordinates of the nullspace elements added to each column of \((A^r)'\). Note the similarities between this equation and general solutions to the equation \(y=Ax\) where \(A\) is fat. Since the above equation defines all possible right inverses, we also have that the set of right inverses forms an affine subspace of \(\mathbb{R}^{n \times m}\).
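A small sketch of this additive characterization, assuming NumPy and SciPy are available (scipy.linalg.null_space is used to build \(N\); the sizes and names are illustrative):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(1)
m, n = 3, 5
A = rng.standard_normal((m, n))

# A particular right inverse (A^r)'.
Ar0 = A.T @ np.linalg.inv(A @ A.T)

# N: columns form a basis for the nullspace of A, so N is n x (n - m).
N = null_space(A)

# Any Z of shape (n - m) x m gives another right inverse A^r = (A^r)' + N Z.
Z = rng.standard_normal((n - m, m))
Ar = Ar0 + N @ Z
assert np.allclose(A @ Ar, np.eye(m))
```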
Multiplicative Characterization

Consider right multiplying \(A\) by an arbitrary matrix \(B \in \mathbb{R}^{n \times m}\) to get the product \(AB \in \mathbb{R}^{m \times m}\). In general \(AB\) will be invertible. (This assumes the columns of \(B\) are linearly independent and that the span of \(B\) intersects the nullspace of \(A\) only at 0, both of which are generic conditions that we would have to violate on purpose.) Given \(B\), we can then get a right inverse in the form $$ A^r = B(AB)^{-1} $$ Note that if we allow \(B\) to vary, this also characterizes all right inverses.
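A brief sketch of the multiplicative characterization under the same illustrative assumptions (a generic random \(B\) almost surely satisfies the conditions above):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 5
A = rng.standard_normal((m, n))

# A generic B almost surely makes AB invertible.
B = rng.standard_normal((n, m))
Ar = B @ np.linalg.inv(A @ B)
assert np.allclose(A @ Ar, np.eye(m))
```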
We can connect the multiplicative and additive characterizations in the following way. Assuming \(A\) has full row rank, construct bases for the range of \(A^T\) and the nullspace of \(A\) as the columns of \(M \in \mathbb{R}^{n \times m}\) and \(N \in \mathbb{R}^{n \times (n-m)}\) respectively. (Assuming \(A\) is full row rank we can just take \(M\) to be \(A^T\); we will return to this later.) Any \(B\) can then be written as $$ B = MU + NV $$ where \(U \in \mathbb{R}^{m \times m}\) and \(V \in \mathbb{R}^{(n-m)\times m}\); note the required dimensions of \(U\) and \(V\). Note also that the condition that \(AB\) is invertible is equivalent to \(U\) being invertible, since $$ AB = AMU + ANV = AMU $$ and if \(A\) has full row rank and \(M\) is a basis for the range of \(A^T\) then \(AM\) is invertible. Using this construction to write out the right inverse we get $$ A^r = B(AB)^{-1} = (MU + NV)(AMU + ANV)^{-1} = (MU + NV)(AMU)^{-1} $$ and, using \((AMU)^{-1} = U^{-1}(AM)^{-1}\), simplify to obtain $$ A^r = M(AM)^{-1} + NVU^{-1}(AM)^{-1} $$ which is the additive characterization with \(Z = VU^{-1}(AM)^{-1}\) and the specific right inverse \((A^r)' = M(AM)^{-1}\).
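The algebra above can be checked numerically; the following sketch (again with illustrative sizes, NumPy/SciPy, and the choice \(M = A^T\)) confirms that \(B(AB)^{-1}\) matches \(M(AM)^{-1} + NVU^{-1}(AM)^{-1}\):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(3)
m, n = 3, 5
A = rng.standard_normal((m, n))

M = A.T                          # basis for the range of A^T (valid since A has full row rank)
N = null_space(A)                # basis for the nullspace of A, shape (n, n - m)
U = rng.standard_normal((m, m))          # generic, hence invertible
V = rng.standard_normal((n - m, m))

B = M @ U + N @ V
AM_inv = np.linalg.inv(A @ M)

# Multiplicative form B(AB)^{-1} ...
Ar_mult = B @ np.linalg.inv(A @ B)
# ... equals the additive form M(AM)^{-1} + N V U^{-1} (AM)^{-1}.
Ar_add = M @ AM_inv + N @ V @ np.linalg.inv(U) @ AM_inv
assert np.allclose(Ar_mult, Ar_add)
```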
We note that \(M(AM)^{-1}\) is actually a more general characterization than needed. For a full row rank \(A\), we can always write \(A^T = MW\) for some invertible \(W\). We then have that $$ M(AM)^{-1} = A^TW^{-1}(AA^TW^{-1})^{-1} = A^TW^{-1}W(AA^T)^{-1} = A^T(AA^T)^{-1} $$ so the specific solution can always just be written as \((A^r)' = A^T(AA^T)^{-1}\) (though the component in the nullspace, \( NZ = NVU^{-1}(AM)^{-1} \), does depend on the choice of \(M\)).
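A quick check that the specific solution does not depend on the choice of basis \(M\); here \(M = A^T W_0\) for a random invertible \(W_0\) is an illustrative construction of another basis for the range of \(A^T\):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n = 3, 5
A = rng.standard_normal((m, n))

# Another basis of the range of A^T: M = A^T W0 with W0 invertible (generic random W0).
W0 = rng.standard_normal((m, m))
M = A.T @ W0

# M(AM)^{-1} collapses to A^T(AA^T)^{-1} regardless of W0.
lhs = M @ np.linalg.inv(A @ M)
rhs = A.T @ np.linalg.inv(A @ A.T)
assert np.allclose(lhs, rhs)
```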
From these considerations, we note that the set of right inverses is affine in \(Z\), or in the product \(VU^{-1}\), but not in \(U\) and \(V\) separately. The equation $$ Z(AM) = VU^{-1} $$ gives a useful linear relationship between \(Z\) and \(VU^{-1}\), and a nonlinear relationship between \(Z\) and the pair \((U, V)\).
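Finally, a small verification of the linear relation between \(Z\) and \(VU^{-1}\) under the same illustrative setup:

```python
import numpy as np

rng = np.random.default_rng(5)
m, n = 3, 5
A = rng.standard_normal((m, n))
M = A.T                                  # basis for the range of A^T

U = rng.standard_normal((m, m))          # generic, hence invertible
V = rng.standard_normal((n - m, m))

# Z as defined above; the relation Z(AM) = V U^{-1} is linear in Z and in V U^{-1}.
Z = V @ np.linalg.inv(U) @ np.linalg.inv(A @ M)
assert np.allclose(Z @ (A @ M), V @ np.linalg.inv(U))
```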