Fundamentals
Basics
Let \mathbb{R}^m be the space of column vectors with m real entries, symbolically
\mathbb{R}^m = \{x = (x_1, \ldots, x_m)^T : x_i \in \mathbb{R}\}
We use this to define a linear combination: given vectors v_1, \ldots, v_k \in \mathbb{R}^m and scalars c_1, \ldots, c_k \in \mathbb{R}, the vector c_1 v_1 + \cdots + c_k v_k is a linear combination of the v_i.
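As a quick numerical illustration (a NumPy sketch with made-up values; the notes themselves use no code), here is a linear combination of two vectors in \mathbb{R}^3:

```python
import numpy as np

# Two vectors in R^3 and scalar coefficients (illustrative values).
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
c1, c2 = 3.0, -2.0

# The linear combination c1*v1 + c2*v2 is again a vector in R^3.
w = c1 * v1 + c2 * v2
print(w)  # [ 3. -2.  8.]
```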
There are two ways of looking at a matrix. Suppose we have an m \times n matrix A.
\begin{align*} A = \begin{pmatrix} A_{1,1} & \cdots & A_{1,n}\\ \vdots & & \vdots\\ A_{m,1} & \cdots & A_{m,n} \end{pmatrix} \end{align*}
We have the column view of A, written A = (\text{Col}_1 A \cdots \text{Col}_n A), and the row view of A, with rows \text{Row}_1 A, \ldots, \text{Row}_m A.
We immediately get the following properties
- If A is m \times n, then A^T is n \times m
- (A^T)^T = A
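Both properties are easy to check numerically (a NumPy sketch with an arbitrary example matrix):

```python
import numpy as np

# A 2x3 matrix: transposing swaps the shape to 3x2,
# and transposing twice returns the original matrix.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
print(A.shape)                   # (2, 3)
print(A.T.shape)                 # (3, 2)
print(np.array_equal(A.T.T, A))  # True
```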
We can define a matrix as a function: an m \times n matrix A is the map \mathbb{R}^n \to \mathbb{R}^m sending x \mapsto Ax.
Standard basis
We denote the standard basis of \mathbb{R}^m as
\begin{align*} e_1 = \begin{pmatrix} 1\\ 0\\ \vdots\\ 0 \end{pmatrix}, e_2 = \begin{pmatrix} 0\\ 1\\ \vdots\\ 0 \end{pmatrix}, \ldots, e_m = \begin{pmatrix} 0\\ 0\\ \vdots\\ 1 \end{pmatrix} \end{align*}
Any vector in \mathbb{R}^m is a linear combination of these e_i’s, in a unique way. Explicitly,
\begin{align*} x = \begin{pmatrix} x_1\\ \vdots\\ x_m \end{pmatrix} = x_1 e_1 + \cdots + x_m e_m \end{align*}
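This decomposition is trivial to verify numerically (a NumPy sketch; note that the columns of np.eye(m) are exactly e_1, \ldots, e_m):

```python
import numpy as np

m = 4
I = np.eye(m)  # columns of I are the standard basis vectors e_1, ..., e_m
x = np.array([2.0, -1.0, 0.5, 3.0])

# x equals the linear combination x_1 e_1 + ... + x_m e_m.
recombined = sum(x[i] * I[:, i] for i in range(m))
print(np.allclose(recombined, x))  # True
```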
We define the identity matrix
\begin{align*} Col_j I = e_j, I = \begin{pmatrix} 1 & & 0\\ & \ddots & \\ 0 & & 1 \end{pmatrix} \end{align*}
Dot product
If x,y \in \mathbb{R}^m, then
\begin{align*} x^T y = \begin{pmatrix} x_1 & \cdots & x_m \end{pmatrix} \begin{pmatrix} y_1\\ \vdots\\ y_m \end{pmatrix} = x \cdot y = \sum^m_{i = 1}x_iy_i \end{align*}
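The three expressions above are the same number, which we can confirm numerically (a NumPy sketch with arbitrary vectors):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 2.0])

# Three equivalent ways of computing x^T y.
via_sum = sum(x[i] * y[i] for i in range(3))
via_dot = np.dot(x, y)
via_matmul = x.T @ y  # for 1-D arrays, .T is a no-op; @ gives the inner product

print(via_sum, via_dot, via_matmul)  # 8.0 8.0 8.0
```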
Subspaces
We can use these definitions to prove some facts about subspaces.
It might help to look at an example.
Suppose we have a matrix A with a rank of 1. This means that our \text{ColSp}(A) is 1 dimensional.
If \text{ColSp}(A) is one dimensional, then so is \text{RowSp}(A). This is because every column of A is a multiple of a single vector v \in \mathbb{R}^m, so we may write
A = vc^T
where the entries of c \in \mathbb{R}^n record those multiples. But then \text{Row}_i A = v_i c^T, so every row is a multiple of c^T, and \text{RowSp}(A) is spanned by c^T.
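The outer-product form v c^T is easy to play with numerically (a NumPy sketch with made-up v and c):

```python
import numpy as np

# A rank-1 matrix is an outer product v c^T: every column is a multiple of v,
# and every row is a multiple of c^T.
v = np.array([[1.0], [2.0], [3.0]])   # m x 1
c = np.array([[4.0], [5.0]])          # n x 1
A = v @ c.T                           # m x n, rank 1

print(A)
print(np.linalg.matrix_rank(A))  # 1
```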
Matrix multiplication
Suppose we have two matrices A_{(m\times n)} and Z_{(l\times m)}. Recall the function definition of a matrix: A \colon \mathbb{R}^n \to \mathbb{R}^m and Z \colon \mathbb{R}^m \to \mathbb{R}^l. Then
Z \circ A \colon \mathbb{R}^n \to \mathbb{R}^l
So Z \circ A is some (l \times n) matrix, which can be computed as follows.
\begin{align*} (Z \circ A) \begin{pmatrix} x_1\\ \vdots\\ x_n \end{pmatrix} &= Z\left(A \begin{pmatrix} x_1\\ \vdots\\ x_n \end{pmatrix}\right)\\ &= Z\left( x_1 \text{Col}_1 A + \cdots + x_n \text{Col}_n A \right)\\ &= x_1 Z\,\text{Col}_1 A + \cdots + x_n Z\,\text{Col}_n A\\ &= \begin{pmatrix} Z\text{Col}_1 A & \cdots & Z \text{Col}_n A \end{pmatrix} \begin{pmatrix} x_1\\ \vdots\\ x_n \end{pmatrix} \end{align*}
Thus we have a formula for composing two matrices. Matrix multiplication is associative because it’s function composition. There are four ways that we multiply matrices, each giving the same result.
\text{Column view:} \quad \text{Col}_j(ZA) = Z\,\text{Col}_j A, \text{ i.e. } ZA = (Z\text{Col}_1 A \cdots Z \text{Col}_n A)\\ \text{Row-Column view:} \quad (ZA)_{ij} = (\text{Row}_{i}Z) \cdot (\text{Col}_j A)\\ \text{Row view:} \quad \text{Row}_i(ZA) = (\text{Row}_i Z)\,A\\ \text{Column-Row view:} \quad ZA = (\text{Col}_1 Z)(\text{Row}_1 A) + \cdots + (\text{Col}_m Z)(\text{Row}_m A)
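All four views can be checked against each other numerically (a NumPy sketch using random matrices of compatible shapes):

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal((3, 4))   # l x m
A = rng.standard_normal((4, 5))   # m x n
l, m = Z.shape
_, n = A.shape

# Column view: column j of ZA is Z times column j of A.
col_view = np.column_stack([Z @ A[:, j] for j in range(n)])

# Row-column view: entry (i, j) is (row i of Z) . (column j of A).
rc_view = np.array([[Z[i, :] @ A[:, j] for j in range(n)] for i in range(l)])

# Row view: row i of ZA is (row i of Z) times A.
row_view = np.vstack([Z[i, :] @ A for i in range(l)])

# Column-row view: ZA is the sum of outer products (col k of Z)(row k of A).
cr_view = sum(np.outer(Z[:, k], A[k, :]) for k in range(m))

ZA = Z @ A
print(all(np.allclose(V, ZA) for V in (col_view, rc_view, row_view, cr_view)))  # True
```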
Row-column decomposition
Let A be an m \times n matrix of rank r, and factor A = CR. Construct C as follows.
\begin{align*} \text{Col}_1 C &= \text{First non-$0$ column of $A$}\\ \text{Col}_2 C &= \text{First column of $A$ not parallel to } \text{Col}_1 A\\ &\vdots\\ \text{Col}_r C &= \text{First column of $A$ not in Span}\{\text{Col}_1C, \cdots, \text{Col}_{r-1} C\}\\ \end{align*}
Thus C is an (m \times r) matrix whose column space is identical to that of A. The matrix R is then the (r \times n) matrix whose j-th column records the coefficients expressing \text{Col}_j A as a linear combination of the columns of C.
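The construction above can be sketched in code (a NumPy sketch; the function name and the rank-based "not in the span" test are my own choices, and R is recovered by least squares rather than tracked during the scan):

```python
import numpy as np

def cr_decomposition(A, tol=1e-10):
    """Greedy construction of C from the notes: scan the columns of A left
    to right, keeping each column that is not in the span of the columns
    already kept. Then solve C R = A for R."""
    A = np.asarray(A, dtype=float)
    kept = []
    for j in range(A.shape[1]):
        candidate = kept + [A[:, j]]
        # Keep column j iff it raises the rank of the kept set,
        # i.e. it is not in Span of the previously kept columns.
        if np.linalg.matrix_rank(np.column_stack(candidate), tol=tol) > len(kept):
            kept.append(A[:, j])
    C = np.column_stack(kept)                      # m x r
    R = np.linalg.lstsq(C, A, rcond=None)[0]       # r x n, solves C R = A
    return C, R

# Example: column 2 is twice column 1, so only columns 1 and 3 are kept.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 2.0, 4.0]])
C, R = cr_decomposition(A)
print(C.shape, np.allclose(C @ R, A))  # (3, 2) True
```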