Learn Linear Algebra

Theorem

Let $\vec{u}$, $\vec{v}$, and $\vec{w}$ be vectors in $\mathbb{R}^n$, and let $c$ be a scalar. Then:

a. $\vec{u} \cdot \vec{v} = \vec{v} \cdot \vec{u}$
b. $(\vec{u} + \vec{v}) \cdot \vec{w} = \vec{u} \cdot \vec{w} + \vec{v} \cdot \vec{w}$
c. $(c\vec{u}) \cdot \vec{v} = c(\vec{u} \cdot \vec{v}) = \vec{u} \cdot (c\vec{v})$
d. $\vec{u} \cdot \vec{u} \geq 0$, and $\vec{u} \cdot \vec{u} = 0$ if and only if $\vec{u} = \vec{0}$.
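
A quick numerical sketch of properties (a)–(d), assuming NumPy is available; the vectors and the scalar below are arbitrary illustrations, not part of the theorem.

```python
import numpy as np

# Check properties (a)-(d) on arbitrary vectors in R^5 (illustrative values only).
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 5))
c = 2.5

assert np.isclose(u @ v, v @ u)                    # (a) the dot product is commutative
assert np.isclose((u + v) @ w, u @ w + v @ w)      # (b) it distributes over addition
assert np.isclose((c * u) @ v, c * (u @ v))        # (c) scalars can be pulled out ...
assert np.isclose((c * u) @ v, u @ (c * v))        #     ... from either factor
assert u @ u >= 0                                  # (d) u . u is never negative
assert np.isclose(np.zeros(5) @ np.zeros(5), 0.0)  #     and equals 0 for the zero vector
```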

Theorem

Two vectors $\vec{u}$ and $\vec{v}$ are orthogonal if and only if $\|\vec{u} + \vec{v}\|^2 = \|\vec{u}\|^2 + \|\vec{v}\|^2$.
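
A small sanity check of the Pythagorean identity, assuming NumPy; the specific vectors are just one orthogonal pair chosen for illustration.

```python
import numpy as np

# An orthogonal pair in R^3: u . v = 0.
u = np.array([1.0, 2.0, 0.0])
v = np.array([-2.0, 1.0, 3.0])
assert np.isclose(u @ v, 0.0)

# For an orthogonal pair the Pythagorean identity holds.
lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
assert np.isclose(lhs, rhs)
```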

Theorem

Let $W$ be a subspace of $\mathbb{R}^n$.

1. A vector $\vec{x}$ is in $W^\perp$ if and only if $\vec{x}$ is orthogonal to every vector in a set that spans $W$.

2. $W^\perp$ is a subspace of $\mathbb{R}^n$.
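
A minimal sketch of part 1, assuming NumPy; the subspace $W$, its spanning vectors, and $\vec{x}$ below are illustrative choices.

```python
import numpy as np

# W = Span{w1, w2} in R^3, and x is orthogonal to both spanning vectors.
w1 = np.array([1.0, 0.0, 2.0])
w2 = np.array([0.0, 1.0, -1.0])
x = np.array([-2.0, 1.0, 1.0])
assert np.isclose(x @ w1, 0.0) and np.isclose(x @ w2, 0.0)

# Then x is orthogonal to every vector in W, i.e. to every linear combination of w1 and w2.
rng = np.random.default_rng(0)
for _ in range(5):
    a, b = rng.standard_normal(2)
    assert np.isclose(x @ (a * w1 + b * w2), 0.0)
```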

Theorem

Let $A$ be an $m \times n$ matrix. The orthogonal complement of the row space of $A$ is the null space of $A$, and the orthogonal complement of the column space of $A$ is the null space of $A^T$:

$$(\text{Row } A)^\perp = \text{Nul } A \quad \text{and} \quad (\text{Col } A)^\perp = \text{Nul } A^T$$
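
A numerical illustration, assuming NumPy; the $3 \times 4$ matrix below is hypothetical. An orthonormal basis for $\text{Nul } A$ is read off the SVD, and every basis vector is orthogonal to every row of $A$ (equivalently, $A$ sends it to $\vec{0}$).

```python
import numpy as np

# A hypothetical 3x4 matrix (its third row is the sum of the first two, so Nul A is 2-dimensional).
A = np.array([[1.0, 2.0, 0.0, -1.0],
              [0.0, 1.0, 1.0,  2.0],
              [1.0, 3.0, 1.0,  1.0]])

# Right singular vectors with zero singular value form an orthonormal basis for Nul A.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
N = Vt[rank:].T                    # columns span Nul A

assert np.allclose(A @ N, 0.0)     # each null-space vector is orthogonal to every row: (Row A)^perp = Nul A

# The same statement applied to A^T gives (Col A)^perp = Nul A^T.
_, s_t, Vt_t = np.linalg.svd(A.T)
M = Vt_t[int(np.sum(s_t > 1e-10)):].T
assert np.allclose(A.T @ M, 0.0)
```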

Theorem

$\vec{u} \cdot \vec{v} = \|\vec{u}\| \|\vec{v}\| \cos \theta$, where $\theta$ is the angle between $\vec{u}$ and $\vec{v}$.
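
For example, the angle can be recovered from this formula (a sketch assuming NumPy; the vectors are illustrative):

```python
import numpy as np

# Recover the angle between (1, 0) and (1, 1) from u . v = ||u|| ||v|| cos(theta).
u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))   # clip guards against tiny rounding overshoot
assert np.isclose(np.degrees(theta), 45.0)
```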

Theorem

If $S = \{ \vec{u}_1, \dots, \vec{u}_p \}$ is an orthogonal set of nonzero vectors in $\mathbb{R}^n$, then $S$ is linearly independent and hence is a basis for the subspace spanned by $S$.
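
A quick check on a concrete orthogonal set (a sketch assuming NumPy; the three vectors are illustrative):

```python
import numpy as np

# Three nonzero, pairwise-orthogonal vectors in R^3.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0, 0.0, 2.0])
S = np.column_stack([u1, u2, u3])

# Pairwise orthogonality: S^T S is diagonal.
gram = S.T @ S
assert np.allclose(gram, np.diag(np.diag(gram)))

# Linear independence: the matrix with these columns has full column rank.
assert np.linalg.matrix_rank(S) == 3
```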

Theorem

Let $\{ \vec{u}_1, \dots, \vec{u}_p \}$ be an orthogonal basis for a subspace $W$ of $\mathbb{R}^n$. For each $\vec{y} \in W$, the weights in the linear combination $\vec{y} = c_1 \vec{u}_1 + \cdots + c_p \vec{u}_p$ are given by

$$c_j = \frac{\vec{y} \cdot \vec{u}_j}{\vec{u}_j \cdot \vec{u}_j}, \quad j = 1, \dots, p.$$
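
A sketch of how these weights are computed without solving a linear system, assuming NumPy; here $W = \mathbb{R}^3$ with an illustrative orthogonal basis.

```python
import numpy as np

# Orthogonal basis for W = R^3 (illustrative) and a vector y in W.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0, 0.0, 2.0])
y = np.array([3.0, -1.0, 4.0])

# Each weight is a single dot-product ratio: c_j = (y . u_j) / (u_j . u_j).
c = [(y @ u) / (u @ u) for u in (u1, u2, u3)]
assert np.allclose(c[0] * u1 + c[1] * u2 + c[2] * u3, y)
```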

Theorem

An $m \times n$ matrix $U$ has orthonormal columns if and only if $U^T U = I$.
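
For instance (assuming NumPy; the $4 \times 2$ matrix below is an illustrative choice):

```python
import numpy as np

# A 4x2 matrix whose columns are orthogonal unit vectors.
u1 = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0, -1.0]) / np.sqrt(2)
U = np.column_stack([u1, u2])

# Orthonormal columns are equivalent to U^T U being the identity (2x2 here).
assert np.allclose(U.T @ U, np.eye(2))
```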

Theorem

Let $U$ be an $m \times n$ matrix with orthonormal columns, and let $\vec{x}$ and $\vec{y}$ be in $\mathbb{R}^n$. Then:

a. $\|U\vec{x}\| = \|\vec{x}\|$
b. $(U\vec{x}) \cdot (U\vec{y}) = \vec{x} \cdot \vec{y}$
c. $(U\vec{x}) \cdot (U\vec{y}) = 0$ if and only if $\vec{x} \cdot \vec{y} = 0$.
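
A numerical check of (a) and (b), from which (c) follows, assuming NumPy; $U$, $\vec{x}$, and $\vec{y}$ below are illustrative.

```python
import numpy as np

# A 3x2 matrix U with orthonormal columns, and arbitrary x, y in R^2.
U = np.column_stack([np.array([1.0, 2.0, 2.0]) / 3.0,
                     np.array([2.0, 1.0, -2.0]) / 3.0])
assert np.allclose(U.T @ U, np.eye(2))

x = np.array([3.0, -1.0])
y = np.array([0.5, 4.0])

assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))  # (a) multiplying by U preserves length
assert np.isclose((U @ x) @ (U @ y), x @ y)                  # (b) and preserves dot products
```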

The Orthogonal Decomposition Theorem

Let $W$ be a subspace of $\mathbb{R}^n$. Then each $\vec{y} \in \mathbb{R}^n$ can be written uniquely in the form $\vec{y} = \hat{\vec{y}} + \vec{z}$, where $\hat{\vec{y}}$ is in $W$ and $\vec{z}$ is in $W^\perp$. In fact, if $\{ \vec{u}_1, \dots, \vec{u}_p \}$ is any orthogonal basis of $W$, then

$$\hat{\vec{y}} = \frac{\vec{y} \cdot \vec{u}_1}{\vec{u}_1 \cdot \vec{u}_1} \vec{u}_1 + \cdots + \frac{\vec{y} \cdot \vec{u}_p}{\vec{u}_p \cdot \vec{u}_p} \vec{u}_p \quad \text{and} \quad \vec{z} = \vec{y} - \hat{\vec{y}}.$$
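
A sketch of the decomposition in coordinates, assuming NumPy; here $W$ is taken to be the $xy$-plane in $\mathbb{R}^3$ with an illustrative orthogonal basis.

```python
import numpy as np

# Orthogonal basis for W = the xy-plane in R^3, and an arbitrary y.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([-1.0, 1.0, 0.0])
y = np.array([2.0, 3.0, 5.0])

# y_hat is the sum of the projections of y onto each basis vector; z is the remainder.
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
z = y - y_hat

assert np.allclose(y_hat + z, y)                             # y = y_hat + z
assert np.isclose(z @ u1, 0.0) and np.isclose(z @ u2, 0.0)   # z is in W^perp
assert np.allclose(y_hat, [2.0, 3.0, 0.0])                   # here y_hat is just the xy-part of y
```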

Theorem

If $\vec{y}$ is in $W = \text{Span} \{ \vec{u}_1, \dots, \vec{u}_p \}$, then $\text{proj}_W \vec{y} = \vec{y}$.
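
For example (assuming NumPy, with an orthonormal spanning set so the projection formula needs no normalization):

```python
import numpy as np

# W = Span{u1, u2} is the xy-plane; y already lies in W.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
y = np.array([4.0, -2.0, 0.0])

# Projecting a vector that is already in W leaves it unchanged.
proj = (y @ u1) * u1 + (y @ u2) * u2
assert np.allclose(proj, y)
```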

The Best Approximation Theorem

Let $W$ be a subspace of $\mathbb{R}^n$, $\vec{y}$ any vector in $\mathbb{R}^n$, and $\hat{\vec{y}}$ the orthogonal projection of $\vec{y}$ onto $W$. Then $\hat{\vec{y}}$ is the closest point in $W$ to $\vec{y}$, in the sense that $\| \vec{y} - \hat{\vec{y}} \| < \| \vec{y} - \vec{v} \|$ for all $\vec{v} \in W$ distinct from $\hat{\vec{y}}$.
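
A small numerical illustration, assuming NumPy; $W$ is again the $xy$-plane and the sampled points of $W$ are arbitrary.

```python
import numpy as np

# W = the xy-plane in R^3; y_hat is the orthogonal projection of y onto W.
y = np.array([2.0, 3.0, 5.0])
y_hat = np.array([2.0, 3.0, 0.0])

# No point of W is closer to y than y_hat.
rng = np.random.default_rng(1)
for _ in range(100):
    v = np.append(rng.standard_normal(2), 0.0)   # a random point of W
    assert np.linalg.norm(y - y_hat) <= np.linalg.norm(y - v)
```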

Theorem

If $\{ \vec{u}_1, \dots, \vec{u}_p \}$ is an orthonormal basis for a subspace $W$ of $\mathbb{R}^n$, then

$$\text{proj}_W \vec{y} = (\vec{y} \cdot \vec{u}_1)\vec{u}_1 + (\vec{y} \cdot \vec{u}_2)\vec{u}_2 + \cdots + (\vec{y} \cdot \vec{u}_p)\vec{u}_p.$$

If $U = [\, \vec{u}_1 \ \vec{u}_2 \ \cdots \ \vec{u}_p \,]$, then

$$\text{proj}_W \vec{y} = UU^T\vec{y} \quad \text{for all } \vec{y} \in \mathbb{R}^n.$$
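
A sketch comparing the coordinate form and the matrix form $UU^T\vec{y}$, assuming NumPy; the orthonormal basis and $\vec{y}$ are illustrative.

```python
import numpy as np

# Orthonormal basis for W = the xy-plane in R^3, assembled into U.
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2)
U = np.column_stack([u1, u2])
y = np.array([2.0, 3.0, 5.0])

proj_sum = (y @ u1) * u1 + (y @ u2) * u2   # coordinate form: sum of (y . u_j) u_j
proj_mat = U @ U.T @ y                     # matrix form: U U^T y
assert np.allclose(proj_sum, proj_mat)
assert np.allclose(proj_sum, [2.0, 3.0, 0.0])
```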