Let u, v, and w be vectors in Rn, and let c be a scalar. Then:
a. u⋅v=v⋅u
b. (u+v)⋅w=u⋅w+v⋅w
c. (cu)⋅v=c(u⋅v)=u⋅(cv)
d. u⋅u≥0, and u⋅u=0 if and only if u=0.
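These four algebraic properties can be spot-checked numerically; a small NumPy sketch with arbitrarily chosen example vectors and scalar (not from the text):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 0.5])
w = np.array([-2.0, 0.0, 1.0])
c = 3.0

assert np.isclose(u @ v, v @ u)                                   # a. commutativity
assert np.isclose((u + v) @ w, u @ w + v @ w)                     # b. distributivity
assert np.isclose((c * u) @ v, c * (u @ v))                       # c. scalar factoring
assert np.isclose(c * (u @ v), u @ (c * v))
assert u @ u >= 0 and np.isclose(np.zeros(3) @ np.zeros(3), 0)    # d. nonnegativity
```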
Theorem
Two vectors u and v are orthogonal if and only if ∥u+v∥²=∥u∥²+∥v∥².
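For instance, the orthogonal vectors u=(3,0) and v=(0,4) (chosen here only for illustration) satisfy the identity with 25 = 9 + 16; a quick NumPy check:

```python
import numpy as np

u, v = np.array([3.0, 0.0]), np.array([0.0, 4.0])      # u · v = 0
lhs = np.linalg.norm(u + v) ** 2                        # ∥u + v∥² = 25
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2   # ∥u∥² + ∥v∥² = 9 + 16
assert np.isclose(lhs, rhs)
```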
Theorem
1. A vector x is in W⊥ if and only if x is orthogonal to every vector in a set that spans W.
2. W⊥ is a subspace of Rn.
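In practice, statement 1 means membership in W⊥ only needs to be tested against a spanning set of W; a minimal sketch with illustrative vectors:

```python
import numpy as np

# W = Span{w1, w2} in R^3; x is in W-perp iff x · w1 = 0 and x · w2 = 0.
w1, w2 = np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])
x = np.array([1.0, 1.0, -1.0])
print(np.isclose(x @ w1, 0) and np.isclose(x @ w2, 0))   # True: x is in W-perp
```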
Theorem
Let A be an m×n matrix. The orthogonal complement of the row space of A is the nullspace of A, and the orthogonal complement of the column space of A is the nullspace of AT:
(Row A)⊥=Nul A and (Col A)⊥=Nul Aᵀ
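As a numerical illustration (sample matrix chosen here, with the null-space basis taken from the SVD rather than any particular textbook method), every row of A should be orthogonal to every null-space basis vector:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])           # rank 1, so Nul A is 2-dimensional

# Null-space basis: right singular vectors for the (near-)zero singular values.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[np.sum(s > 1e-10):]        # rows spanning Nul A

print(np.allclose(A @ null_basis.T, 0))    # True, illustrating (Row A)-perp = Nul A
```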
Theorem
If θ is the angle between two nonzero vectors u and v in Rn, then u⋅v=∥u∥∥v∥cos θ.
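Rearranged, this gives cos θ = (u⋅v)/(∥u∥∥v∥); a short computation with illustrative vectors:

```python
import numpy as np

u, v = np.array([1.0, 0.0]), np.array([1.0, 1.0])
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.degrees(np.arccos(cos_theta))
print(theta)   # 45.0 degrees, as expected for these two vectors
```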
Theorem
If S={u1,…,up} is an orthogonal set of nonzero vectors in Rn, then S is linearly independent and hence is a basis for the subspace spanned by S.
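A quick check on an illustrative orthogonal set: the pairwise dot products vanish and the matrix of the vectors has full column rank, so the set is linearly independent:

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])     # an orthogonal set of nonzero vectors in R^3
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0, 0.0, 2.0])
S = np.column_stack([u1, u2, u3])

gram = S.T @ S
print(np.allclose(gram, np.diag(np.diag(gram))))   # off-diagonal dot products are 0
print(np.linalg.matrix_rank(S) == 3)               # full rank: linearly independent
```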
Theorem
Let {u1,…,up} be an orthogonal basis for a subspace W of Rn. For each y∈W, the weights in the linear combination
y=c1u1+⋯+cpup
are given by
cj=(y⋅uj)/(uj⋅uj), (j=1,…,p).
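A sketch of the weight computation for an illustrative orthogonal basis of a plane W in R^3 (vectors chosen only for this example):

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])      # orthogonal basis of W = Span{u1, u2}
u2 = np.array([1.0, -1.0, 0.0])
y = 2 * u1 - 3 * u2                 # a vector known to lie in W

c1 = (y @ u1) / (u1 @ u1)           # cj = (y · uj)/(uj · uj)
c2 = (y @ u2) / (u2 @ u2)
print(c1, c2)                       # 2.0 -3.0
assert np.allclose(c1 * u1 + c2 * u2, y)
```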
Theorem
An m×n matrix U has orthonormal columns if and only if UᵀU=I.
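For example (matrix chosen here for illustration), the two columns below are unit vectors and mutually orthogonal, and UᵀU comes out as the 2×2 identity:

```python
import numpy as np

U = np.column_stack([
    np.array([1.0, 1.0, 0.0]) / np.sqrt(2),   # unit length, orthogonal to the next column
    np.array([0.0, 0.0, 1.0]),
])
print(np.allclose(U.T @ U, np.eye(2)))         # True: orthonormal columns
```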
Theorem
Let U be an m×n matrix with orthonormal columns, and let x and y be in Rn. Then:
a. ∥Ux∥=∥x∥
b. (Ux)⋅(Uy)=x⋅y
c. (Ux)⋅(Uy)=0 if and only if x⋅y=0.
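Continuing with an illustrative U with orthonormal columns, a quick check of parts a and b (part c follows directly from b):

```python
import numpy as np

U = np.column_stack([
    np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
    np.array([0.0, 0.0, 1.0]),
])
x, y = np.array([2.0, -1.0]), np.array([0.5, 3.0])

assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))   # a. lengths preserved
assert np.isclose((U @ x) @ (U @ y), x @ y)                   # b. dot products preserved
# c. follows from b: Ux · Uy = 0 exactly when x · y = 0
```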
The Orthogonal Decomposition Theorem
Let W be a subspace of Rn. Then each y∈Rn can be written uniquely in the form
y=ŷ+z
where ŷ is in W and z is in W⊥. In fact, if {u1,…,up} is any orthogonal basis of W, then
ŷ=((y⋅u1)/(u1⋅u1))u1+⋯+((y⋅up)/(up⋅up))up
and
z=y−ŷ.
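A minimal sketch of the decomposition for an illustrative subspace and vector:

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])       # orthogonal basis of W
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([3.0, 1.0, 4.0])

y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2   # projection onto W
z = y - y_hat                                                   # component in W-perp

print(y_hat, z)                                        # [3. 1. 0.] [0. 0. 4.]
assert np.isclose(z @ u1, 0) and np.isclose(z @ u2, 0) # z is orthogonal to W
```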
Theorem
If y is in W=Span{u1,…,up}, then projWy=y.
The Best Approximation Theorem
Let W be a subspace of Rn, y any vector in Rn, and ŷ the orthogonal projection of y onto W. Then ŷ is the closest point in W to y, in the sense that
∥y−y^∥<∥y−v∥
for all v∈W distinct from ŷ.
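Numerically, the projection ŷ from the preceding example is at distance 4 from y, and no other point of W gets closer; a small check against a few randomly chosen competitors v in W (an illustration, not a proof):

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = np.array([3.0, 1.0, 4.0])
y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2

rng = np.random.default_rng(0)
for _ in range(5):                       # random points v in W
    a, b = rng.normal(size=2)
    v = a * u1 + b * u2
    assert np.linalg.norm(y - y_hat) <= np.linalg.norm(y - v)
```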
Theorem
If {u1,…,up} is an orthonormal basis for a subspace W of Rn, then
projWy=(y⋅u1)u1+(y⋅u2)u2+⋯+(y⋅up)up.
If U=[u1 u2 ⋯ up], then
projWy=UUᵀy, for all y∈Rn.
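With an orthonormal basis packed into U, the projection is just the matrix product UUᵀy; an illustrative check that it matches the term-by-term formula:

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)   # orthonormal basis of W
u2 = np.array([0.0, 0.0, 1.0])
U = np.column_stack([u1, u2])
y = np.array([3.0, 1.0, 4.0])

proj = (y @ u1) * u1 + (y @ u2) * u2           # sum of (y · ui) ui
print(np.allclose(U @ U.T @ y, proj))          # True: proj_W y = U Uᵀ y
```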