MATH 240

Mon. November 18th, 2019


Norm (cont.)

We said before that defining an inner product $\langle u,v\rangle$ gives us a norm $\lVert v\rVert=\sqrt{\langle v,v\rangle}$ – a notion of magnitude.

For $\R^n$, if the dot product $u^Tv$ is used as the inner product, then for $v=(x_1,...,x_n)$ we get $\lVert v\rVert=\sqrt{x_1^2+x_2^2+...+x_n^2}$, our common notion of Euclidean distance from the origin.

We can also extend this to find the distance from any other vector (not just the origin) by computing $\lVert u - v \rVert$.
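Ex. With $u=\begin{pmatrix} 3\\4 \end{pmatrix}$ and $v=\begin{pmatrix} 0\\1 \end{pmatrix}$ in $\R^2$ (an illustrative choice of vectors):

$$\lVert u\rVert=\sqrt{3^2+4^2}=5, \qquad \lVert u-v\rVert=\sqrt{(3-0)^2+(4-1)^2}=\sqrt{18}=3\sqrt{2}.$$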

However, when dealing with abstract vector spaces like $C[0,1]$, we also get these notions of magnitude and distance – how "big" a function is and how "far" apart two functions are – even if it's not intuitive at first.

By using $\int_0^1 fg$ as the inner product of two functions in $C[0,1]$, the distance between them is now $\sqrt{\int_0^1(f-g)^2}$. Now that we have a notion of distance between two functions, we can treat these functions as vectors and perform calculus on them – this is called functional analysis.
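Ex. For $f(x)=x$ and $g(x)=1$ in $C[0,1]$ (again just an illustrative pair of functions):

$$\lVert f-g\rVert=\sqrt{\int_0^1(x-1)^2\,dx}=\sqrt{\left[\tfrac{(x-1)^3}{3}\right]_0^1}=\sqrt{\tfrac{1}{3}}.$$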


Orthogonality

Definition. Two vectors $u,v$ are orthogonal if $\langle u,v\rangle = 0$. (You can visualize this as them being perpendicular to each other in the plane that contains them and the origin.)

For example, if you think of the standard basis of $\R^n$, $\left\{\begin{pmatrix} 1\\0\\0\\\vdots \end{pmatrix},\begin{pmatrix} 0\\1\\0\\\vdots \end{pmatrix},...\right\}$, you can see that any two distinct vectors in this set will be orthogonal to each other.

Also, every vector is orthogonal to the zero vector.


Orthogonal Complement

Definition. Let $W$ be a subspace of $V$. A vector $x$ in $V$ is said to be orthogonal to $W$ if $x$ is orthogonal to every $w$ in $W$. The collection of all vectors orthogonal to $W$ is denoted by $W^\perp$, which is called the orthogonal complement of $W$.

If $V=W$, then $W^\perp$ contains only the zero vector.

Theorem. Let $A$ be an $m\times n$ matrix. The orthogonal complement of $\text{Row}(A)$ is $\text{Nul}(A)$, and the orthogonal complement of $\text{Col}(A)$ is $\text{Nul}(A^T)$.
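A minimal numerical check of the first statement, assuming NumPy and SciPy are available (the matrix $A$ below is an arbitrary example):

```python
# Check that every row of A is orthogonal to every vector in Nul(A).
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])

N = null_space(A)   # columns form an orthonormal basis of Nul(A)

# If Row(A)^perp = Nul(A), then A @ N should be (numerically) zero.
print(np.allclose(A @ N, 0))  # True
```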


Theorem. If $W$ is spanned by $\{w_1,...,w_k\}$, and $x$ is orthogonal to $w_1,...,w_k$, then $x$ is in $W^\perp$.

Ex. If $V=\R^3$ and $W=\text{span}\left\{\begin{pmatrix} 1\\0\\0 \end{pmatrix},\begin{pmatrix} 0\\1\\0 \end{pmatrix}\right\}$, then $\begin{pmatrix} 0\\0\\3 \end{pmatrix}$ is in $W^\perp$ because it is orthogonal to both of those vectors.


Orthogonal Set

Definition. A set of vectors $\{u_1,...,u_p\}$ is said to be orthogonal if any two distinct vectors $u_i,u_j$ in the set are orthogonal to each other (and each $u_i\neq0$).


Orthogonal Basis

Theorem. If $S=\{u_1,...,u_p\}$ is an orthogonal set of vectors, then $S$ is a basis for $\text{span}(S)$ (i.e. $u_1,...,u_p$ are linearly independent).
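A quick justification (the standard argument): suppose $c_1u_1+\cdots+c_pu_p=0$. Taking the inner product of both sides with $u_j$ kills every cross term, since the vectors are pairwise orthogonal:

$$0=\langle c_1u_1+\cdots+c_pu_p,\,u_j\rangle=c_j\langle u_j,u_j\rangle,$$

and since $u_j\neq0$, we get $c_j=0$ for every $j$.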

Definition. An orthogonal basis for a subspace $W$ of $\R^n$ is a basis for $W$ that is also an orthogonal set.

It turns out that it is easier to calculate the coordinates of points in an orthogonal basis than those in a non-orthogonal basis.

Theorem. Let $\{u_1,...,u_p\}$ be an orthogonal basis for a subspace $W$ of $\R^n$. For each $y$ in $W$, the weights in the linear combination $y=c_1u_1+...+c_pu_p$ are given by $c_j=\dfrac{y\cdot u_j}{u_j \cdot u_j}$.
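A minimal sketch of this formula, assuming NumPy; the orthogonal basis $u_1,u_2$ of $\R^2$ and the vector $y$ below are just illustrative choices:

```python
# Coordinates of y relative to an orthogonal basis {u1, u2} of R^2,
# using c_j = (y . u_j) / (u_j . u_j).
import numpy as np

u1 = np.array([1.0, 1.0])
u2 = np.array([1.0, -1.0])    # u1 . u2 = 0, so {u1, u2} is an orthogonal set
y  = np.array([3.0, 1.0])

c1 = (y @ u1) / (u1 @ u1)     # = 4/2 = 2
c2 = (y @ u2) / (u2 @ u2)     # = 2/2 = 1

print(c1, c2)                         # 2.0 1.0
print(np.allclose(c1*u1 + c2*u2, y))  # True: y = 2*u1 + 1*u2
```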