
4.4 Orthogonal Projections and Orthogonal Subspaces

Splitting vectors into orthogonal pieces

Dept. of Electrical and Systems Engineering
University of Pennsylvania


Lecture notes

1 Reading

Material related to this page, as well as additional exercises, can be found in ALA 4.4.

2 Learning Objectives

By the end of this page, you should know:

  • the orthogonal projection of a vector onto a subspace
  • orthogonal subspaces
  • the relationship between the dimensions of a subspace and its orthogonal complement
  • the orthogonality of the four fundamental matrix subspaces and how they relate to solving a linear system

3 Introduction

We extend the idea of orthogonality between two vectors to orthogonality between subspaces. Our starting point is the idea of an orthogonal projection of a vector onto a subspace.

4 Orthogonal Projection

Let $V$ be a (real) inner product space, and let $W \subset V$ be a finite-dimensional subspace of $V$. The results we present are fairly general, but it may be helpful to think of $W$ as a subspace of $V = \mathbb{R}^m$.

Note from Definition 2 that this means $\mathbf{v}$ can be decomposed as the sum of its orthogonal projection $\mathbf{w} \in W$ and the vector $\mathbf{z} = \mathbf{v} - \mathbf{w}$, which is orthogonal to $W$ ($\mathbf{z} \perp W$), i.e., $\mathbf{v} = \mathbf{w} + (\mathbf{v} - \mathbf{w}) = \mathbf{w} + \mathbf{z}$.

When we have access to an orthonormal basis for $W \subset V$, constructing the orthogonal projection of $\mathbf{v} \in V$ onto $W$ becomes quite simple, as given below.
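As a quick numerical sketch of this decomposition (the vectors below are a made-up example, not from the text): given an orthonormal basis $\{\mathbf{u}_1, \mathbf{u}_2\}$ for a plane $W \subset \mathbb{R}^3$, the projection is $\mathbf{w} = \langle \mathbf{v}, \mathbf{u}_1 \rangle \mathbf{u}_1 + \langle \mathbf{v}, \mathbf{u}_2 \rangle \mathbf{u}_2$, and the residual $\mathbf{z} = \mathbf{v} - \mathbf{w}$ is orthogonal to $W$:

```python
import numpy as np

# Hypothetical example: W is the xy-plane in R^3, with orthonormal basis {u1, u2}.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
v = np.array([3.0, -2.0, 5.0])

# Orthogonal projection of v onto W: w = <v, u1> u1 + <v, u2> u2
w = (v @ u1) * u1 + (v @ u2) * u2

# The residual z = v - w is perpendicular to every basis vector of W
z = v - w
print(w)               # [ 3. -2.  0.]
print(z @ u1, z @ u2)  # 0.0 0.0
```

The same formula works for any finite-dimensional $W$: sum the inner product of $\mathbf{v}$ with each orthonormal basis vector times that basis vector.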

We will see shortly that the orthogonal projection of a vector onto a subspace is exactly what solving a least-squares problem computes, and it lies at the heart of machine learning and data science.

However, before that, we will explore the idea of orthogonal subspaces, and see that they provide a deep and elegant connection between the four fundamental subspaces of a matrix $A$ and whether a linear system $A\mathbf{x} = \mathbf{b}$ has a solution.
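As a small preview of the least-squares connection (the matrix and vector below are hypothetical): the least-squares solution $\hat{\mathbf{x}}$ of $A\mathbf{x} = \mathbf{b}$ makes $A\hat{\mathbf{x}}$ the orthogonal projection of $\mathbf{b}$ onto the column space of $A$, so the residual $\mathbf{b} - A\hat{\mathbf{x}}$ is orthogonal to every column of $A$:

```python
import numpy as np

# Hypothetical overdetermined system: 3 equations, 2 unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 0.0, 2.0])

# Least-squares solution: minimizes ||A x - b||
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)

# A @ x_hat is the projection of b onto col(A);
# the residual is orthogonal to col(A), i.e., A^T (b - A x_hat) = 0.
residual = b - A @ x_hat
print(np.allclose(A.T @ residual, 0))  # True
```

Geometrically, $A\hat{\mathbf{x}}$ is the point of $\mathrm{col}(A)$ closest to $\mathbf{b}$, which is exactly the orthogonal projection described above.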

5 Orthogonal Subspaces

An important geometric notion present in Example 2 is the orthogonal complement, which is defined below.

The orthogonal complement to a line is given in the figure below, which is discussed in Example 2.

Orthogonal subspaces

Another direct consequence of Figure 3 is that a subspace and its orthogonal complement have complementary dimensions.

In Example 3, $W \subset \mathbb{R}^3$ is a plane, with $\dim W = 2$. Hence, we can conclude that $\dim W^{\perp} = 1$, i.e., $W^{\perp}$ is a line, which is indeed what we saw previously.
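This dimension count can be checked numerically (the plane below is a made-up example): if $W \subset \mathbb{R}^3$ is spanned by the rows of a matrix $B$, then $W^{\perp}$ is the null space of $B$, which we can extract from the SVD.

```python
import numpy as np

# Hypothetical plane W in R^3, spanned by the rows of B.
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# W^perp = null space of B: the right singular vectors of B
# corresponding to zero singular values.
_, s, Vt = np.linalg.svd(B)
rank = int(np.sum(s > 1e-10))
W_perp = Vt[rank:]  # rows form a basis for W^perp

print(W_perp.shape[0])                # 1 -> dim W^perp = 3 - dim W
print(np.allclose(B @ W_perp.T, 0))   # True: W^perp is orthogonal to W
```

The complementary-dimension relationship $\dim W + \dim W^{\perp} = 3$ shows up as the basis for $W^{\perp}$ having exactly one vector.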

6 Orthogonality of the Fundamental Matrix Subspaces

We previously introduced the four fundamental subspaces associated with an $m \times n$ matrix $A$: the column, null, row, and left null spaces. We also saw that the null and row spaces are subspaces with complementary dimensions in $\mathbb{R}^n$, and the left null space and column space are subspaces with complementary dimensions in $\mathbb{R}^m$. Moreover, these pairs are orthogonal complements of each other with respect to the standard dot product.
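These orthogonality relations are easy to verify numerically (the matrix below is an arbitrary illustrative example): every null-space vector is orthogonal to every row of $A$, and every left-null-space vector is orthogonal to every column of $A$.

```python
import numpy as np

# Hypothetical 4 x 6 matrix of rank 2, built as a product of thin factors.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 6))

def null_space_basis(M, tol=1e-10):
    """Rows of the returned array form an orthonormal basis of {x : Mx = 0}."""
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return Vt[rank:]

N = null_space_basis(A)     # null space of A:      dim = n - rank = 6 - 2 = 4
LN = null_space_basis(A.T)  # left null space of A: dim = m - rank = 4 - 2 = 2

print(N.shape[0], LN.shape[0])     # 4 2  (complementary dimensions)
print(np.allclose(A @ N.T, 0))     # True: null space _|_ row space
print(np.allclose(A.T @ LN.T, 0))  # True: left null space _|_ column space
```

Note that $A\mathbf{x} = \mathbf{0}$ says exactly that $\mathbf{x}$ is orthogonal to every row of $A$, which is why the null space and row space are orthogonal complements.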

We will not go through the proof (although it is not hard), but instead focus on a very important practical consequence.
