
2.5 Kernel and Image

For our next trick, we'll make this vector disappear

Dept. of Electrical and Systems Engineering
University of Pennsylvania


Lecture notes

1 Reading

Material related to this page, as well as additional exercises, can be found in ALA Ch. 2.5 and LAA 4.2.

2 Learning Objectives

By the end of this page, you should know:

  • the kernel (null space) and image (column space) of a matrix
  • how the null space and column space relate to solutions of linear systems
  • how to apply the superposition principle to solve linear systems with different right-hand side vectors

3 Introduction

In applications of linear algebra, subspaces of $\mathbb{R}^n$ (and of general vector spaces $V$) typically arise in one of two ways:

(a) as the set of all solutions to a system of linear equations of the form $A \vv x = \vv 0$, called a homogeneous linear system, or (b) as the span of certain specified vectors.

(a) is known as the null space description of a subspace, while (b) is known as the column space or image description of a subspace.

We will see that these are intimately related to systems of linear equations.

4 Null Space of a Matrix

If we think of the function $f(\vv x) = A \vv x$ that maps $\vv x \mapsto A \vv x$, then $\textrm{Null}(A)$ is the subset of $\mathbb{R}^n$ that $f$ maps to $\vv 0$.

Now, you may be wondering why we are calling $\textrm{Null}(A)$ the null space. That’s because $\textrm{Null}(A)$ is a subspace!

We test it as follows. Suppose that $\vv u, \vv v \in \textrm{Null}(A)$ and $c, d \in \mathbb{R}$. We need to check whether $c \vv u + d \vv v \in \textrm{Null}(A)$, i.e., is it true that $A(c \vv u + d \vv v) = \vv 0$?

$$
\begin{align*}
A(c \vv u + d \vv v) &= c A \vv u + d A \vv v, && \text{(linearity of matrix multiplication)} \\
&= c \vv 0 + d \vv 0, && (\vv u, \vv v \in \textrm{Null}(A), \ \text{so} \ A \vv u = A \vv v = \vv 0) \\
&= \vv 0.
\end{align*}
$$

From (3), $\textrm{Null}(A)$ is a vector space! If $A \in \mathbb{R}^{m \times n}$, then $\textrm{Null}(A)$ is a subspace of $\mathbb{R}^n$ (where $\vv x$ lives). This property leads to the following incredibly important superposition principle for solutions to homogeneous linear systems.

5 Describing the Null Space

There is no obvious relationship between the entries of $A$ and $\textrm{Null}(A)$. Rather, it is defined implicitly via the condition that $\vv x \in \textrm{Null}(A)$ if and only if $A \vv x = \vv 0$. However, if we compute the general solution to $A \vv x = \vv 0$, this gives us an explicit description of $\textrm{Null}(A)$. This can be accomplished via Gaussian Elimination.
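To make this concrete, here is a small sketch of how a computer algebra system can carry out this Gaussian Elimination for us. The matrix below is a hypothetical example chosen for illustration; SymPy's `nullspace()` method returns a basis for $\textrm{Null}(A)$ obtained from the row-reduced form.

```python
import sympy as sp

# A hypothetical 3x4 matrix; its third row is the sum of the first two,
# so it has rank 2 and a two-dimensional null space.
A = sp.Matrix([[1, 2, 0, 1],
               [2, 4, 1, 1],
               [3, 6, 1, 2]])

# nullspace() performs Gaussian Elimination and returns a basis
# for Null(A) as a list of column vectors.
basis = A.nullspace()

# Every basis vector v satisfies A v = 0.
for v in basis:
    assert A * v == sp.zeros(3, 1)

print(len(basis))  # dimension of Null(A)
```

Any vector in $\textrm{Null}(A)$ is then a linear combination of the returned basis vectors, which is exactly the explicit description described above.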

6 The Column Space of $A$

We have seen previously that we can write the matrix-vector product $A \vv x$ as the linear combination

$$
A \vv x = x_1 \vv a_1 + x_2 \vv a_2 + \ldots + x_n \vv a_n,
$$

of the columns $\vv a_1, \vv a_2, \ldots, \vv a_n$ of $A = \begin{bmatrix} \vv a_1 & \vv a_2 & \ldots & \vv a_n \end{bmatrix}$, weighted by the elements $x_i$ of $\vv x$. By letting the coefficients $x_1, x_2, \ldots, x_n$ vary, we can describe the subspace spanned by the columns of $A$, aptly named the column space of $A$.
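The column-by-column view of the product can be checked numerically. In this sketch (with a small matrix made up for illustration), we build $x_1 \vv a_1 + x_2 \vv a_2 + x_3 \vv a_3$ by hand and confirm it matches `A @ x`:

```python
import numpy as np

# Hypothetical matrix and coefficient vector for illustration.
A = np.array([[1., 0., 2.],
              [0., 1., 1.],
              [1., 1., 0.]])
x = np.array([2., -1., 3.])

# Build the same vector as the weighted sum of A's columns:
# x_1*a_1 + x_2*a_2 + x_3*a_3.
combo = sum(x[i] * A[:, i] for i in range(A.shape[1]))

# The matrix-vector product agrees with the column combination.
assert np.allclose(A @ x, combo)
```

As $\vv x$ ranges over all of $\mathbb{R}^3$, the vectors `A @ x` sweep out exactly the span of the columns, i.e., $\textrm{Col}(A)$.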

7 The Complete Solution to $A \vv x = \vv b$

With an understanding of $\textrm{Null}(A)$ and $\textrm{Col}(A)$, we can completely characterize the solution set to $A \vv x = \vv b$.

We can specialize Theorem 3 to square matrices, which allows us to characterize whether $A$ is invertible via either its null space or column space.
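The null space criterion is easy to check in code. In this hedged sketch (the matrices are hypothetical examples), a square matrix is invertible exactly when its null space is trivial:

```python
import sympy as sp

# An invertible 2x2 matrix: Null(A) = {0}, so nullspace() is empty.
A_inv = sp.Matrix([[2, 1],
                   [1, 3]])

# A singular 2x2 matrix: second row = 2 * first row,
# so there is a nontrivial null direction.
A_sing = sp.Matrix([[1, 2],
                    [2, 4]])

assert A_inv.nullspace() == []        # invertible: only the trivial solution
assert len(A_sing.nullspace()) == 1   # singular: one null-space basis vector
```

Equivalently, invertibility means $\textrm{Col}(A) = \mathbb{R}^n$: the columns span all of $\mathbb{R}^n$, so $A \vv x = \vv b$ is solvable for every $\vv b$.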

8 The Superposition Principle

We already showed that for homogeneous systems $A \vv x = \vv 0$, superposition lets us generate new solutions by combining known solutions. For inhomogeneous systems $A \vv x = \vv b$, superposition lets us combine solutions for different right-hand side vectors $\vv b$.

Suppose we have solutions $\vv x_1^*$ and $\vv x_2^*$ to $A \vv x = \vv b_1$ and $A \vv x = \vv b_2$, respectively. Can we quickly build a solution to $A \vv x = c_1 \vv b_1 + c_2 \vv b_2$ for some $c_1, c_2 \in \mathbb{R}$?

The answer is superposition! Let’s try $\vv x^* = c_1 \vv x_1^* + c_2 \vv x_2^*$:

$$
\begin{align*}
A \vv x^* &= A(c_1 \vv x_1^* + c_2 \vv x_2^*) = c_1 (A \vv x_1^*) + c_2 (A \vv x_2^*) \\
&= c_1 \vv b_1 + c_2 \vv b_2.
\end{align*}
$$

It worked! From (18), $\vv x^* = c_1 \vv x_1^* + c_2 \vv x_2^*$ is a solution to $A \vv x = c_1 \vv b_1 + c_2 \vv b_2$. This is again the power of linear superposition at play.
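A quick numerical check of this derivation, using a hypothetical invertible matrix so that each system has a unique solution:

```python
import numpy as np

# Hypothetical data: one matrix, two right-hand sides with known solutions.
A = np.array([[2., 1.],
              [1., 3.]])
b1 = np.array([1., 0.])
b2 = np.array([0., 1.])

x1 = np.linalg.solve(A, b1)   # A x1 = b1
x2 = np.linalg.solve(A, b2)   # A x2 = b2

# Superposition: c1*x1 + c2*x2 solves A x = c1*b1 + c2*b2,
# without solving a third linear system.
c1, c2 = 3.0, -2.0
x_star = c1 * x1 + c2 * x2

assert np.allclose(A @ x_star, c1 * b1 + c2 * b2)
```

Note that once $\vv x_1^*$ and $\vv x_2^*$ are known, the combined solution costs only vector arithmetic, not another elimination.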

8.1 The General Form

The above idea can easily be extended to several right-hand sides (solutions to more than two $\vv b$ vectors).

If $\vv x_1^*, \vv x_2^*, \ldots, \vv x_k^*$ are solutions to $A \vv x = \vv b_1, A \vv x = \vv b_2, \ldots, A \vv x = \vv b_k$, then, for any choice of $c_1, c_2, \ldots, c_k \in \mathbb{R}$, a particular solution to

$$
A \vv x = c_1 \vv b_1 + c_2 \vv b_2 + \ldots + c_k \vv b_k
$$

is given by $\vv x^* = c_1 \vv x_1^* + c_2 \vv x_2^* + \ldots + c_k \vv x_k^*$. The general solution to (20) is then

$$
\vv x = \vv x^* + \vv n = c_1 \vv x_1^* + c_2 \vv x_2^* + \ldots + c_k \vv x_k^* + \vv n,
$$

where $\vv n \in \textrm{Null}(A)$.
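The general form can be verified end to end. In this sketch (all matrices and solutions are hypothetical, constructed for illustration), we pick two known solutions, superpose them into a particular solution, and then shift by a null-space vector:

```python
import sympy as sp

# Hypothetical rank-2 matrix, so Null(A) is nontrivial.
A = sp.Matrix([[1, 2, 0, 1],
               [2, 4, 1, 1],
               [3, 6, 1, 2]])

# Two known solutions and their right-hand sides (constructed by choosing
# x1, x2 first and defining b1 = A x1, b2 = A x2).
x1 = sp.Matrix([1, 0, 0, 0]); b1 = A * x1
x2 = sp.Matrix([0, 0, 1, 0]); b2 = A * x2

c1, c2 = 5, -3
x_star = c1 * x1 + c2 * x2    # particular solution to A x = c1 b1 + c2 b2
n = A.nullspace()[0]          # any element of Null(A): A n = 0

# Shifting the particular solution by n still solves the same system.
assert A * (x_star + n) == c1 * b1 + c2 * b2
```

Varying the coefficient on $\vv n$ (and, more generally, combining all null-space basis vectors) traces out the entire solution set $\vv x^* + \textrm{Null}(A)$.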
