
2.3 Span and Linear Independence

The building blocks of subspaces

Dept. of Electrical and Systems Engineering
University of Pennsylvania



1 Reading

Material related to this page, as well as additional exercises, can be found in ALA Ch. 2.3 and LAA 4.1, 4.3.

2 Learning Objectives

By the end of this page, you should know:

  • how to form linear combinations of vectors
  • what the span of a set of vectors is
  • what it means for a collection of vectors to be linearly dependent or independent
  • how to check for linear dependence and independence

3 Span and Linear Independence

3.1 Building Subspaces

One natural way of constructing a subspace is to start with some building blocks $\vv v_1, \dots, \vv v_k \in V$ from the vector space $V$ we are working in, and to consider all possible linear combinations of them.

Span

A plane and a line spanned by two vectors $\vv v_1, \vv v_2$ are shown in Figure 1. In the case of the line, there is some scalar $c \in \mathbb{R}$ such that $\vv v_1 = c \vv v_2$.
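The line case can be checked numerically: two vectors span only a line exactly when one is a scalar multiple of the other. A minimal sketch, assuming NumPy and that the first entry of $\vv v_2$ is nonzero:

```python
import numpy as np

# v1 and v2 span only a line when v1 = c * v2 for some scalar c.
v1 = np.array([2.0, 4.0, 6.0])
v2 = np.array([1.0, 2.0, 3.0])

c = v1[0] / v2[0]                 # candidate scalar (assumes v2[0] != 0)
collinear = np.allclose(v1, c * v2)
print(collinear)  # True: the two vectors span a line, not a plane
```

If the entries were not proportional, `np.allclose` would return `False` and the two vectors would span a plane instead.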

This key fact is not hard to check using the properties of vector addition and scalar multiplication, but it is a surprisingly powerful tool for generating useful subspaces, and for checking if a vector $\vv v \in V$ also lives in a subspace $W$.
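One way to test membership in a span numerically is to look for coefficients that reproduce the target vector. A sketch, assuming NumPy; the vectors here are illustrative, not from the notes:

```python
import numpy as np

# Is v in span{v1, v2}?  Solve the least-squares problem
# min_c ||A c - v|| with A = [v1 v2]; v lies in the span
# exactly when the best fit reproduces v (zero residual).
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
A = np.column_stack([v1, v2])

v = 2 * v1 - 3 * v2                      # in the span by construction
c, *_ = np.linalg.lstsq(A, v, rcond=None)
print(np.allclose(A @ c, v))             # True

w = np.array([0.0, 0.0, 1.0])            # not in the plane spanned by v1, v2
c, *_ = np.linalg.lstsq(A, w, rcond=None)
print(np.allclose(A @ c, w))             # False
```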

3.2 Linear Independence and Dependence

Linear dependence captures a notion of “redundancy” in a collection of vectors.

The condition (7) implies that we can write at least one of the $\vv v_i$ as a linear combination of the other vectors: hence, it does not add anything new to the span of the collection $\vv v_1, \ldots, \vv v_k$.
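This redundancy is easy to see on a concrete example. In the sketch below (NumPy assumed, vectors chosen for illustration), $\vv v_3 = \vv v_1 + 2\vv v_2$, so the coefficients $(1, 2, -1)$ give a nontrivial combination summing to zero:

```python
import numpy as np

# v3 adds nothing new to span{v1, v2}: it is itself a combination
# of v1 and v2, so a nontrivial combination of all three is zero.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + 2 * v2

combo = 1 * v1 + 2 * v2 - 1 * v3
print(np.allclose(combo, 0))  # True: {v1, v2, v3} is linearly dependent
```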

4 Checking Linear Independence in $\mathbb{R}^n$

Given the task of checking whether $\vv v_1, \ldots, \vv v_k \in \mathbb{R}^n$ are linearly dependent, we start by constructing the $n \times k$ matrix $A = \begin{bmatrix} \vv v_1 & \ldots & \vv v_k \end{bmatrix}$ with columns defined by the $\vv v_i$. Using matrix-vector multiplication, we interpret (7) as

$$A \vv c = c_1 \vv v_1 + \ldots + c_k \vv v_k, \quad \text{where} \quad \vv c = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_k \end{bmatrix},$$

that is, we write any linear combination of the vectors $\vv v_1, \dots, \vv v_k$ weighted by coefficients $c_1, \dots, c_k$ in terms of matrix multiplication. The above equation helps us relate linear algebraic systems to the geometry of the span of vectors.
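In this matrix form, the columns are independent exactly when $A\vv c = \vv 0$ forces $\vv c = \vv 0$, which happens if and only if $A$ has rank $k$. A sketch of this check, assuming NumPy; the helper name `independent` is ours, not from the notes:

```python
import numpy as np

def independent(vectors):
    """Columns v1, ..., vk are linearly independent iff A c = 0
    forces c = 0, i.e. iff rank(A) = k for A = [v1 ... vk]."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
print(independent([v1, v2]))           # True
print(independent([v1, v2, v1 - v2]))  # False: third column is redundant
```

Note that for badly scaled data the rank computation is sensitive to floating-point tolerance; `matrix_rank` accepts a `tol` argument to control this.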
