Material related to this page, as well as additional exercises, can be found in ALA Ch. 1.4, LAA Ch. 2.5, and ILA Ch. 2.4. These notes are mostly based on ALA Ch. 1.4.
In the above example, the (1,1) entry is 0: it is the first diagonal element, but it cannot serve as a pivot. The “problem” is that x does not appear in the first equation. This is actually a good thing, because we already have an equation involving only two variables, so we need to eliminate x from only one of the other two equations. Let us interchange the first two rows of (2):
which clearly does not change the solution set, but now we have pivots at (1,1) and (2,2). Interchanging the first and second rows is equivalent to swapping the first and second equations. Now we can proceed as in Gaussian Elimination, zeroing out the (3,1) and (3,2) entries using elementary row operations to get
The pivots are 2, 2, and 9/2, and solving via back substitution yields the solution x = 5/6, y = 5/6, z = 1/3.
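To make these steps concrete, here is a short NumPy sketch of the elimination above. The coefficient matrix is the same one used in the SciPy example at the end of this page; the right-hand side b = (2, 7, 3) is our assumption, chosen to be consistent with the stated solution x = 5/6, y = 5/6, z = 1/3.

# Gaussian Elimination with a row interchange (sketch; b is assumed)
import numpy as np

M = np.array([[0., 2., 1., 2.],    # augmented matrix [A | b]
              [2., 6., 1., 7.],
              [1., 1., 4., 3.]])

M[[0, 1]] = M[[1, 0]]                 # swap rows 1 and 2 (type 2 operation)
M[2] -= (M[2, 0] / M[0, 0]) * M[0]    # zero out entry (3,1) (type 1 operation)
M[2] -= (M[2, 1] / M[1, 1]) * M[1]    # zero out entry (3,2) (type 1 operation)
print("Upper triangular form:\n", M)  # pivots 2, 2, 4.5 along the diagonal

# Back substitution
z = M[2, 3] / M[2, 2]
y = (M[1, 3] - M[1, 2] * z) / M[1, 1]
x = (M[0, 3] - M[0, 1] * y - M[0, 2] * z) / M[0, 0]
print(x, y, z)                        # approximately 5/6, 5/6, 1/3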
This row-swapping operation is quite useful, so we’ll make it another elementary row operation (type 2), in addition to the type 1 operation of adding a multiple of one row to another row that we’ve been relying on so far.
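Both types of operations can be expressed as multiplication on the left by a suitable matrix; in particular, a row swap corresponds to a permutation matrix, which is how the matrix P will enter the picture below. A minimal illustration (the specific matrices are just for demonstration):

# Elementary row operations as left-multiplication (illustration)
import numpy as np

A = np.array([[0., 2., 1.],
              [2., 6., 1.],
              [1., 1., 4.]])

# Type 2: swapping rows 1 and 2 is left-multiplication by the permutation
# matrix obtained by swapping those rows of the identity.
P12 = np.array([[0., 1., 0.],
                [1., 0., 0.],
                [0., 0., 1.]])
print(P12 @ A)           # rows 1 and 2 of A interchanged

# Type 1: adding -1/2 times row 1 to row 3 is left-multiplication by the
# identity with the multiplier placed in entry (3,1).
E = np.eye(3)
E[2, 0] = -0.5
print(E @ (P12 @ A))     # entry (3,1) is now zero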
Square matrices A that can be reduced to upper triangular form with nonzero entries along the diagonal via Gaussian Elimination with pivoting play a distinguished role in linear algebra. As we’ll see shortly, when such a matrix is used to define a system of linear equations Ax=b, there is a unique solution for any choice of right-hand side b. Because these matrices are so important, we’ll give them a name and call them nonsingular. Why we call them this will become clearer later in the course; for now, it is just our name for the “nice” matrices A described below.
In contrast, a singular square matrix cannot be reduced to such upper triangular form by such row operations, because at some stage in the elimination procedure the diagonal pivot entry and all of the entries below it are zero.
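To make the definition concrete, here is a minimal sketch of a routine that attempts the reduction and reports whether it succeeds. The function name is_nonsingular and the tolerance tol are our own choices for illustration.

# Checking nonsingularity by attempting Gaussian Elimination with pivoting (sketch)
import numpy as np

def is_nonsingular(A, tol=1e-12):
    """Return True if Gaussian Elimination with row swaps succeeds on the
    square matrix A, and False if at some stage the pivot entry and every
    entry below it are (numerically) zero."""
    U = np.array(A, dtype=float)
    n = U.shape[0]
    for k in range(n):
        p = k + np.argmax(np.abs(U[k:, k]))     # best available pivot in column k
        if abs(U[p, k]) <= tol:
            return False                        # no usable pivot: singular
        U[[k, p]] = U[[p, k]]                   # type 2 operation: row swap
        for i in range(k + 1, n):
            U[i] -= (U[i, k] / U[k, k]) * U[k]  # type 1 operation
    return True

print(is_nonsingular([[0, 2, 1], [2, 6, 1], [1, 1, 4]]))  # True
print(is_nonsingular([[1, 2], [2, 4]]))                   # False: singular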
We summarize the discussion about unique solutions of systems of equations for nonsingular A in the theorem below.
We are able to prove the “if” part of this theorem now: when A is nonsingular, Gaussian Elimination with pivoting succeeds, which means that a solution exists and is unique. We’ll come back to prove the “only if” part later.
It’s important to remember that when linear algebra is applied in engineering, data science, economics, and scientific domains, it is done using computers. Computers store data using bits (0s and 1s), and as such have limited precision: they must eventually round off numbers, because they cannot keep an infinite number of decimal places. This matters because when the numbers you are working with span a very large range (i.e., some numbers are very big and others are very small), even very small rounding errors introduced by the limits of your computer can cause significant errors in your solutions. Here we’ll look at a simple example of such numerical ill-conditioning, and see that by being cognizant of these issues, we can find a workaround.
We’ll look at the simple system of two equations in two unknowns:
where the exact solution is x=10, y=20 (you can check this). Now let’s assume that we only have a very primitive computer for solving this system of equations, one that retains just three significant digits for any number we store in it. Of course, modern computers have much higher, but still finite, precision; the example below can be modified to produce similar issues on modern computers.
The actual augmented matrix, after being reduced to upper triangular form, is given below.
If we use back substitution with the rounded augmented matrix (7) to solve the system, we first get y = 3190/159 ≈ 20.1. That’s not too bad: we’re only off by 0.1 in y. But if we proceed to solve for x using this value of y, we get x ≈ −10! This is a big problem: a small error in y, i.e., 0.1, produced a huge error in x! The problem is caused by the small pivot 0.01 at (1,1). The small error in y is scaled up when evaluating the first equation, where we have to divide the expression by the small pivot 0.01. If we instead interchange the rows so that the pivot is 1 and proceed from there via Gaussian Elimination, the approximate solution is y ≈ 20.1, x ≈ 9.9, which is much more reasonable.
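The effect of the small pivot is easy to reproduce in code. The sketch below simulates a machine that keeps only three significant digits by rounding every intermediate result; the specific coefficients are our assumption, chosen to be consistent with the exact solution x=10, y=20 and the small (1,1) pivot of 0.01.

# Simulating three-significant-digit arithmetic (sketch; coefficients assumed)
import numpy as np

def round3(v):
    """Round to three significant digits, mimicking our primitive computer."""
    return float(f"{v:.3g}")

def solve_rounded(M):
    """Eliminate and back-substitute on the 2x3 augmented matrix M,
    rounding every intermediate result to three significant digits."""
    m = round3(M[1, 0] / M[0, 0])                                  # multiplier
    row2 = [round3(M[1, j] - round3(m * M[0, j])) for j in (1, 2)]  # eliminated row
    y = round3(row2[1] / row2[0])                                  # back substitution
    x = round3(round3(M[0, 2] - round3(M[0, 1] * y)) / M[0, 0])
    return x, y

M = np.array([[0.01, 1.6, 32.1],   # assumed system: 0.01x + 1.6y = 32.1
              [1.0,  0.6, 22.0]])  #                     x + 0.6y = 22
print("no pivoting:  ", solve_rounded(M))        # x is badly wrong
print("with pivoting:", solve_rounded(M[::-1]))  # rows swapped first: much better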
From Definition 1, every nonsingular matrix A can be reduced to upper triangular form using type 1 and type 2 operations. If we perform all the row interchanges first, using a permutation matrix P, then we obtain a regular matrix PA that admits an LU factorization. Hence we can define the permuted LU factorization of a nonsingular matrix A as follows.
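Here is a minimal from-scratch sketch of computing a permuted LU factorization PA = LU. It chooses the largest available entry in each column as the pivot (one common pivoting strategy, which also helps with the rounding issues above); the function name permuted_lu is ours.

# Permuted LU factorization PA = LU from scratch (sketch)
import numpy as np

def permuted_lu(A):
    """Return P, L, U with P a permutation matrix, L unit lower triangular,
    U upper triangular, and P @ A = L @ U."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    P, L, U = np.eye(n), np.eye(n), A.copy()
    for k in range(n - 1):
        p = k + np.argmax(np.abs(U[k:, k]))   # choose the largest pivot in column k
        if p != k:                            # type 2 operation: row swap
            U[[k, p]] = U[[p, k]]
            P[[k, p]] = P[[p, k]]
            L[[k, p], :k] = L[[p, k], :k]     # keep earlier multipliers consistent
        for i in range(k + 1, n):             # type 1 operations below the pivot
            L[i, k] = U[i, k] / U[k, k]
            U[i] -= L[i, k] * U[k]
    return P, L, U

A = np.array([[0., 2., 1.],
              [2., 6., 1.],
              [1., 1., 4.]])
P, L, U = permuted_lu(A)
print("max|PA - LU| =", np.max(np.abs(P @ A - L @ U)))   # should be ~0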
Given a permuted LU factorization PA = LU of the matrix A, we can solve the system of equations Ax = b by first using forward substitution to solve Lz = Pb, and then back substitution to solve Ux = z.
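Continuing the sketch above, the two substitution steps might look as follows. The helpers forward_sub and back_sub are our own names, permuted_lu is the sketch from before, and b = (2, 7, 3) is the right-hand side assumed in the earlier example.

# Solving Ax = b via PA = LU: forward then back substitution (sketch)
import numpy as np

def forward_sub(L, c):
    """Solve L z = c for z, where L is lower triangular with nonzero diagonal."""
    n = len(c)
    z = np.zeros(n)
    for i in range(n):
        z[i] = (c[i] - L[i, :i] @ z[:i]) / L[i, i]
    return z

def back_sub(U, z):
    """Solve U x = z for x, where U is upper triangular with nonzero diagonal."""
    n = len(z)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (z[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

A = np.array([[0., 2., 1.],
              [2., 6., 1.],
              [1., 1., 4.]])
b = np.array([2., 7., 3.])             # assumed right-hand side from the earlier example
P, L, U = permuted_lu(A)               # sketch defined above: PA = LU
x = back_sub(U, forward_sub(L, P @ b))
print(x)                               # approximately [5/6, 5/6, 1/3]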
Implementing the algorithms you learn in this class from scratch in NumPy is a great way to both gain a deeper understanding of the math and practice your coding skills. However, when applying these ideas in real-world situations, it is usually better to use built-in functions: these typically have hundreds, if not thousands, of engineering hours behind them, and have been optimized for reliability, performance, and scalability. In the code snippet below, we use the standard scientific computing package SciPy’s built-in lu implementation for the LU factorization (with pivoting). Note that this implementation is slightly different from the one described in our notes, and instead computes P, L, U such that A=PLU. By the end of the next section on matrix inverses, you’ll know how to relate this expression to the one we described above.
# Permuted LU factorization
import numpy as np
from scipy.linalg import lu
A = np.array([[0, 2, 1],
[2, 6, 1],
[1, 1, 4]])
P, L, U = lu(A)   # SciPy's convention: A = P @ L @ U
print("P: \n", P, "\nL: \n", L, "\nU: \n", U)
print(f'A - P L U =\n {A - P @ L @ U}')
# Let's try a bigger example! How big can you make n before it takes too long to solve?
n = 2000
A = np.random.randn(n,n)
P, L, U = lu(A)
print(f'max|(A - P L U)_ij| =\n {np.max(np.abs(A - P @ L @ U))}')
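If the goal is simply to solve Ax = b, SciPy also provides lu_factor and lu_solve, which package the factor-then-substitute workflow: factor once, then reuse the factorization for as many right-hand sides as you like. The right-hand side below is the one assumed in the example at the top of the page.

# Factor once, solve many times with SciPy's lu_factor / lu_solve
from scipy.linalg import lu_factor, lu_solve

A = np.array([[0., 2., 1.],
              [2., 6., 1.],
              [1., 1., 4.]])
b = np.array([2., 7., 3.])        # assumed right-hand side from the example above

lu_piv = lu_factor(A)             # compact storage of L, U, and the row swaps
x = lu_solve(lu_piv, b)
print(x)                          # approximately [5/6, 5/6, 1/3]
print(np.max(np.abs(A @ x - b)))  # residual should be ~0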