M340L Final Exam Solution, May 7 2003

1. Let
and let
. The augmented matrix [*A b*] is row-equivalent to
.

a) Find all solutions to *Ax*=*b*. Express your answers in
parametric form.

The row-reduced equations read , , , while and are free ( and ). Thus
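Reading a parametric solution off a row-reduced system can be sketched in code. The system below is hypothetical (the exam's actual matrix was not preserved in this transcript); it just illustrates the pivot/free-variable bookkeeping:

```python
# Hypothetical row-reduced system (the exam's actual matrix is not
# reproduced here):
#   x1 + 2*x2      + 3*x4 = 1
#            x3   -   x4 = 2
# Pivot variables: x1, x3; free variables: x2 = s, x4 = t.
def solve(s, t):
    x2, x4 = s, t
    x1 = 1 - 2*x2 - 3*x4    # solve the first pivot equation for x1
    x3 = 2 + x4             # solve the second pivot equation for x3
    return [x1, x2, x3, x4]

# Every choice of the free variables gives a solution:
print(solve(0, 0))   # [1, 0, 2, 0]  (the particular solution)
print(solve(1, 1))   # [-4, 1, 3, 1]
```

In parametric form this hypothetical solution set is x = (1, 0, 2, 0) + s(-2, 1, 0, 0) + t(-3, 0, 1, 1).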

b) Find a basis for the column space of *A*.

The first, second, and fifth columns are pivot columns, so our basis is

c) Find a basis for the null space of *A*.

This is the same system as part (a), only with zero on the right-hand side. Our basis is .

d) Find a basis for .

This is the same thing as the null space of *A*, so the answer is
the same as (c).

2. For each of these matrices, (i) find the determinant, (ii) state whether
the matrix is invertible, and (iii) either find the inverse of the matrix
(if it is invertible) or find a nonzero solution to *Ax*=0 (if it isn't).

a)

The determinant is 0, the matrix is NOT invertible, and a nontrivial
solution to *Ax*=0 is
.

b) .

The determinant is -1 (expand about the first column, and then about
the last column), the matrix is invertible, and the inverse (obtained by row-reducing
[*A I*]) is
.
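The [*A I*] row reduction used in part (b) can be sketched as a small Gauss-Jordan routine. The 2×2 matrix here is a stand-in, since the exam's matrix was not preserved in this transcript; the routine assumes its input is invertible:

```python
# Invert a matrix by row-reducing the augmented matrix [A | I].
# (Assumes A is invertible; the matrix below is a hypothetical example.)
def invert(A):
    n = len(A)
    # Build the augmented matrix [A | I].
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: bring the largest entry into the pivot position.
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        piv = M[col][col]
        M[col] = [x / piv for x in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]   # the right half is now A^{-1}

A = [[1.0, 2.0], [3.0, 5.0]]        # det = -1, so A is invertible
print(invert(A))                    # approximately [[-5, 2], [3, -1]]
```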

3. In the space of quadratic polynomials, let be the standard basis, and let be an alternate basis

a) Find the change-of-basis matrix
that converts from coordinates in the *B* basis to coordinates in
the *E* basis.

b) Find the change-of-basis matrix
that converts from coordinates in the *E* basis to coordinates in
the *B* basis.

(computed by row-reducing .)

c) Compute the coordinates, in the *B* basis, of the following four
vectors:

Just multiply
by the coordinates of these vectors in the *E* basis to get:
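The two change-of-basis directions can be illustrated numerically. The basis B below is hypothetical (the exam's basis was not preserved in this transcript): take B = {1, 1+t, 1+t+t²} with E = {1, t, t²}:

```python
# Hypothetical change-of-basis example in the space of quadratics.
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# Columns of P_EB are the E-coordinates of the B basis vectors:
P_EB = [[1, 1, 1],
        [0, 1, 1],
        [0, 0, 1]]
# P_BE is its inverse (found by hand here; in general, row-reduce [P_EB | I]):
P_BE = [[1, -1, 0],
        [0, 1, -1],
        [0, 0, 1]]

# Example: p(t) = 2 + 3t + t^2 has E-coordinates (2, 3, 1).
x_E = [2, 3, 1]
x_B = matvec(P_BE, x_E)
print(x_B)                   # [-1, 2, 1]: p = -1*1 + 2*(1+t) + 1*(1+t+t^2)
print(matvec(P_EB, x_B))     # [2, 3, 1]: the round trip recovers x_E
```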

4. Let
be defined by
, where *p*' is the derivative of *p* with respect to *t*.
Find the matrix of this linear transformation relative to the standard
basis.

Since *T*(1) = 0, *T*(*t*) = 1, and *T*(*t*^2) = 2*t*, the matrix (whose columns are the coordinates of these images) is [[0, 1, 0], [0, 0, 2], [0, 0, 0]].
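Relative to the standard basis E = {1, t, t²}, applying this matrix to a coefficient vector agrees with differentiating the polynomial directly:

```python
# The matrix of d/dt on quadratics, relative to E = {1, t, t^2}: its columns
# are the E-coordinates of T(1) = 0, T(t) = 1, and T(t^2) = 2t.
D = [[0, 1, 0],
     [0, 0, 2],
     [0, 0, 0]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def derivative(coeffs):
    # (a0 + a1*t + a2*t^2)' = a1 + 2*a2*t, padded back to length 3
    a0, a1, a2 = coeffs
    return [a1, 2 * a2, 0]

# p(t) = 4 - t + 5t^2, so p'(t) = -1 + 10t; both routes agree:
p = [4, -1, 5]
print(matvec(D, p))      # [-1, 10, 0]
print(derivative(p))     # [-1, 10, 0]
```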

5. a) Find a
matrix *A* whose eigenvalues are 1, 0 and -1, and whose corresponding
eigenvectors are
,
and
.

where and . Computing and multiplying gives .

b) Compute .

. But
is easily seen to equal *D*, so
.
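The construction A = PDP⁻¹ and the shortcut for odd powers can be checked numerically. The eigenvectors below are hypothetical stand-ins (the exam's vectors were not preserved in this transcript); only the eigenvalues 1, 0, -1 come from the problem:

```python
# Hypothetical diagonalization with eigenvalues 1, 0, -1 and eigenvectors
# (1,0,0), (1,1,0), (1,1,1) as the columns of P.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

P     = [[1, 1, 1], [0, 1, 1], [0, 0, 1]]
P_inv = [[1, -1, 0], [0, 1, -1], [0, 0, 1]]   # inverse found by hand
D     = [[1, 0, 0], [0, 0, 0], [0, 0, -1]]

A = matmul(matmul(P, D), P_inv)
print(A)              # [[1, -1, -1], [0, 0, -1], [0, 0, -1]]

# For any odd k, D^k = D (since 1^k = 1, 0^k = 0, (-1)^k = -1), so A^k = A:
A_cubed = matmul(matmul(A, A), A)
print(A_cubed == A)   # True
```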

6. In this problem, we model the spread of an epidemic. Let *S*(*k*)
be the number of sick people in week *k*, and let *I*(*k*)
be the number of people who are infected, but not yet sick. Each week,
a sick person will infect 6 others, while an infected person will become
sick. (In our model, nobody ever recovers or dies.) That is,
*S*(*k*+1) = *S*(*k*) + *I*(*k*) and *I*(*k*+1) = 6 *S*(*k*).

Letting *x*(*k*) = (*S*(*k*), *I*(*k*)), this boils down to *x*(*k*+1) = *A* *x*(*k*) with *A* = [[1, 1], [6, 0]].

a) Find the eigenvalues and corresponding eigenvectors of the matrix.

The eigenvalues are 3 and -2, with corresponding eigenvectors (1, 2) and (1, -3) (or any nonzero multiples of these choices).

b) In the long run, what will be the ratio of sick to infected (but not yet sick) people?

In the long run, the coefficient of the eigenvector for the eigenvalue 3 dominates, so the ratio of sick to infected approaches that of (1, 2), namely 1:2.

c) If there are 3 sick people and 1 infected person in week zero, how
many sick and infected people will there be in week *k*?

*x*(0) = (3, 1) = 2(1, 2) + (1, -3), so *x*(*k*) = 2·3^*k* (1, 2) + (-2)^*k* (1, -3). That is, *S*(*k*) = 2·3^*k* + (-2)^*k* and *I*(*k*) = 4·3^*k* - 3·(-2)^*k*.
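Under the model described above (each sick person stays sick and infects 6 others; each infected person becomes sick the next week), iterating the recurrence from week zero confirms the closed form:

```python
# Iterate S(k+1) = S(k) + I(k), I(k+1) = 6*S(k) from S(0) = 3, I(0) = 1,
# checking against the closed form
#   S(k) = 2*3^k + (-2)^k,   I(k) = 4*3^k - 3*(-2)^k.
S, I = 3, 1
for k in range(10):
    assert S == 2 * 3**k + (-2)**k
    assert I == 4 * 3**k - 3 * (-2)**k
    S, I = S + I, 6 * S      # this week's sick stay sick; infected become sick

# The sick:infected ratio approaches 1:2, the lambda = 3 eigenvector (1, 2):
print(S / I)                 # close to 0.5 by week 10
```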

7. Let , , and . Let .

a) Compute .

Since and are orthogonal, .

b) Find the distance from *b* to the plane *V*.

The distance is .

c) Find a least-squares solution to .

We have already seen that the projection of *b* is
, so our least-squares solution is
. You can also get this answer by solving
.
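All three parts of this problem can be illustrated with a hypothetical stand-in (the exam's vectors were not preserved in this transcript): project b onto V = span{v1, v2}, where v1 and v2 are orthogonal:

```python
# Hypothetical projection / distance / least-squares example.
from math import sqrt

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

v1, v2 = [1, 1, 0], [1, -1, 0]     # dot(v1, v2) == 0, so they are orthogonal
b = [2, 4, 3]

# (a) Orthogonality splits the projection into two one-dimensional pieces:
c1 = dot(b, v1) / dot(v1, v1)      # 3.0
c2 = dot(b, v2) / dot(v2, v2)      # -1.0
proj = [c1 * p + c2 * q for p, q in zip(v1, v2)]
print(proj)                        # [2.0, 4.0, 0.0]

# (b) The distance from b to V is the length of b - proj(b):
diff = [p - q for p, q in zip(b, proj)]
print(sqrt(dot(diff, diff)))       # 3.0

# (c) The least-squares solution to [v1 v2] x = b is just (c1, c2):
print((c1, c2))                    # (3.0, -1.0)
```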

8. a) Find an orthogonal basis for the column space of .

b) Find the projection of onto this space.

The vectors
give an orthogonal basis for this column space, so
, and
. The vector *b* was already IN the column space, so its projection
is itself.
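The Gram-Schmidt step and the "b is already in the column space" observation can be sketched on a hypothetical matrix (the exam's matrix was not preserved in this transcript):

```python
# Hypothetical version of problem 8: Gram-Schmidt on columns a1, a2, then
# project a vector b that already lies in their span.
def dot(x, y):
    return sum(p * q for p, q in zip(x, y))

a1, a2 = [1.0, 1.0, 0.0], [1.0, 0.0, 1.0]

# Gram-Schmidt: keep a1, subtract off a2's component along u1.
u1 = a1
u2 = [p - (dot(a2, u1) / dot(u1, u1)) * q for p, q in zip(a2, u1)]
print(u2)                       # [0.5, -0.5, 1.0]
assert dot(u1, u2) == 0         # the new basis is orthogonal

# b = a1 + a2 lies in the column space, so its projection is b itself:
b = [2.0, 1.0, 1.0]
proj = [dot(b, u1) / dot(u1, u1) * p + dot(b, u2) / dot(u2, u2) * q
        for p, q in zip(u1, u2)]
print(proj)                     # [2.0, 1.0, 1.0]
```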

9. True-False. Indicate whether each of these statements is true or false. If a statement is sometimes true and sometimes false, write "false". You do NOT have to justify your answers. There is no penalty for wrong answers, so go ahead and guess if you are unsure of your answer.

a) The equation *Ax*=*b* has a solution if, and only if, *b*
is in the span of the columns of *A*.

True.

b) The equation *Ax*=*b* has a solution if, and only if, the
augmented matrix [*A b*] has a pivot position in each row.

False. It has a solution if there is NOT a pivot in the last COLUMN.

c) If the
matrix *A* has a pivot in each column, then the columns of *A*
are linearly independent.

True.

d) If the
matrix *A* has a pivot in each column, then the columns of *A*
span
.

False. Having a pivot in each column implies the columns are linearly independent. Having a pivot in each ROW implies that they span .

e) Every linear transformation from to can be represented by an matrix.

True.

f) If *A*, *B* and *C* are matrices such that the product
*ABC* makes sense, then
.

True.

g) If the determinant of a square matrix *A* is zero, then *A*
is invertible.

False. If the determinant is zero, the matrix is NOT invertible.

h) Given vectors , the set of all linear combinations of these vectors is a subspace of .

True. The span of several vectors is always a subspace.

i) If two matrices are row-equivalent, then their column spaces have the same dimension.

True.

j) The row space of a matrix has the same dimension as the column space.

True. Both dimensions equal the rank of the matrix.

k) is a subspace of .

False. does not sit inside .

l) The range of *T*(*x*) = *Ax* is the same as the column
space of *A*.

True.

m) If
, then
is a basis for *H*.

False. They may not be linearly independent.

n) The dimension of the null space of a matrix is the number of free variables
in the equation *Ax*=0.

True.

o) If *A* is a
matrix, then the null space of *A* is at least 2-dimensional.

False. This would be true for a matrix, not a .

p) A number *c* is an eigenvalue of *A* if and only if *det*(*A*-*cI*)=0.

True.

q) If the characteristic polynomial of a
matrix *A* is
, then *A* is diagonalizable.

True. The eigenvalues are all distinct.

r) Every matrix has at least one eigenvalue, but it may be complex.

True.

s) If
, with *D* diagonal, then each column of *P* is an eigenvector
of *A*.

True.

t) If *Ax*=0, then *x* is in
.

False. *x* is in
.

u) If two nonzero vectors are orthogonal, they are linearly independent.

True.

v) If two nonzero vectors are linearly independent, they are orthogonal.

False. (e.g., and .)
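One concrete pair (the exam's own example did not survive this transcript): u = (1, 0) and v = (1, 1) are linearly independent but not orthogonal:

```python
# u and v are independent (nonzero 2x2 determinant) but not orthogonal
# (nonzero dot product).
u, v = (1, 0), (1, 1)
dot = u[0] * v[0] + u[1] * v[1]
det = u[0] * v[1] - u[1] * v[0]     # determinant of the matrix [u v]
print(dot)   # 1 -> nonzero, so u and v are NOT orthogonal
print(det)   # 1 -> nonzero, so u and v ARE linearly independent
```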

w) If , where and , then .

True.

x) The equation *Ax*=*b* always has a least-squares solution,
no matter what *A* and *b* are.

True. Least-squares solutions always exist.

y) If
, then *Ax*=*b* has a unique least-squares solution.

False. Uniqueness has to do with the columns of *A* being linearly
independent.
