Term
The row space of A is the same as the column space of A^T. |
|
Definition
true, the rows of A are the columns of A^T, so Row A = Col A^T. |
|
|
Term
If B is any echelon form of A, and if B has 3 nonzero rows, then the first 3 rows of A form a basis for Row A. |
|
Definition
false, row operations do not preserve dependence relations, so it would be wrong to conclude that the first 3 rows of A are linearly independent just because B has 3 nonzero rows. The nonzero rows of B (here its first 3 rows) form a basis for Row A. |
|
|
Term
The dimensions of the row space and the column space of A are the same, even if A is not square. |
|
Definition
true, both dimensions equal rank A, the number of pivot positions in A. |
|
|
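
The equality of the two dimensions can be spot-checked numerically. A minimal numpy sketch (the 3 x 4 matrix is made up for illustration):

    import numpy as np

    # A non-square matrix: row 2 is twice row 1, so the rank is 2.
    A = np.array([[1., 2., 3., 4.],
                  [2., 4., 6., 8.],
                  [0., 1., 1., 0.]])

    # dim Row A = dim Col A = rank A, even though A is 3 x 4.
    print(np.linalg.matrix_rank(A))    # 2
    print(np.linalg.matrix_rank(A.T))  # 2, the same
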
Term
The sum of the dimensions of the row space and the null space of A equals the number of rows in A. |
|
Definition
false, the sum equals the number of columns in A. |
|
|
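
The Rank Theorem behind this card can be checked on the same illustrative matrix; here the nullity is counted from the singular values rather than assumed:

    import numpy as np

    A = np.array([[1., 2., 3., 4.],
                  [2., 4., 6., 8.],
                  [0., 1., 1., 0.]])

    rank = np.linalg.matrix_rank(A)

    # Count the (numerically) nonzero singular values;
    # dim Nul A is the number of columns minus that count.
    s = np.linalg.svd(A, compute_uv=False)
    nullity = A.shape[1] - np.sum(s > 1e-10)

    print(rank, nullity)                 # 2 2
    print(rank + nullity == A.shape[1])  # True: 4 columns, not 3 rows
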
Term
On a computer, row operations can change the apparent rank of a matrix. |
|
Definition
true, roundoff error can leave tiny nonzero entries where exact arithmetic would produce zeros (or vice versa), so the apparent rank can change. |
|
|
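
A sketch of the roundoff phenomenon, with rows made up to trigger floating-point residue:

    import numpy as np

    # In exact arithmetic row 3 equals row 1 + row 2, so the rank is 2.
    a1 = np.array([0.1, 0.3, 0.7])
    a2 = np.array([0.2, 0.4, 0.9])
    A = np.vstack([a1, a2, a1 + a2])

    # Naive row replacement meant to zero out row 3:
    r3 = A[2] - A[0] - A[1]
    print(r3)                         # tiny residue like 2.8e-17, not exact zeros
    print(np.count_nonzero(r3) == 0)  # False: the "zero row" is not exactly zero

    # An SVD-based rank with a sensible tolerance still reports 2.
    print(np.linalg.matrix_rank(A))   # 2

Counting exactly-zero rows after the elimination step would report an apparent rank of 3 here; a tolerance-based rank computation still reports 2.
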
Term
If B is any echelon form of A, then the pivot columns of B form a basis for the column space of A. |
|
Definition
false, the pivot columns of A itself form a basis for Col A; row operations change the column space, so the pivot columns of B generally span a different space. |
|
|
Term
Row operations preserve the linear dependence relations among the rows of A. |
|
Definition
false, row operations can change the linear dependence relations among the rows; the relations that hold among the rows of an echelon form need not hold among the rows of A (see the sketch below). |
|
|
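
A small sympy sketch of how reduction rewires a dependence relation (the matrix is made up for illustration):

    import sympy as sp

    # Among the rows of A: a3 = a1 + a2, a nontrivial dependence relation.
    A = sp.Matrix([[1, 2],
                   [3, 4],
                   [4, 6]])

    B, pivots = A.rref()
    print(B)  # rows (1, 0), (0, 1), (0, 0)

    # The relation among B's rows is b3 = 0, which says nothing about
    # how a3 depends on a1 and a2; the relation was not preserved.
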
Term
The dimension of the null space of A is the number of columns of A that are not pivot columns. |
|
Definition
true, dim Nul A equals the number of free variables in Ax = 0, one for each non-pivot column. |
|
|
Term
The row space of A^T is the same as the column space of A. |
|
Definition
true, the rows of A^T are the columns of A, so Row A^T = Col A. |
|
|
Term
If A and B are row equivalent, then their row spaces are the same. |
|
Definition
true, the rows of each matrix are linear combinations of the rows of the other, so the two row spaces coincide. |
|
|
Term
The columns of the change-of-coordinates matrix P_{C<-B} are B-coordinate vectors of the vectors in C. |
|
Definition
false, the columns are the C-coordinate vectors of the vectors in the basis B. |
|
|
Term
If V = R^n and C is the standard basis for V, then P_{C<-B} is the same as the change-of-coordinates matrix P_B. |
|
Definition
true, when C is the standard basis, the C-coordinate vector of each vector in B is the vector itself, so P_{C<-B} = [b1 ... bn] = P_B. |
|
|
Term
The columns of P_{C<-B} are linearly independent. |
|
Definition
true, the columns are the coordinate vectors of the linearly independent set B, so they are linearly independent and P_{C<-B} is invertible. |
|
|
Term
If V = R^2, B = {b1, b2}, and C = {c1, c2}, then row reduction of [c1 c2 b1 b2] to [I P] produces a matrix P that satisfies [x]_B = P[x]_C for all x in V. |
|
Definition
false, the P produced this way is P_{C<-B}, which satisfies [x]_C = P[x]_B for all x in V. |
|
|
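
Row reducing [c1 c2 b1 b2] to [I P] amounts to solving CP = B, so a numpy sketch can confirm which direction P converts (the bases are made up for illustration):

    import numpy as np

    b1, b2 = np.array([1., 2.]), np.array([3., 5.])
    c1, c2 = np.array([1., 0.]), np.array([1., 1.])
    B = np.column_stack([b1, b2])
    C = np.column_stack([c1, c2])

    # Row reduction of [C | B] to [I | P] computes P = C^-1 B = P_{C<-B}.
    P = np.linalg.solve(C, B)

    x_B = np.array([2., 1.])          # B-coordinates of some x
    x = B @ x_B                       # the vector itself
    x_C = np.linalg.solve(C, x)       # its C-coordinates
    print(np.allclose(P @ x_B, x_C))  # True: [x]_C = P [x]_B, not the reverse
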
Term
If Ax = λx for some vector x, then λ is an eigenvalue of A. |
|
Definition
false, the vector x must be nonzero; x = 0 satisfies Ax = λx for every scalar λ. |
|
|
Term
A matrix A is not invertible if and only if 0 is an eigenvalue of A. |
|
Definition
true, A fails to be invertible exactly when Ax = 0 has a nontrivial solution, which says precisely that 0 is an eigenvalue of A. |
|
|
Term
A number c is an eigenvalue of A if and only if the equation (A - cI)x = 0 has a nontrivial solution. |
|
Definition
true, a nontrivial solution of (A - cI)x = 0 is a nonzero x with Ax = cx, which is the definition of c being an eigenvalue. |
|
|
Term
Finding an eigenvector of A may be difficult, but checking whether a given vector is in fact an eigenvector is easy. |
|
Definition
true, finding eigenvectors requires solving a system, but checking only requires computing Av and seeing whether it is a scalar multiple of v. |
|
|
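
Checking a candidate eigenvector really is one matrix-vector product; a sketch with a made-up 2 x 2 matrix:

    import numpy as np

    A = np.array([[4., -2.],
                  [1.,  1.]])
    v = np.array([2., 1.])      # candidate eigenvector

    Av = A @ v                  # gives [6., 3.] = 3 * v
    lam = Av[0] / v[0]          # candidate eigenvalue (v[0] is nonzero here)
    print(np.allclose(Av, lam * v), lam)  # True 3.0: v is an eigenvector
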
Term
To find the eigenvalues of A, reduce A to echelon form |
|
Definition
false, row operations generally change the eigenvalues, so an echelon form of A does not display them; the eigenvalues are the roots of the characteristic equation det(A - λI) = 0. Row reducing A - λI (not A) helps find the eigenvectors. |
|
|
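
A sketch of why echelon form is the wrong tool for eigenvalues (the matrix is made up for illustration): a single row replacement already changes them.

    import numpy as np

    A = np.array([[1., 2.],
                  [3., 4.]])

    # R2 <- R2 - 3*R1 puts A in echelon form.
    E = np.array([[1., 2.],
                  [0., -2.]])

    print(np.linalg.eigvals(A))  # approx 5.37 and -0.37 (order may vary)
    print(np.linalg.eigvals(E))  # 1 and -2: different eigenvalues
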
Term
If Ax = λx for some scalar λ, then x is an eigenvector of A. |
|
Definition
false, x must also be nonzero; by definition an eigenvector is a nonzero vector. |
|
|
Term
If v1 and v2 are linearly independent eigenvectors, then they correspond to distinct eigenvalues |
|
Definition
false, two linearly independent eigenvectors can correspond to the same eigenvalue, e.g. any two independent vectors in a 2-dimensional eigenspace. |
|
|
Term
A steady-state vector for a stochastic matrix is actually an eigenvector. |
|
Definition
true, a steady-state vector q is a nonzero probability vector with Pq = q, so it is an eigenvector of P corresponding to the eigenvalue 1. |
|
|
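
A numpy sketch of a steady state as an eigenvector for eigenvalue 1 (the stochastic matrix is made up for illustration):

    import numpy as np

    # A column-stochastic matrix: each column sums to 1.
    P = np.array([[0.9, 0.5],
                  [0.1, 0.5]])

    w, V = np.linalg.eig(P)
    i = np.argmin(np.abs(w - 1.0))  # locate the eigenvalue 1
    q = V[:, i] / V[:, i].sum()     # rescale that eigenvector to a probability vector

    print(q)                      # approx [0.833, 0.167]
    print(np.allclose(P @ q, q))  # True: Pq = q
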
Term
The eigenvalues of a matrix are on its main diagonal |
|
Definition
false, that holds only when the matrix is triangular; e.g. [[0, 1], [1, 0]] has zeros on its diagonal but eigenvalues 1 and -1. |
|
|
Term
An eigenspace of A is a null space of a certain matrix |
|
Definition
true, the eigenspace of A corresponding to λ is Nul(A - λI). |
|
|
Term
The determinant of A is the product of the diagonal entries of A. |
|
Definition
false, this is only true if A is triangular |
|
|
Term
An elementary row operation on A does not change the determinant. |
|
Definition
false, interchanging two rows (an elementary row operation) reverses the sign of the determinant, and scaling a row scales it; only row replacement leaves the determinant unchanged. |
|
|
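
How each elementary row operation moves the determinant, on a made-up 2 x 2 matrix:

    import numpy as np

    A = np.array([[1., 2.],
                  [3., 4.]])
    print(np.linalg.det(A))              # -2

    swap = A[[1, 0]]                     # interchange rows
    print(np.linalg.det(swap))           # 2: sign reversed

    scaled = A.copy(); scaled[0] *= 5.   # scale a row by 5
    print(np.linalg.det(scaled))         # -10: determinant scaled by 5

    repl = A.copy(); repl[1] -= 3 * repl[0]  # row replacement
    print(np.linalg.det(repl))           # -2: unchanged
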
Term
If λ+5 is a factor of the characteristic polynomial of A, then 5 is an eigenvalue of A. |
|
Definition
false, -5 is an eigenvalue: the factor λ + 5 vanishes at λ = -5, so -5 is a root of the characteristic polynomial. |
|
|
Term
If A is a square matrix with columns a1, a2, a3, then detA equals the volume of the parallelepiped determined by a1, a2, a3. |
|
Definition
false, |det A| equals the volume of the parallelepiped determined by a1, a2, a3; det A itself may be negative. |
|
|
Term
The multiplicity of a root r of the characteristic equation of A is called the algebraic multiplicity of r as an eigenvalue of A |
|
Definition
true, this is the definition of the algebraic multiplicity of the eigenvalue r. |
|
|
Term
A row replacement operation on A does not change the eigenvalues. |
|
Definition
false, row operations, including row replacement, usually change the eigenvalues of a matrix (see the echelon-form sketch above). |
|
|
Term
A is diagonalizable if A = PDP^-1 for some matrix D and some invertible matrix P. |
|
Definition
false, D must be a diagonal matrix; with an arbitrary D the equation is vacuous (take P = I and D = A). |
|
|
Term
If Rn has a basis of eigenvectors of A, then A is diagonalizable. |
|
Definition
true, put the n basis eigenvectors in the columns of P and the corresponding eigenvalues on the diagonal of D; then A = PDP^-1. |
|
|
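
A numpy sketch of the diagonalization this card describes (reusing the made-up matrix from the eigenvector check above):

    import numpy as np

    A = np.array([[4., -2.],
                  [1.,  1.]])

    w, P = np.linalg.eig(A)  # eigenvalues, and eigenvectors as columns of P
    D = np.diag(w)

    # A has two distinct eigenvalues (2 and 3), so its eigenvectors form
    # a basis of R^2, P is invertible, and A = P D P^-1.
    print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
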
Term
A is diagonalizable if and only if A has n eigenvalues, counting multiplicities. |
|
Definition
false, every n x n matrix has n eigenvalues counting multiplicities (over the complex numbers), yet not every matrix is diagonalizable; the correct criterion is n linearly independent eigenvectors. Having n distinct eigenvalues is sufficient but not necessary. |
|
|
Term
If A is diagonalizable, then A is invertible. |
|
Definition
false, A can be diagonalizable and not invertible; e.g. the zero matrix is diagonal, hence diagonalizable, but not invertible. |
|
|
Term
A is diagonalizable if A has n eigenvectors. |
|
Definition
false, A must have n linearly independent eigenvectors; every matrix has infinitely many eigenvectors, since any nonzero multiple of an eigenvector is again an eigenvector. |
|
|
Term
If A is diagonalizable, then A has n distinct eigenvalues. |
|
Definition
false, this is the converse of Theorem 6; a matrix can be diagonalizable without having n distinct eigenvalues, e.g. the n x n identity matrix. |
|
|
Term
If AP=PD with D diagonal, then the nonzero columns of P must be eigenvectors of A. |
|
Definition
true, AP = PD says Ap_j = d_jj p_j for each column p_j of P, so every nonzero column of P is an eigenvector of A. |
|
|
Term
If A is invertible, then A is diagonalizable. |
|
Definition
false, a matrix can be invertible and not diagonalizable, e.g. [[1, 1], [0, 1]], which is invertible but has only a 1-dimensional eigenspace. |
|
|
Term
For any scalar c, u·(cv) = c(u·v) |
|
Definition
true, this is one of the basic algebraic properties of the dot product: u·(cv) = c(u·v) = (cu)·v. |
|
|
Term
If the distance from u to v equals the distance from u to -v, then u and v are orthogonal. |
|
Definition
true, squaring both distances gives ||u - v||^2 = ||u + v||^2; expanding both sides leaves -2u·v = 2u·v, so u·v = 0. |
|
|
Term
For a square matrix A, vectors in Col A are orthogonal to vectors in Nul A |
|
Definition
false, vectors in Col A need not be orthogonal to vectors in Nul A (it is Row A that is orthogonal to Nul A). Counterexample: A = [[1, 1], [0, 0]]; Col A is spanned by (1, 0), Nul A by (1, -1), and (1, 0)·(1, -1) = 1 ≠ 0. |
|
|
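
The counterexample in the card, verified numerically:

    import numpy as np

    A = np.array([[1., 1.],
                  [0., 0.]])

    col = A[:, 0]              # (1, 0) spans Col A
    nul = np.array([1., -1.])  # in Nul A, since A @ nul = 0

    print(A @ nul)    # [0. 0.]
    print(col @ nul)  # 1.0, not 0: Col A is not orthogonal to Nul A
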
Term
If vectors v1, ..., vp span a subspace W and if x is orthogonal to each vj for j = 1, ..., p, then x is in W^⊥. |
|
Definition
true, every w in W is a linear combination of v1, ..., vp, so x·w = 0 for all w in W; hence x is in W^⊥. |
|
|
Term
For any scalar c, ||cv|| = c||v|| |
|
Definition
false, ||cv|| = |c| ||v||, not c||v|| |
|
|
Term
If x is orthogonal to every vector in a subspace W, then x is in W^⊥. |
|
Definition
true, this is the definition of the orthogonal complement W^⊥. |
|
|
Term
If ||u||^2 + ||v||^2 = ||u + v||^2, then u and v are orthogonal. |
|
Definition
true, expanding gives ||u + v||^2 = ||u||^2 + 2u·v + ||v||^2, so the equality forces u·v = 0 (the converse of the Pythagorean Theorem). |
|
|
Term
For an m x n matrix A, vectors in the null space of A are orthogonal to vectors in the row space of A. |
|
Definition
true, if x is in Nul A then Ax = 0, so the dot product of every row of A with x is 0; hence x is orthogonal to each row and to all of Row A. |
|
|
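
A closing numpy sketch of this orthogonality (reusing the illustrative 3 x 4 matrix from the rank cards): every null-space vector is perpendicular to every row.

    import numpy as np

    A = np.array([[1., 2., 3., 4.],
                  [2., 4., 6., 8.],
                  [0., 1., 1., 0.]])

    # Orthonormal basis for Nul A: the right singular vectors whose
    # singular values are (numerically) zero.
    U, s, Vt = np.linalg.svd(A)
    null_basis = Vt[np.sum(s > 1e-10):]

    # Each row of A dotted with each null-space basis vector is 0.
    print(np.allclose(A @ null_basis.T, 0))  # True
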