Term
Algebraic multiplicity for eigenvalue λ |
|
Definition
The number of times λ occurs as a root of the characteristic equation det(A − λI) = 0. |
|
|
Term
Basis of the eigenspace for eigenvalue λ |
Definition
A basis of the set of vectors that solve Ax = λx (so if the solutions are spanned by x = (1,0,0) and (0,0,1), the basis is {(1,0,0), (0,0,1)}) |
|
|
Term
Markov matrix |
Definition
n x n matrix with all non-negative entries and each column summing to one |
|
|
Term
Let A be a Markov matrix. Then |
|
Definition
1 is an eigenvalue of A and any other eigenvalue λ satisfies |λ|≤1 |
|
|
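The Markov-matrix card can be checked numerically; a minimal sketch (the 2x2 matrix is my own example) of why 1 is always an eigenvalue:

```python
# Sketch: a Markov matrix has column sums equal to 1, which makes the
# all-ones row vector a left eigenvector with eigenvalue 1.

A = [[0.9, 0.2],
     [0.1, 0.8]]  # non-negative entries, each column sums to one

n = len(A)
col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]

# ones^T A = ones^T exactly when every column sums to 1, so 1 is an
# eigenvalue of A^T and therefore of A (A and A^T share eigenvalues).
assert all(abs(s - 1.0) < 1e-12 for s in col_sums)
```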
Term
If A is a matrix with real entries and λ is a complex eigenvalue, |
|
Definition
The complex conjugate of λ is also an eigenvalue of A |
|
|
Term
A linear system has either |
|
Definition
exactly one solution, no solution, or infinitely many solutions. |
|
|
Term
Pivot variable vs free variable |
|
Definition
A pivot variable corresponds to a pivot column; every other variable is a free variable. |
|
|
Term
A linear system is consistent if ___ |
|
Definition
Its echelon form does not have a row of the form [0, 0, ..., 0 | b] with b a nonzero number. |
|
|
Term
Determine if w is in Span(u,v) (u,v,w are vectors)? |
|
Definition
Check whether the linear system with augmented matrix [u v | w] (vectors as columns) is consistent |
|
|
Term
m x n matrix (how many rows and columns?) |
|
Definition
m rows and n columns. |
|
Term
Which rules apply for matrix multiplication? Which don't? |
|
Definition
(a) A(BC) = (AB)C (associative law of multiplication)
(b) A(B + C) = AB + AC, (B + C)A = BA + CA (distributive laws)
In general AB != BA: matrix multiplication is not commutative. |
|
|
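A small sketch of the card above (matrices as plain lists of lists, an illustrative choice): associativity holds, commutativity fails.

```python
# Naive matrix product, enough to test the multiplication laws.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 2]]

assert matmul(A, matmul(B, C)) == matmul(matmul(A, B), C)  # associative
assert matmul(A, B) != matmul(B, A)                        # not commutative
```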
Term
elementary matrix, permutation matrix, and how they relate |
|
Definition
elementary matrix - obtained by performing a single elementary row operation on an identity matrix.
permutation matrix - obtained by performing row exchanges on an identity matrix |
|
|
Term
Determine if a matrix A has LU decomp |
|
Definition
If A can be transformed into echelon form without the use of row exchanges, then A has LU factorization. |
|
|
Term
Find the LU decomposition of A |
Definition
1) Find U by row reducing A to echelon form (row replacements only).
2) Build L by inserting each row-operation multiplier, with opposite sign, into the corresponding below-diagonal entry. |
|
|
Term
Solve Ax = b using LU decomp |
|
Definition
1) Solve Lc = b (forward substitution) 2) Solve Ux = c (back substitution) |
|
|
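The two solve steps in the card above can be sketched in code, assuming L (unit lower triangular) and U (upper triangular) are already known; the 2x2 example is my own.

```python
# Solve Lc = b by forward substitution (assumes 1s on L's diagonal).
def forward_sub(L, b):
    c = []
    for i in range(len(b)):
        c.append(b[i] - sum(L[i][j] * c[j] for j in range(i)))
    return c

# Solve Ux = c by back substitution.
def back_sub(U, c):
    n = len(c)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (c[i] - s) / U[i][i]
    return x

# Example: A = LU with L = [[1,0],[2,1]], U = [[2,1],[0,3]], so A = [[2,1],[4,5]].
b = [3.0, 9.0]
c = forward_sub([[1, 0], [2, 1]], b)
x = back_sub([[2, 1], [0, 3]], c)
```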
Term
Suppose A and B are invertible |
|
Definition
AB is invertible and (AB)^-1 = B^-1 A^-1. A^T and B^T are also invertible. |
|
|
Term
Vector space |
Definition
A set of objects closed under linear combinations
(ex: matrices, polynomials) |
|
|
Term
Subspace H of a vector space V |
Definition
H is a subspace of V if:
It contains the zero vector of V; u + v is in H whenever u and v are in H; cu is in H whenever u is in H |
|
|
Term
Nullspace (what it is and how to find) |
|
Definition
Set of solutions to Ax = 0.
To find:
1) Row reduce the augmented matrix [A | 0] to echelon form.
2) Write the general solution as a linear combination with the free variables as coefficients (ex:
(x1,x2,x3) = (1,2,3)x1 + ...
3) Nul(A) = Span of those vectors |
|
|
Term
Determine for which vectors b the system Ax = b is consistent |
Definition
Augment with a general vector (b1, b2, b3), row reduce, and see when the system is consistent |
|
|
Term
A single non-zero vector v1 is always linearly _____ |
|
Definition
independent. |
|
Term
Nul(A) and solutions to Ax = b |
|
Definition
If xp is one particular solution (Axp = b), then
xp + Nul(A) gives all solutions to Ax = b. |
|
|
Term
The columns of A are linearly independent means ___ (3 things) |
|
Definition
Ax = 0 has only the trivial solution x = 0; Nul(A) = {0}; A has n pivots (a pivot in every column) |
|
|
Term
Vectors v1, . . . , vp containing the zero vector are linearly ____ |
|
Definition
|
|
Term
A set of vectors{v1, . . . ,vp} in V is a basis of V if |
|
Definition
V = span(v1, ..., vp) and the vectors are linearly independent |
|
|
Term
A vector space V has dimension p if |
Definition
it has a basis of p vectors |
|
|
Term
To be a basis of R^n the set must |
|
Definition
contain exactly n linearly independent vectors (equivalently, n vectors that span R^n). |
|
Term
The dimension of V is the |
|
Definition
number of elements in a basis of V |
|
|
Term
A basis for Col(A) is given by |
|
Definition
the pivot columns of A (take the original columns of A, located via the pivots of an echelon form). |
|
Term
Find T with respect to std basis |
|
Definition
[T(e1) T(e2) T(e3)] where T(e1), T(e2), ... are the columns of the matrix.
Example:
[image] [image] |
|
|
Term
Find the matrix of T with respect to bases B and C (T_C,B) |
Definition
1) Express T applied to each vector in B as a linear combination of the vectors in C.
2) Use the "coordinates" from each combination as a column:
[[T(x1)]_C, [T(x2)]_C, ...] |
|
|
Term
Inner product of vectors v and w |
Definition
Same as the dot product between the 2 vectors (ex: v^T w) |
|
|
Term
Orthonormal vectors |
Definition
Vectors that are unit vectors and mutually orthogonal |
|
|
Term
Orthogonal complement of W (written W⊥) |
Definition
The space of all vectors that are orthogonal to every vector in the subspace W |
|
|
Term
Nul(A) is the orthogonal complement of |
|
Definition
Row(A) = Col(A^T). |
|
Term
Col(A) is the orthogonal complement of |
|
Definition
Nul(A^T) (the left nullspace). |
|
Term
Find all vectors orthogonal to v1 and v2 |
|
Definition
Find the orthogonal complement of Col([v1 v2]).
The orthogonal complement of Col(A) can be computed as Nul(A^T) |
|
|
Term
Incidence matrix of a directed graph |
Definition
Dimensions: m edges x n nodes.
A(i,j) = −1 if edge i leaves node j, +1 if edge i enters node j, 0 otherwise |
|
|
Term
Meaning of nullspace of incidence matrix |
|
Definition
dim(Nul(A)) is the number of connected components of the graph |
|
|
Term
Meaning of left nullspace of incidence matrix |
|
Definition
left nullspace = Nul(A^T)
dim(Nul(A^T)) is the number of independent loops in the graph |
|
|
Term
Orthogonal basis definition |
|
Definition
A basis in which the vectors are orthogonal to each other. |
|
|
Term
Orthogonal projection of vector x ONTO vector y |
|
Definition
proj_y(x) = ((x · y)/(y · y)) y |
|
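A sketch of the standard projection formula proj_y(x) = ((x·y)/(y·y)) y, with an arbitrary example pair of vectors; the error term x − proj comes out orthogonal to y.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Orthogonal projection of x onto the line spanned by y.
def proj(x, y):
    c = dot(x, y) / dot(y, y)
    return [c * yi for yi in y]

x = [2.0, 2.0]
y = [1.0, 0.0]
p = proj(x, y)
err = [a - b for a, b in zip(x, p)]   # error term x - proj_y(x)
assert abs(dot(err, y)) < 1e-12       # error is orthogonal to y
```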
Term
Projection matrix P onto a vector y |
Definition
Used to project x onto y by computing Px
Find using: [image] |
|
|
Term
Orthogonal projection of x onto W |
|
Definition
Determined by x̂:
[image]
Once x̂ is determined, the error term is x⊥ = x − x̂. |
|
|
Term
Closest point to x in span(v1,v2,..) |
|
Definition
Follow this formula; the resulting vector/point (the orthogonal projection of x onto the span) is the closest:
[image] |
|
|
Term
Find least squares solution to Ax = b (and define meaning of soln) |
|
Definition
Solve [image] (the normal equations).
Meaning: the solution x̂ makes [image] minimal |
|
|
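A hedged sketch of a least squares solve via the normal equations A^T A x̂ = A^T b, here fitting a line y = beta0 + beta1·t through three example points of my own (they lie exactly on y = 1 + t, so the fit recovers it).

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

ts = [0.0, 1.0, 2.0]
ys = [1.0, 2.0, 3.0]          # example data: exactly on y = 1 + t
cols = [[1.0] * len(ts), ts]  # columns of A: all-ones and t

# Normal equations (A^T A) beta = A^T y, a 2x2 system solved by Cramer's rule.
M = [[dot(cols[i], cols[j]) for j in range(2)] for i in range(2)]
rhs = [dot(cols[i], ys) for i in range(2)]
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
beta0 = (rhs[0] * M[1][1] - M[0][1] * rhs[1]) / det
beta1 = (M[0][0] * rhs[1] - rhs[0] * M[1][0]) / det
```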
Term
Projection matrix for proj onto Col(A) |
|
Definition
P = A(A^T A)^(-1) A^T. |
|
Term
Least squares fit of a line to data points |
Definition
Solve for beta:
[image]
where:
[image]
[image] [image]
Line is [image] |
|
|
Term
The columns of Q are orthonormal means ___ |
|
Definition
Q^T Q = I. |
|
Term
Given a basis a1, . . . , an, produce a orthogonal basis b1, . . . , bn and an orthonormal basis q1, . . . , qn. |
|
Definition
Gram-Schmidt: b1 = a1; bk = ak − Σ_{j<k} ((ak · bj)/(bj · bj)) bj (subtract projections onto earlier b's); qk = bk/‖bk‖. |
|
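The Gram-Schmidt card above can be sketched as code; the input basis is an arbitrary example of mine.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Classical Gram-Schmidt: returns an orthogonal basis bs and an
# orthonormal basis qs from a linearly independent input list.
def gram_schmidt(vectors):
    bs, qs = [], []
    for a in vectors:
        b = list(a)
        for prev in bs:  # subtract projection of a onto each earlier b
            c = dot(a, prev) / dot(prev, prev)
            b = [bi - c * pi for bi, pi in zip(b, prev)]
        bs.append(b)
        norm = math.sqrt(dot(b, b))
        qs.append([bi / norm for bi in b])
    return bs, qs

bs, qs = gram_schmidt([[1.0, 1.0], [1.0, 0.0]])
```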
Term
Least square solution using QR decomp (Ax = b) |
|
Definition
Find the QR decomposition.
Solve Rx̂ = Q^T b; x̂ is the least squares solution |
|
|
Term
Determinant of matrix (non std way) |
|
Definition
1) Row reduce to an upper triangular matrix using row replacements (each row exchange flips the sign; don't scale rows) 2) multiply the diagonal entries |
|
|
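One way to code the determinant-by-elimination recipe, tracking the sign flips from row exchanges (partial pivoting is my own addition for numerical safety):

```python
# Determinant via elimination: reduce to upper triangular, flip sign on
# each row exchange, then multiply the diagonal.
def det(A):
    A = [row[:] for row in A]  # work on a copy
    n, sign, d = len(A), 1, 1.0
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        if A[pivot][col] == 0:
            return 0.0  # no pivot in this column: singular matrix
        if pivot != col:
            A[col], A[pivot] = A[pivot], A[col]
            sign = -sign  # each row exchange flips the determinant's sign
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            A[r] = [a - m * b for a, b in zip(A[r], A[col])]
        d *= A[col][col]
    return sign * d

# det([[0, 2], [3, 4]]) is 0*4 - 2*3 = -6
```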
Term
Find the QR decomposition of A |
Definition
1) Gram-Schmidt on the columns of A to get the columns of Q 2) R = Q^T A
Q has orthonormal columns, R is upper triangular |
|
|
Term
Change of coordinates from basis B to basis C |
Definition
x_C = I_C,B x_B
x_B is the coordinate vector with respect to basis B. I_C,B is the special case of T_C,B where T is the identity (I(v) = v) |
|
|
Term
3x3 matrix with det A = 5. What is det(2A)? |
|
Definition
det(2A) = 2^3 · det(A) = 8 · 5 = 40 |
|
Term
Eigenvalue/eigenvector equation |
Definition
Ax = λx, x ≠ 0 (x is an eigenvector, λ is an eigenvalue) |
|
|
Term
Solve for eigenvalues (λ) of matrix A |
|
Definition
Solve det(A − λI) = 0 (the characteristic equation) for λ |
|
|
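For a 2x2 matrix, det(A − λI) = 0 expands to λ² − (trace)λ + det = 0, so the eigenvalues come from the quadratic formula; a sketch (assumes real eigenvalues, example matrix is mine):

```python
import math

# Eigenvalues of a 2x2 matrix from its characteristic polynomial.
def eigenvalues_2x2(A):
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = math.sqrt(tr * tr - 4 * det)  # assumes a non-negative discriminant
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

vals = eigenvalues_2x2([[2, 1], [1, 2]])  # symmetric, so eigenvalues are real
```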
Term
Diagonal matrix A^100, where A = diag(a, b, c) |
|
Definition
A^100 = diag(a^100, b^100, c^100) |
|
|
Term
Diagonalize A and find A^n |
|
Definition
1) Find the eigenvalues and eigenvectors 2) Find P = I_e,b (e is the std basis, b is the eigenvector basis), i.e., the eigenvectors as columns
3) Build the diagonal matrix D with the eigenvalues down the diagonal 4) A^n = P D^n P^-1 |
|
|
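The A^n = P D^n P^-1 recipe can be checked on a small worked example of my own: A = [[2,1],[1,2]] has eigenvalues 1 and 3 with eigenvectors (1,−1) and (1,1).

```python
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

n = 5
P = [[1, 1], [-1, 1]]              # eigenvectors as columns
P_inv = [[0.5, -0.5], [0.5, 0.5]]  # inverse of P (det P = 2)
Dn = [[1 ** n, 0], [0, 3 ** n]]    # D^n: eigenvalues raised to the n
An = matmul(matmul(P, Dn), P_inv)  # A^n = P D^n P^-1

# Compare against repeated multiplication of A itself.
A = [[2, 1], [1, 2]]
check = A
for _ in range(n - 1):
    check = matmul(check, A)
```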
Term
An n × n matrix A is diagonalizable if and only if _____ |
|
Definition
A has n linearly independent eigenvectors |
|
|
Term
Spectral theorem (symmetric matrices) |
Definition
If A is symmetric (A = A^T), then it has an orthonormal basis of eigenvectors and all eigenvalues are real |
|
|
Term
SVD facts to remember |
Definition
The σ's (singular values) are not eigenvalues of A; they are the square roots of the eigenvalues of A^T A. The SVD is not unique. |
|
|