Term
Subspace
Definition
A subset of a vector space that is: nonempty, closed under addition, and closed under scalar multiplication.

Term
Solution set of a system of equations as a plane
Definition
Take three solutions. Select one. Subtract each of the remaining two from the selected one to form two difference vectors. Multiply each difference by its own free scalar, add the two results, and then add the selected solution.

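The recipe above can be sketched in Python. The three sample solutions and the plane x + y + z = 3 are assumptions chosen for illustration; any three non-collinear solutions work.

```python
# Sketch: parametrize the plane through three solutions p0, p1, p2 as
# p0 + s*(p1 - p0) + t*(p2 - p0).
def plane_point(p0, p1, p2, s, t):
    """Return the point p0 + s*(p1 - p0) + t*(p2 - p0)."""
    return tuple(a + s * (b - a) + t * (c - a)
                 for a, b, c in zip(p0, p1, p2))

# Example: three solutions of x + y + z = 3.
p0, p1, p2 = (3, 0, 0), (0, 3, 0), (0, 0, 3)
pt = plane_point(p0, p1, p2, 2, -1)   # -> (0, 6, -3)
# Every point produced still satisfies x + y + z = 3.
```

Varying the two scalars s and t sweeps out the whole plane.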
Term
Reading the general solution from row-echelon form
Definition
A process for reading off solutions after row reduction. If there is a leading 1 for every variable, the system has a unique solution. If there are columns that contain no leading 1, those variables are free: write the general solution as a combination in which each free variable multiplies a new vector, and solve for the remaining variables in terms of the free ones. A free variable's entry in its own new vector is simply 1.

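A worked example of the read-off step; the system here is made up for illustration. Suppose the augmented matrix reduces to rows (1, 0, 2 | 5) and (0, 1, 3 | 6): column 3 has no leading 1, so x3 = t is free and the solution is (5, 6, 0) + t*(-2, -3, 1), with the free variable's own entry equal to 1.

```python
from fractions import Fraction

# General solution read off the row-echelon form above:
#   x1 = 5 - 2t, x2 = 6 - 3t, x3 = t.
def solution(t):
    return (5 - 2 * t, 6 - 3 * t, t)

# Check against the original equations x1 + 2*x3 = 5 and x2 + 3*x3 = 6.
for t in (0, 1, Fraction(7, 2)):
    x1, x2, x3 = solution(t)
    assert x1 + 2 * x3 == 5 and x2 + 3 * x3 == 6
```

Setting t = 0 recovers the particular solution (5, 6, 0).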
Term
Determining the span of a set of vectors
Definition
Put the matrix for the system into row-echelon form. If there are no free variables, the set spans the entire space. Otherwise, it spans a lower-dimensional subspace, such as a line or a plane.

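A minimal row-reduction sketch in Python, using exact fractions; the sample vectors are made up. The number of pivots (leading 1s) is the dimension of the span.

```python
from fractions import Fraction

def row_echelon(rows):
    """Reduce a list of rows to row-echelon form; also return the pivot count."""
    m = [[Fraction(x) for x in r] for r in rows]
    pivot_row = 0
    for col in range(len(m[0])):
        # find a row at or below pivot_row with a nonzero entry in this column
        for r in range(pivot_row, len(m)):
            if m[r][col] != 0:
                m[pivot_row], m[r] = m[r], m[pivot_row]
                break
        else:
            continue  # no pivot in this column
        p = m[pivot_row][col]
        m[pivot_row] = [x / p for x in m[pivot_row]]   # make the leading entry 1
        for r in range(pivot_row + 1, len(m)):
            f = m[r][col]
            m[r] = [a - f * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m, pivot_row

# Do (1,0,1), (0,1,1), (1,1,0) span R^3?  Row-reduce and count pivots.
_, rank = row_echelon([[1, 0, 1], [0, 1, 1], [1, 1, 0]])
# rank == 3 -> they span all of R^3; rank < 3 would mean a line or a plane.
```

Two dependent vectors such as (1, 2, 3) and (2, 4, 6) give rank 1: they only span a line.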
Term
Writing solutions to linear systems in the normal way
Definition
Write a single solution as an ordered n-tuple of the values of the variables. If there are free variables, write the solution set in set-builder form: an ordered n-tuple whose entries are expressions in the free variables, with any defining equation(s) stated as conditions after the tuple. If the solution set is a line, plane, etc., you can instead plug values in for the scalars and write it as a set of ordered n-tuples.

Term
Testing a set of vectors for linear independence
Definition
Turn the set of vectors into a homogeneous system of equations in the scalars of each vector. If there are no free variables (only the trivial solution), the set is linearly independent.

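A sketch of the test for the special case of three vectors in R^3, chosen for brevity: the homogeneous system c1*v1 + c2*v2 + c3*v3 = 0 has only the trivial solution exactly when the matrix with the vectors as columns has nonzero determinant. The sample vectors are made up.

```python
# Determinant of the 3x3 matrix whose columns are v1, v2, v3.
def det3(v1, v2, v3):
    (a, d, g), (b, e, h), (c, f, i) = v1, v2, v3   # columns of the matrix
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

assert det3((1, 0, 0), (0, 1, 0), (0, 0, 1)) != 0   # independent
assert det3((1, 2, 3), (2, 4, 6), (0, 1, 0)) == 0   # v2 = 2*v1 -> dependent
```

For more vectors (or other dimensions), row reduction and counting free variables is the general method.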
Term
Inner product space axioms
Definition
The inner product of a vector with itself is positive or zero: <v, v> >= 0, with <v, v> = 0 only for the zero vector. The commutative (symmetry) law holds: <u, v> = <v, u>. A scalar can be pulled outside: <c*u, v> = c<u, v>. The inner product distributes over addition: <u + v, w> = <u, w> + <v, w>.

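A numeric spot-check of the four axioms for the standard dot product on R^3, using random integer vectors; a sanity check on the statements above, not a proof.

```python
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

rng = random.Random(0)
for _ in range(100):
    u, v, w = ([rng.randint(-5, 5) for _ in range(3)] for _ in range(3))
    c = rng.randint(-5, 5)
    assert dot(u, u) >= 0                                   # positivity
    assert dot(u, v) == dot(v, u)                           # symmetry
    assert dot([c * x for x in u], v) == c * dot(u, v)      # scalar pulls out
    assert dot([a + b for a, b in zip(u, v)], w) == dot(u, w) + dot(v, w)  # distributes
```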
Term
Norm of a vector
Definition
Square root of the dot product of a vector with itself.

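As a one-line sketch in Python, assuming the standard dot product:

```python
import math

# Norm: square root of the vector's dot product with itself.
def norm(v):
    return math.sqrt(sum(x * x for x in v))

# A 3-4-5 right triangle: the vector (3, 4) has norm 5.
assert norm((3, 4)) == 5.0
```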
Term
Angle between two vectors
Definition
Arccosine of the dot product of the two vectors divided by the product of their norms.

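A sketch of the formula, assuming the standard dot product: angle = arccos( (u . v) / (||u|| * ||v||) ).

```python
import math

def angle(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.acos(dot / (norm_u * norm_v))

# (1, 0) and (0, 1) are perpendicular: the angle is pi/2.
assert math.isclose(angle((1, 0), (0, 1)), math.pi / 2)
```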
Term
Orthogonal vectors and orthogonal sets
Definition
Two vectors with a dot product of zero are orthogonal. A set of vectors in which every pair of distinct vectors is orthogonal is an orthogonal set.

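The pairwise check can be sketched directly; the sample sets are made up.

```python
from itertools import combinations

# A set is orthogonal when every pair of distinct vectors has dot product zero.
def is_orthogonal_set(vectors):
    return all(sum(a * b for a, b in zip(u, v)) == 0
               for u, v in combinations(vectors, 2))

assert is_orthogonal_set([(1, 0, 0), (0, 1, 0), (0, 0, 1)])
assert not is_orthogonal_set([(1, 1, 0), (0, 1, 0)])
```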
Term
Matrix multiplication
Definition
A (2x3) matrix times a (3x4) matrix gives a (2x4) matrix. To get one entry, multiply each entry in a row of the first matrix by the corresponding entry in a column of the second matrix, and add these products. Changing the row means moving in the first matrix; changing the column means moving in the second.

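The row-times-column rule as a sketch in plain Python; the sample matrices are made up.

```python
# Entry (i, j) of the product comes from row i of A and column j of B.
def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    assert len(A[0]) == inner   # inner dimensions must match
    return [[sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

A = [[1, 2, 3],
     [4, 5, 6]]            # 2x3
B = [[1, 0, 0, 1],
     [0, 1, 0, 1],
     [0, 0, 1, 1]]         # 3x4
C = matmul(A, B)           # 2x4
```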
Term
Left inverse
Definition
A matrix that, when multiplied on the left of the original, yields an identity matrix whose size is the number of the original's columns.

Term
Right inverse
Definition
A matrix that, when multiplied on the right of the original, yields an identity matrix whose size is the number of the original's rows.

Term
Inverse of a square matrix
Definition
Perform Gauss-Jordan reduction (full row reduction) on [A|I]. If the result is [I|C], then C is both the right and left inverse of A; if the left block cannot be reduced to I, A is not invertible.

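A sketch of the [A|I] procedure with exact fractions. It assumes A is invertible; a real implementation would detect a missing pivot and report failure. The sample matrix is made up.

```python
from fractions import Fraction

def inverse(A):
    """Row-reduce [A | I]; if the left block becomes I, return the right block."""
    n = len(A)
    # build the augmented matrix [A | I] with exact arithmetic
    m = [[Fraction(A[i][j]) for j in range(n)] +
         [Fraction(1 if i == j else 0) for j in range(n)]
         for i in range(n)]
    for col in range(n):
        # find a pivot and swap it into place (assumes one exists)
        pivot = next(r for r in range(col, n) if m[r][col] != 0)
        m[col], m[pivot] = m[pivot], m[col]
        p = m[col][col]
        m[col] = [x / p for x in m[col]]
        # clear the column above and below the pivot (Gauss-Jordan)
        for r in range(n):
            if r != col and m[r][col] != 0:
                f = m[r][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return [row[n:] for row in m]

C = inverse([[2, 1], [1, 1]])   # -> [[1, -1], [-1, 2]]
```

Multiplying C on either side of the original matrix returns the 2x2 identity.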
Term
Adding vectors to a linearly independent set to get a basis.
Definition
Try each standard basis vector in turn. Check whether it is a linear combination of the vectors already in the set; if not, include it. Repeat until the set spans. If needed, then try sums of two standard basis vectors.

Term
Reducing a set of linearly dependent vectors to a basis.
Definition
Set up a system. Reduce the matrix to row-echelon form. Remove any vectors corresponding to free variables.

Term
Inner product of functions
Definition
For functions, multiply the two functions together and take the integral over the given interval.
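A numeric sketch, approximating the integral with a midpoint Riemann sum; the interval and the functions are examples.

```python
import math

# <f, g> on [a, b] as the integral of f(x)*g(x), approximated numerically.
def inner_product(f, g, a, b, steps=100_000):
    h = (b - a) / steps
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(steps)) * h

# sin and cos are orthogonal on [0, 2*pi]: the integral is zero.
val = inner_product(math.sin, math.cos, 0, 2 * math.pi)
assert abs(val) < 1e-6
```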