@Alllll0235
2017-06-18T13:36:52.000000Z
Linear Algebra
learning
1.1
- vector addition
- scalar multiplication
- the relation and distinction among line, plane and space
1.2
- dot product : $v \cdot w = v_1w_1 + v_2w_2 + \cdots + v_nw_n$
- length = $\|v\| = \sqrt{v \cdot v}$
- unit vector : $u = v/\|v\|$
- the angle $\theta$ between two vectors : $\cos\theta = \dfrac{v \cdot w}{\|v\|\,\|w\|}$
- the dot product $v \cdot w$ is zero when vectors $v$ and $w$ are perpendicular
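These dot-product facts can be checked numerically; a small sketch using NumPy (an assumption, the notes themselves use no software):

```python
import numpy as np

v = np.array([3.0, 4.0])
w = np.array([-4.0, 3.0])   # chosen perpendicular to v

dot = v @ w                           # dot product v.w
length = np.sqrt(v @ v)               # length ||v|| = sqrt(v.v)
unit = v / length                     # unit vector in the direction of v
cos_theta = dot / (np.linalg.norm(v) * np.linalg.norm(w))
```

Here `dot` is 0 (the vectors are perpendicular), `length` is 5, and `unit` has length 1.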
1.3
- matrix times vector : $Ax$ = combination of the columns of $A$
- inverse matrix ($Ax = b$ has exactly one solution) and singular matrix (no inverse)
- independence and dependence of vectors
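A numeric illustration of the invertible/singular contrast (NumPy is my addition, not part of the notes):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # independent columns: invertible
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # column 2 = 2 * column 1: singular

x = np.array([5.0, 6.0])
b = A @ x                        # matrix times vector = combination of columns
x_back = np.linalg.solve(A, b)   # the inverse exists, so Ax = b is solvable

det_S = np.linalg.det(S)         # essentially zero: S has no inverse
```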
2.1
- matrix equation : $Ax = b$
- identity matrix : $I$, with $Ix = x$ for every $x$
2.2
- upper triangular system
- the pivots are on the diagonal of the triangle after elimination
- elimination succeeds if the $n$ equations produce $n$ pivots
2.3
- matrix times column : $E$ times column $j$ of $A$ gives column $j$ of $EA$
- elimination matrix $E$ and exchange (permutation) matrix $P$
- augmented matrix $[A\ b]$
- when $E$ multiplies any matrix $A$, it multiplies each column of $A$ separately
2.4
- the entry in row $i$ and column $j$ of $AB$ is (row $i$ of $A$)$\,\cdot\,$(column $j$ of $B$)
- matrix times column : $A$ times column $j$ of $B$ = column $j$ of $AB$
- row times matrix : row $i$ of $A$ times $B$ = row $i$ of $AB$
- block elimination
2.5
- the inverse $A^{-1}$ exists if and only if elimination produces $n$ pivots
- if $Ax = 0$ for a nonzero vector $x$, then $A$ has no inverse
- the inverse of $AB$ is the reverse product $B^{-1}A^{-1}$
- the steps of calculating $A^{-1}$ : Gauss-Jordan elimination carries $[A\ I]$ to $[I\ A^{-1}]$
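The Gauss-Jordan steps can be sketched in code (a NumPy sketch of the idea, with partial pivoting added for numerical safety):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce the augmented matrix [A | I] until it becomes [I | A^-1]."""
    n = len(A)
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # swap up the largest pivot candidate (partial pivoting)
        p = col + np.argmax(np.abs(M[col:, col]))
        M[[col, p]] = M[[p, col]]
        M[col] /= M[col, col]                 # scale the pivot row: pivot = 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]  # clear the rest of the column
    return M[:, n:]

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
A_inv = gauss_jordan_inverse(A)
```

For this $A$ (determinant 1), the inverse is $\begin{bmatrix} 3 & -1 \\ -5 & 2 \end{bmatrix}$.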
2.6
- the cost of elimination is about $\frac{1}{3}n^3$ multiplications and subtractions
2.7
- transpose : $(A^T)_{ij} = A_{ji}$ and $(AB)^T = B^TA^T$
- $A$ is symmetric if $A^T = A$
- a permutation matrix $P$ has a single 1 in each row and column, and $P^T = P^{-1}$
3.1
- the space $\mathbf{R}^n$ consists of all column vectors with $n$ components
- $\mathbf{M}$ (2 by 2 matrices) and $\mathbf{F}$ (functions) and $\mathbf{Z}$ (zero vector alone) are vector spaces
- if $v$ and $w$ are vectors in the subspace and $c$ is any scalar, then
- $v + w$ is in the subspace
- $cv$ is in the subspace
- a subspace containing $v$ and $w$ must contain all linear combinations $cv + dw$
- the system $Ax = b$ is solvable if and only if $b$ is in the column space of $A$
3.2
- the nullspace $N(A)$ is a subspace of $\mathbf{R}^n$. It contains all solutions to $Ax = 0$
- solving $Ax = 0$ by elimination
- the complete solution to $Ax = 0$ is a combination of the special solutions
- if $n > m$ then $A$ has at least one column without a pivot, giving a special solution. So there are nonzero vectors in the nullspace of this rectangular $A$.
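A concrete rectangular example ($n = 3 > m = 2$), with the elimination done by hand and the special solution verified in NumPy (my own illustrative matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 8.0]])

# Elimination by hand gives pivots in columns 1 and 3; column 2 is free.
# Special solution: set the free variable x2 = 1 and solve for the pivots:
# x3 = 0 and x1 = -2.
s = np.array([-2.0, 1.0, 0.0])

check = A @ s   # should be the zero vector: s lies in the nullspace
```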
3.3
- the rank of $A$ is the number of pivots
- the pivot columns are not combinations of earlier columns
- the free columns are combinations of earlier columns
3.4
- the rank $r$ is the number of pivots
- $Ax = b$ is solvable if and only if the last $m - r$ equations reduce to $0 = 0$
- complete solution : $x = x_p + x_n$ (one particular solution plus any nullspace solution)
- $x_p$ solves $Ax_p = b$ with the free variables set to zero; $x_n$ is any combination of the special solutions
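Continuing the same hypothetical 2 by 3 matrix, the split $x = x_p + x_n$ can be verified numerically:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 8.0]])
b = np.array([4.0, 10.0])

# Particular solution: free variable x2 = 0, back-substitute for the pivots.
x_p = np.array([1.0, 0.0, 1.0])
# Special solution spanning the nullspace (free variable x2 = 1):
s = np.array([-2.0, 1.0, 0.0])

x_complete = x_p + 7.0 * s   # one member of the complete solution family
```

Every choice of the multiplier (7.0 here) still solves $Ax = b$, because $As = 0$.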
3.5
- the columns of $A$ are independent if $x = 0$ is the only solution to $Ax = 0$
- a basis consists of linearly independent vectors that span the space
- the pivot columns are one basis for the column space
- the number of vectors in a basis is the dimension of the space
3.6
- the column space and row space both have dimension $r$
- the nullspace $N(A)$ has dimension $n - r$
- the left nullspace $N(A^T)$ has dimension $m - r$
- (dimension of column space) + (dimension of nullspace) = $n$ = dimension of $\mathbf{R}^n$
- every rank one matrix has the special form $A = uv^T$ = column times row
4.1
- $V$ and $W$ are orthogonal complements if $W$ contains all vectors perpendicular to $V$ (and vice versa)
- inside $\mathbf{R}^n$, the dimensions of complements $V$ and $W$ add to $n$
- $N(A)$ and $C(A^T)$ are orthogonal complements; $N(A^T)$ and $C(A)$ are orthogonal complements
- any $n$ independent vectors in $\mathbf{R}^n$ will span $\mathbf{R}^n$
- $x = x_r + x_n$ (a row space vector plus a nullspace vector)
4.2
- projection of $b$ onto the line through $a$ : $p = a\,\dfrac{a^Tb}{a^Ta}$, when the matrix is $P = \dfrac{aa^T}{a^Ta}$
- when $A$ has full rank $n$, the equation $A^TA\hat{x} = A^Tb$ leads to $\hat{x} = (A^TA)^{-1}A^Tb$ and $p = A\hat{x}$
- the projection matrix $P = A(A^TA)^{-1}A^T$ has $P^T = P$ and $P^2 = P$
- when $A$ has independent columns, $A^TA$ is square, symmetric, and invertible
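The projection formulas are easy to verify on a small example (the matrix below is my own; the properties $P^2 = P$ and $P^T = P$ are from the notes):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])   # independent columns, so A^T A is invertible
b = np.array([1.0, 2.0, 7.0])

AtA = A.T @ A
x_hat = np.linalg.solve(AtA, A.T @ b)     # normal equations A^T A x = A^T b
p = A @ x_hat                              # projection of b onto C(A)
P = A @ np.linalg.inv(AtA) @ A.T           # projection matrix

e = b - p   # the error is perpendicular to every column of A
```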
4.3
- the best $\hat{x}$ comes from the normal equations $A^TA\hat{x} = A^Tb$
- the least squares solution $\hat{x}$ makes $E = \|Ax - b\|^2$ as small as possible
- the partial derivatives of $\|Ax - b\|^2$ are zero when $A^TA\hat{x} = A^Tb$
- the closest line has heights $p_1, \dots, p_m$ with errors $e_1, \dots, e_m$
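A worked line-fitting example in NumPy (the three data points are my own choice): fit the closest line $C + Dt$ to the points $(0, 6)$, $(1, 0)$, $(2, 0)$.

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0])
b = np.array([6.0, 0.0, 0.0])

A = np.column_stack([np.ones_like(t), t])       # columns: [1, t]
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)   # least squares solution
C, D = x_hat

p = A @ x_hat   # heights of the closest line at t = 0, 1, 2
e = b - p       # errors, perpendicular to both columns of A
```

Solving the normal equations by hand gives the line $5 - 3t$, heights $p = (5, 2, -1)$ and errors $e = (1, -2, 1)$.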
4.4
- when $Q$ is square, $Q^TQ = I$ means that $Q^T = Q^{-1}$ : transpose = inverse
- the length of $Qx$ equals the length of $x$ : $\|Qx\| = \|x\|$
- $Q$ preserves dot products : $(Qx)^T(Qy) = x^Ty$
- if $Q$ is square then $QQ^T = I$ and every $b = q_1(q_1^Tb) + \cdots + q_n(q_n^Tb)$
- the Gram-Schmidt process
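A minimal sketch of the Gram-Schmidt process (classical version, my own small input matrix):

```python
import numpy as np

def gram_schmidt(A):
    """Produce orthonormal columns q_j spanning the column space of A."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # subtract earlier projections
        Q[:, j] = v / np.linalg.norm(v)          # normalize to a unit vector
    return Q

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q = gram_schmidt(A)
```

`Q` satisfies $Q^TQ = I$, so it preserves lengths: $\|Qx\| = \|x\|$.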
5.1
- the determinant is zero when the matrix has no inverse
- the determinant changes sign when two rows (or two columns) are exchanged
- the determinant of the n by n identity matrix is 1
- the determinant is a linear function of each row separately
- subtracting a multiple of one row from another row leaves the determinant unchanged
- if $A$ is singular then $\det A = 0$. If $A$ is invertible then $\det A \neq 0$
- $\det A = \pm$ (product of the pivots)
- $(\det A)(\det A^{-1}) = \det I = 1$
5.2
- if no row exchanges are involved, multiply the pivots to find the determinant
- the determinant of the $k$ by $k$ corner submatrix $A_k$ is $d_1d_2\cdots d_k$ (the product of the first $k$ pivots)
- $\det A$ = sum over all $n!$ column permutations
  = $\sum (\det P)\,a_{1\alpha}a_{2\beta}\cdots a_{n\omega}$
- the cofactor expansion is $\det A = a_{11}C_{11} + a_{12}C_{12} + \cdots + a_{1n}C_{1n}$
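Both routes to the determinant, pivots and cofactors, agree on a small example (the 3 by 3 matrix below is my own):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# Pivots from elimination (no row exchanges needed): 2, 3/2, 4/3.
# det A = product of pivots = 2 * (3/2) * (4/3) = 4.
det_np = np.linalg.det(A)

# Cofactor expansion along row 1: det A = a11*C11 + a12*C12 + a13*C13
C11 = np.linalg.det(A[1:, 1:])
C12 = -np.linalg.det(A[1:, [0, 2]])
C13 = np.linalg.det(A[1:, :2])
det_cof = A[0, 0] * C11 + A[0, 1] * C12 + A[0, 2] * C13
```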
5.3
- the volume of a box is $|\det A|$, when the box edges are the rows of $A$
- the cross product $u \times v$ is a vector with length $\|u\|\,\|v\|\,|\sin\theta|$. Its direction is perpendicular to $u$ and $v$
6.1
- the basic equation is $Ax = \lambda x$. The number $\lambda$ is an eigenvalue of $A$
- the number $\lambda$ is an eigenvalue of $A$ if and only if $A - \lambda I$ is singular : $\det(A - \lambda I) = 0$
- the eigenvalues of $A^2$ and $A^{-1}$ and $A + cI$ are $\lambda^2$ and $\lambda^{-1}$ and $\lambda + c$, with the same eigenvectors
- the product of the eigenvalues equals the determinant. The sum of the eigenvalues equals the sum of the diagonal entries (the trace)
- projections ($\lambda = 1, 0$), reflections ($\lambda = 1, -1$), rotations ($\lambda = e^{i\theta}, e^{-i\theta}$) have special eigenvalues. Singular matrices have $\lambda = 0$. Triangular matrices have $\lambda$'s on their diagonal
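The determinant and trace facts are quick to confirm numerically (the 2 by 2 matrix is my own, chosen with eigenvalues 5 and 2):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

prod = np.prod(eigvals)    # equals det A
total = np.sum(eigvals)    # equals the trace (sum of diagonal entries)

# Check the basic equation Ax = lambda x for the first eigenpair
x = eigvecs[:, 0]
lam = eigvals[0]
```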
6.2
- If $A$ has $n$ independent eigenvectors $x_1, \dots, x_n$, they go into the columns of $S$. $A$ is diagonalized by $S$ : $S^{-1}AS = \Lambda$ and $A = S\Lambda S^{-1}$
- Solution for $u_{k+1} = Au_k$ : $u_k = A^ku_0 = S\Lambda^kS^{-1}u_0$
- $A$ is diagonalizable if every eigenvalue has enough eigenvectors
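Diagonalization in code, reusing the same hypothetical matrix: powers of $A$ come from powers of $\Lambda$.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, S = np.linalg.eig(A)    # eigenvalues lam, eigenvector matrix S
S_inv = np.linalg.inv(S)

A_rebuilt = S @ np.diag(lam) @ S_inv        # A = S Lambda S^-1
A_cubed = S @ np.diag(lam**3) @ S_inv       # A^3 = S Lambda^3 S^-1
```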
6.3
- equations $\dfrac{du}{dt} = Au$ starting from the vector $u(0)$ at $t = 0$
- $u(t)$ approaches zero (stability) if every eigenvalue $\lambda$ has negative real part
- the solution is always $u(t) = e^{At}u(0)$, with the matrix exponential $e^{At} = Se^{\Lambda t}S^{-1}$
6.4
- A symmetric matrix has only real eigenvalues
- The eigenvectors can be chosen orthonormal
- $A = Q\Lambda Q^T$ with $Q^{-1} = Q^T$ (the spectral theorem)
- the number of positive eigenvalues of $A = A^T$ equals the number of positive pivots
- Every square matrix can be "triangularized" by $A = QTQ^{-1}$ (Schur's theorem)
6.5
- Symmetric matrices that have positive eigenvalues are positive definite
- The eigenvalues of $A = \begin{bmatrix} a & b \\ b & c \end{bmatrix}$ are positive if and only if $a > 0$ and $ac - b^2 > 0$
- The eigenvalues of $A$ are positive if and only if the pivots are positive : $a > 0$ and $\dfrac{ac - b^2}{a} > 0$
- $A$ is positive definite if for every nonzero vector $x$ : $x^TAx > 0$
- $A^TA$ is automatically positive definite if $A$ has independent columns
- The ellipse $ax^2 + 2bxy + cy^2 = 1$ has its axes along the eigenvectors of $A$. Lengths $1/\sqrt{\lambda_1}$ and $1/\sqrt{\lambda_2}$
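All three positive-definiteness tests (eigenvalues, pivots, energy $x^TAx$) agree; a sketch with a matrix of my own choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric, eigenvalues 3 and 1

# Test 1: all eigenvalues positive
eigvals = np.linalg.eigvalsh(A)

# Test 2: all pivots positive, for the 2x2 case [[a, b], [b, c]]
a, b, c = A[0, 0], A[0, 1], A[1, 1]
pivots = (a, (a * c - b * b) / a)

# Test 3: energy x^T A x > 0 for random nonzero vectors
rng = np.random.default_rng(0)
energies = [x @ A @ x for x in rng.standard_normal((100, 2))]
```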
6.6
- Let $M$ be any invertible matrix. Then $B = M^{-1}AM$ is similar to $A$
- Similar matrices have the same eigenvalues. Eigenvectors are multiplied by $M^{-1}$
- If $A$ has $n$ independent eigenvectors then $A$ is similar to $\Lambda$ (take $M = S$)
6.7
- the numbers $\sigma_1^2, \dots, \sigma_r^2$ are the nonzero eigenvalues of $AA^T$ and $A^TA$
- The orthonormal columns of $U$ and $V$ are eigenvectors of $AA^T$ and $A^TA$, giving $A = U\Sigma V^T$
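A numeric check of the SVD facts (the matrix is a common small example with singular values $\sqrt{45}$ and $\sqrt{5}$):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

U, sigma, Vt = np.linalg.svd(A)

# The squared singular values are the eigenvalues of A^T A
eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # descending order
```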
7.1
- Combinations to combinations : $T(cv + dw) = cT(v) + dT(w)$ (linearity)
7.2
- Suppose we know $T(v_1), \dots, T(v_n)$ for the basis vectors $v_1, \dots, v_n$. Then linearity produces $T(v)$ for every other input vector $v$
- The $j$th column of the matrix $A$ is found by applying $T$ to the $j$th basis vector $v_j$
- $T(v_j)$ = combination $a_{1j}w_1 + \cdots + a_{mj}w_m$ of the basis vectors of the output space
- If $A$ and $B$ represent $T$ and $S$, and the output basis for $S$ is the input basis for $T$, then the matrix $AB$ represents the transformation $T(S(u))$
7.3
- the matrix becomes $\Lambda$ when the input and output bases are eigenvectors of $A$
- the matrix becomes $\Sigma$ when those bases are eigenvectors of $A^TA$ and $AA^T$
- The SVD chooses an input basis of $v$'s and an output basis of $u$'s. Those orthonormal bases diagonalize $A$. This is $Av_i = \sigma_iu_i$, and in matrix form $A = U\Sigma V^T$
- Every real square matrix can be factored into $A = QS$, where $Q$ is orthogonal and $S$ is symmetric positive semidefinite. If $A$ is invertible, $S$ is positive definite
Summary
- The first seven chapters mainly cover vectors, matrices, vector spaces, projections onto spaces, the rank of a matrix, eigenvalues and eigenvectors, and linear transformations.
- Compared with the textbook issued by my school, the first four chapters are largely the same in content, with fewer worked examples but simpler solution methods (e.g., solving for the nullspace, LU factorization) and more precise wording.
- Chapters five through seven contain a lot of material I have not studied before; they were difficult to read, and my understanding of those topics is still shallow.