@Alllll0235 2017-06-18T13:36:52.000000Z

Linear Algebra

learning


1.1

  1. vector addition
  2. scalar multiplication
  3. the relation and distinction among a line, a plane, and space ( which linear combinations of vectors fill )
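
A quick way to see these operations is a small NumPy sketch (my addition, not part of the original notes; the vectors are made-up examples):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

print(v + w)        # vector addition, component by component
print(2.5 * v)      # scalar multiplication stretches v

# A linear combination cv + dw; all such combinations of two independent
# vectors fill a plane through the origin in three-dimensional space
c, d = 2.0, -1.0
print(c * v + d * w)
```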

1.2

  1. dot product : $v \cdot w = v_{1}w_{1} + v_{2}w_{2} + \cdots + v_{n}w_{n}$
  2. length = $\|v\| = \sqrt{v \cdot v}$
  3. unit vector : $u = v / \|v\|$ has length 1
  4. the angle $\theta$ between two vectors $v$ and $w$ has $\cos\theta = \dfrac{v \cdot w}{\|v\|\,\|w\|}$
  5. the dot product is $v \cdot w = 0$ when vectors $v$ and $w$ are perpendicular
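
These formulas are easy to check numerically; a minimal NumPy sketch (the vectors are my own example, chosen to be perpendicular):

```python
import numpy as np

v = np.array([4.0, 2.0])
w = np.array([-1.0, 2.0])

print(v @ w)                                   # dot product v . w = -4 + 4 = 0
print(np.linalg.norm(v), np.sqrt(v @ v))       # length ||v|| = sqrt(v . v)

u = v / np.linalg.norm(v)                      # unit vector in the direction of v
print(np.linalg.norm(u))                       # length 1

cos_theta = (v @ w) / (np.linalg.norm(v) * np.linalg.norm(w))
print(np.degrees(np.arccos(cos_theta)))        # the angle between v and w (90 degrees here)
```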

1.3

  1. matrix times vector : $Ax$ is a combination of the columns of $A$
  2. inverse matrix and singular matrix
  3. independence and dependence
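
A small NumPy sketch of "matrix times vector" as a combination of columns, plus a singular matrix with dependent columns (the example matrices are my own):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([3.0, -1.0])

# Ax is x1 times column 1 plus x2 times column 2 of A
print(A @ x, 3.0 * A[:, 0] + (-1.0) * A[:, 1])

# A singular matrix has dependent columns and no inverse
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])                     # column 2 = 2 * column 1
print(np.linalg.det(A), np.linalg.det(B))      # nonzero vs. zero determinant
print(np.linalg.inv(A))                        # exists; np.linalg.inv(B) would raise LinAlgError
```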

2.1

  1. matrix equation : $Ax = b$
  2. identity matrix : $I$ has 1's on the diagonal, and $Ix = x$ for every $x$

2.2

  1. upper triangular system
  2. the pivots are on the diagonal of the triangle after elimination
  3. elimination succeeds if we get $n$ pivots
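
A minimal sketch of elimination and back substitution in Python (the 3 by 3 system is my own example; no row exchanges are needed for it):

```python
import numpy as np

# Forward elimination on a small square system Ax = b
A = np.array([[2.0, 1.0, 1.0],
              [4.0, -6.0, 0.0],
              [-2.0, 7.0, 2.0]])
b = np.array([5.0, -2.0, 9.0])

U = A.copy()
c = b.copy()
n = len(b)
for col in range(n - 1):
    for row in range(col + 1, n):
        multiplier = U[row, col] / U[col, col]   # multiplier = entry / pivot
        U[row, :] -= multiplier * U[col, :]
        c[row] -= multiplier * c[col]

print(np.diag(U))                # the pivots sit on the diagonal of the triangular U

# Back substitution on the upper triangular system Ux = c
x = np.zeros(n)
for row in range(n - 1, -1, -1):
    x[row] = (c[row] - U[row, row + 1:] @ x[row + 1:]) / U[row, row]
print(x, np.allclose(A @ x, b))
```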

2.3

  1. $Ax = x_{1}$ times column 1 $+ \cdots + x_{n}$ times column $n$
  2. elimination matrix and exchange matrix
  3. augmented matrix
  4. when $E$ multiplies any matrix $B$, it multiplies each column of $B$ separately
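
A short NumPy sketch of an elimination matrix and an exchange matrix acting on the same 3 by 3 example matrix as above (my own illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [4.0, -6.0, 0.0],
              [-2.0, 7.0, 2.0]])

# Elimination matrix E21: the identity with -2 in position (2, 1),
# so E21 @ A subtracts 2 times row 1 from row 2 and leaves the other rows alone
E21 = np.eye(3)
E21[1, 0] = -2.0
print(E21 @ A)

# Exchange (permutation) matrix P12: swaps the first two rows of whatever it multiplies
P12 = np.eye(3)[[1, 0, 2]]
print(P12 @ A)
```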

2.4

  1. the entry in row $i$ and column $j$ of $AB$ is (row $i$ of $A$) . (column $j$ of $B$)
  2. matrix $A$ times column $j$ of $B$ = column $j$ of $AB$
  3. row $i$ of $A$ times matrix $B$ = row $i$ of $AB$
  4. block elimination

2.5

  1. the inverse exists if and only if elimination produces $n$ pivots
  2. if $Ax = 0$ for a nonzero vector $x$, then $A$ has no inverse
  3. the inverse of $AB$ is the reverse product $B^{-1}A^{-1}$
  4. the steps of calculating $A^{-1}$ : Gauss-Jordan elimination reduces $[A\ \ I]$ to $[I\ \ A^{-1}]$ (see the sketch below)
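
A minimal Gauss-Jordan sketch in NumPy, assuming no row exchanges are needed (the 2 by 2 matrix is my own example):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Reduce the augmented matrix [A I] to [I A^-1] (no row exchanges handled here)."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        M[col] /= M[col, col]                   # scale the pivot row so the pivot is 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]  # clear the rest of the pivot column
    return M[:, n:]

A = np.array([[2.0, 1.0], [5.0, 3.0]])
print(gauss_jordan_inverse(A))
print(np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2)))
```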

2.6

  1. the cost of elimination is about $\frac{1}{3}n^{3}$ multiplications and subtractions; elimination is recorded as the factorization $A = LU$
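
A short SciPy sketch of the $A = LU$ factorization mentioned in the summary at the end (the example matrix is my own; SciPy may also introduce row exchanges, recorded in $P$):

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[2.0, 1.0, 1.0],
              [4.0, -6.0, 0.0],
              [-2.0, 7.0, 2.0]])

P, L, U = lu(A)                      # SciPy factors A = P L U (P records row exchanges)
print(L)                             # unit lower triangular: the elimination multipliers
print(U)                             # upper triangular: the pivots sit on its diagonal
print(np.allclose(P @ L @ U, A))
```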

2.7

  1. transpose : $(A^{T})_{ij} = A_{ji}$ and $(AB)^{T} = B^{T}A^{T}$
  2. if $A^{T} = A$, the matrix $A$ is symmetric
  3. a permutation matrix $P$ has a 1 in each row and column, and $P^{T} = P^{-1}$

3.1

  1. the space $\mathbf{R}^{n}$ consists of all column vectors $v$ with $n$ components
  2. $\mathbf{M}$ (2 by 2 matrices) and $\mathbf{F}$ (functions) and $\mathbf{Z}$ (zero vector alone) are vector spaces
  3. if $v$ and $w$ are vectors in the subspace and $c$ is any scalar, then
    • $v + w$ is in the subspace
    • $cv$ is in the subspace
  4. a subspace containing $v$ and $w$ must contain all linear combinations $cv + dw$
  5. the system $Ax = b$ is solvable if and only if $b$ is in the column space of $A$

3.2

  1. the nullspace $N(A)$ is a subspace of $\mathbf{R}^{n}$. It contains all solutions to $Ax = 0$
  2. solving $Ax = 0$ by elimination
  3. the complete solution to $Ax = 0$ is a combination of the special solutions
  4. if $n > m$ then $A$ has at least one column without a pivot, giving a special solution. So there are nonzero vectors in the nullspace of this rectangular $A$
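
A small SymPy sketch of the nullspace and its special solutions (the rectangular matrix is my own example):

```python
from sympy import Matrix

A = Matrix([[1, 3, 0, 2],
            [0, 0, 1, 4],
            [1, 3, 1, 6]])           # rank 2: row 3 = row 1 + row 2

print(A.rref())                      # reduced row echelon form and the pivot columns
for s in A.nullspace():              # one special solution per free column (n - r = 2 of them)
    print(s.T, (A * s).T)            # each special solution satisfies A s = 0
```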

3.3

  1. the rank $r$ of $A$ is the number of pivots
  2. those pivot columns are not combinations of earlier columns
  3. those free columns are combinations of earlier columns

3.4

  1. the rank is the number of pivots
  2. $Ax = b$ is solvable if and only if the last $m - r$ equations reduce to $0 = 0$
  3. complete solution : $x = x_{p} + x_{n}$, one particular solution plus any nullspace solution (see the sketch below)
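
A minimal NumPy/SciPy sketch of the complete solution as particular solution plus nullspace vector (the same rectangular matrix as in the nullspace sketch above, chosen by me):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 3.0, 0.0, 2.0],
              [0.0, 0.0, 1.0, 4.0],
              [1.0, 3.0, 1.0, 6.0]])          # rank 2: row 3 = row 1 + row 2
b = np.array([1.0, 6.0, 7.0])                 # solvable because b3 = b1 + b2

xp, *_ = np.linalg.lstsq(A, b, rcond=None)    # one particular solution x_p
N = null_space(A)                             # a basis for the nullspace (n - r = 2 vectors)
x = xp + N @ np.array([2.0, -3.0])            # add any combination of nullspace vectors
print(np.allclose(A @ x, b))                  # still solves Ax = b
```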

3.5

  1. the columns of $A$ are independent if $x = 0$ is the only solution to $Ax = 0$
  2. a basis consists of linearly independent vectors that span the space
  3. the pivot columns are one basis for the column space
  4. the number of vectors in a basis is the dimension of the space

3.6

  1. the column space $C(A)$ and the row space $C(A^{T})$ both have dimension $r$
  2. the nullspace $N(A)$ has dimension $n - r$
  3. the left nullspace $N(A^{T})$ has dimension $m - r$
  4. (dimension of column space) + (dimension of nullspace) $= r + (n - r) = n$ = dimension of $\mathbf{R}^{n}$
  5. every rank one matrix has the special form $A = uv^{T}$ = column times row

4.1

  1. $V$ and $W$ are orthogonal complements if $W$ contains all vectors perpendicular to $V$ (and vice versa)
  2. inside $\mathbf{R}^{n}$, the dimensions of complements $V$ and $W$ add to $n$
  3. $N(A)$ and $C(A^{T})$ are orthogonal complements; $N(A^{T})$ and $C(A)$ are orthogonal complements
  4. any $n$ independent vectors in $\mathbf{R}^{n}$ will span $\mathbf{R}^{n}$
  5. every $x$ in $\mathbf{R}^{n}$ splits into $x = x_{r} + x_{n}$ ( a row space vector plus a nullspace vector )

4.2

  1. the projection of $b$ onto the line through $a$ is $p = Pb$ when the matrix is $P = \dfrac{aa^{T}}{a^{T}a}$
  2. when $A$ has full rank $n$, the equation $A^{T}A\hat{x} = A^{T}b$ leads to $\hat{x}$ and $p = A\hat{x}$
  3. the projection matrix $P = A(A^{T}A)^{-1}A^{T}$ has $P^{2} = P$ and $P^{T} = P$
  4. when $A$ has independent columns, $A^{T}A$ is square, symmetric, and invertible
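
A short NumPy sketch of the projection matrix and its properties (the matrix $A$ and vector $b$ are my own example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                          # independent columns

P = A @ np.linalg.inv(A.T @ A) @ A.T                # projection onto the column space of A
b = np.array([6.0, 0.0, 0.0])
p = P @ b                                           # the projection of b
e = b - p                                           # the error vector

print(np.allclose(P @ P, P), np.allclose(P.T, P))   # P^2 = P and P^T = P
print(np.allclose(A.T @ e, 0))                      # the error is perpendicular to the columns of A
```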

4.3

  1. the best $\hat{x}$ comes from the normal equations $A^{T}A\hat{x} = A^{T}b$
  2. the least squares solution $\hat{x}$ makes $E = \|Ax - b\|^{2}$ as small as possible
  3. the partial derivatives of $\|Ax - b\|^{2}$ are zero when $A^{T}A\hat{x} = A^{T}b$
  4. the closest line $C + Dt$ has heights $p_{i}$ with errors $e_{i} = b_{i} - p_{i}$
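
A minimal least squares sketch in NumPy, fitting the closest line $C + Dt$ through three made-up points:

```python
import numpy as np

# Fit the closest line C + Dt through the points (0, 6), (1, 0), (2, 0)
t = np.array([0.0, 1.0, 2.0])
b = np.array([6.0, 0.0, 0.0])
A = np.column_stack([np.ones_like(t), t])      # columns: all-ones and t

x_hat = np.linalg.solve(A.T @ A, A.T @ b)      # normal equations A^T A x = A^T b
print(x_hat)                                   # C = 5, D = -3 for this data

p = A @ x_hat                                  # heights on the best line
e = b - p                                      # errors
print(np.allclose(A.T @ e, 0))                 # A^T e = 0
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # same answer from lstsq
```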

4.4

  1. when $Q$ is square, $Q^{T}Q = I$ means that $Q^{T} = Q^{-1}$ : transpose = inverse
  2. the length of $Qx$ equals the length of $x$ : $\|Qx\| = \|x\|$
  3. $Q$ preserves dot products : $(Qx)^{T}(Qy) = x^{T}y$
  4. if $Q$ is square then $P = QQ^{T} = I$ and every $b = q_{1}(q_{1}^{T}b) + \cdots + q_{n}(q_{n}^{T}b)$
  5. the Gram-Schmidt process
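
A minimal Gram-Schmidt sketch in Python (my own implementation and example, assuming the columns are independent):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the independent columns of A into the columns of Q."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # subtract the projection onto earlier q's
        Q[:, j] = v / np.linalg.norm(v)          # normalize to unit length
    return Q

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(3)))           # Q has orthonormal columns
```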

5.1

  1. the determinant is zero when the matrix has no inverse
  2. the determinant changes sign when two rows (or two columns) are exchanged
  3. the determinant of the n by n identity matrix is 1
  4. the determinant is a linear function of each row separately
  5. subtracting a multiple of one row from another row leaves det unchanged
  6. if $A$ is singular then $\det A = 0$. If $A$ is invertible then $\det A \neq 0$
    • $\det A = \pm$ (product of the pivots)
    • $(\det A)(\det A^{-1}) = \det I = 1$
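
These determinant properties can be spot-checked numerically; a small NumPy sketch with my own example matrices:

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])

print(np.linalg.det(np.eye(2)))                      # det I = 1
print(np.linalg.det(A[[1, 0]]), -np.linalg.det(A))   # exchanging two rows flips the sign

B = np.random.rand(2, 2)
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # det AB = (det A)(det B)
print(np.isclose(np.linalg.det(A) * np.linalg.det(np.linalg.inv(A)), 1.0))    # (det A)(det A^-1) = 1
```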

5.2

  1. if no row exchanges are involved, multiply the pivots to find the determinant
  2. the determinant of that upper left corner submatrix $A_{k}$ is $d_{1}d_{2}\cdots d_{k}$ (the product of the first $k$ pivots)
  3. det $A$ = sum over all $n!$ column permutations $P$ = $\sum (\det P)\, a_{1\alpha}a_{2\beta}\cdots a_{n\omega}$
  4. the cofactor expansion is $\det A = a_{i1}C_{i1} + a_{i2}C_{i2} + \cdots + a_{in}C_{in}$

5.3

  1. the volume of a box is $|\det A|$, when the box edges are the rows of $A$
  2. the cross product $u \times v$ is a vector with length $\|u\|\,\|v\|\,|\sin\theta|$. Its direction is perpendicular to $u$ and $v$
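
A quick NumPy check of the cross product and the box volume (the edge vectors are my own example):

```python
import numpy as np

u = np.array([3.0, 2.0, 0.0])
v = np.array([1.0, 4.0, 0.0])
w = np.array([0.0, 0.0, 2.0])

c = np.cross(u, v)                               # perpendicular to u and v
print(c, c @ u, c @ v)                           # both dot products are zero

A = np.vstack([u, v, w])                         # box edges as the rows of A
print(abs(np.linalg.det(A)))                     # volume of the box = |det A| = 20
```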

6.1

  1. the basic equation is $Ax = \lambda x$. The number $\lambda$ is an eigenvalue of $A$
  2. the number $\lambda$ is an eigenvalue of $A$ if and only if $A - \lambda I$ is singular : $\det(A - \lambda I) = 0$
  3. the eigenvalues of $A^{2}$ and $A^{-1}$ are $\lambda^{2}$ and $\lambda^{-1}$, with the same eigenvectors
  4. the product of the eigenvalues equals the determinant. The sum of the eigenvalues equals the sum of the diagonal entries (the trace)
  5. projections $P$, reflections $R$, rotations $Q$ have special eigenvalues $1, 0, -1, i, -i$. Singular matrices have $\lambda = 0$. Triangular matrices have their $\lambda$'s on the diagonal
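
A small NumPy sketch of eigenvalues and the trace/determinant checks (the singular example matrix is my own):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])                       # singular, so one eigenvalue is 0

lam, X = np.linalg.eig(A)                        # the basic equation A x = lambda x
print(lam)                                       # 0 and 5 (order may vary)
print(np.isclose(lam.prod(), np.linalg.det(A)))  # product of eigenvalues = determinant
print(np.isclose(lam.sum(), np.trace(A)))        # sum of eigenvalues = trace (diagonal sum)

x = X[:, 0]
print(np.allclose(A @ x, lam[0] * x))            # eigenvector check
```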

6.2

  1. If $A$ has $n$ independent eigenvectors, they go into the columns of $S$. $A$ is diagonalized by $S$ : $S^{-1}AS = \Lambda$ and $A = S\Lambda S^{-1}$
  2. Solution for $u_{k+1} = Au_{k}$ : $u_{k} = A^{k}u_{0} = S\Lambda^{k}S^{-1}u_{0}$
  3. $A$ is diagonalizable if every eigenvalue has enough (independent) eigenvectors
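
A short diagonalization sketch in NumPy (the Markov-style matrix is my own example):

```python
import numpy as np

A = np.array([[0.8, 0.3],
              [0.2, 0.7]])                           # eigenvalues 1 and 0.5

lam, S = np.linalg.eig(A)                            # eigenvalues and eigenvectors (columns of S)
Lam = np.diag(lam)
print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))    # A = S Lambda S^-1

# Powers are easy once A is diagonalized: A^k = S Lambda^k S^-1
k = 10
print(np.allclose(np.linalg.matrix_power(A, k), S @ np.diag(lam**k) @ np.linalg.inv(S)))
```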

6.3

  1. equations $\dfrac{du}{dt} = Au$ starting from the vector $u(0)$ at $t = 0$
  2. $u(t)$ approaches zero (stability) if every eigenvalue $\lambda$ has negative real part
  3. the solution is always $u(t) = e^{At}u(0)$, with the matrix exponential $e^{At} = I + At + \frac{(At)^{2}}{2!} + \cdots$
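
A minimal sketch of $u(t) = e^{At}u(0)$ using SciPy's matrix exponential (the stable example matrix and $u(0)$ are my own):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[-2.0, 1.0],
              [0.0, -1.0]])          # eigenvalues -2 and -1: negative real parts, so stable
u0 = np.array([1.0, 1.0])

t = 2.0
u_t = expm(A * t) @ u0               # u(t) = e^{At} u(0) solves du/dt = Au
print(u_t)                           # decaying toward zero as t grows
print(np.linalg.eigvals(A))
```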

6.4

  1. A symmetric matrix has only real eigenvalues
  2. The eigenvectors can be chosen orthonormal
  3. $A = Q\Lambda Q^{T}$ with an orthogonal eigenvector matrix $Q$ ( $Q^{T} = Q^{-1}$ )
  4. the number of positive eigenvalues of $A = A^{T}$ equals the number of positive pivots
  5. Every square matrix can be "triangularized" by $A = QTQ^{-1}$
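
A small NumPy sketch of the symmetric eigendecomposition $A = Q\Lambda Q^{T}$ (my own example matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                     # symmetric

lam, Q = np.linalg.eigh(A)                     # real eigenvalues, orthonormal eigenvectors
print(lam)                                     # 1 and 3, both real
print(np.allclose(Q.T @ Q, np.eye(2)))         # Q^T Q = I
print(np.allclose(A, Q @ np.diag(lam) @ Q.T))  # A = Q Lambda Q^T
```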

6.5

  1. Positive definite matrices are symmetric matrices that have positive eigenvalues
  2. The eigenvalues of $A = \begin{bmatrix} a & b \\ b & c \end{bmatrix}$ are positive if and only if $a > 0$ and $ac - b^{2} > 0$
  3. The eigenvalues of $A$ are positive if and only if the pivots are positive : $a > 0$ and $\dfrac{ac - b^{2}}{a} > 0$
  4. $A$ is positive definite if $x^{T}Ax > 0$ for every nonzero vector $x$
  5. $A = R^{T}R$ is automatically positive definite if $R$ has independent columns
  6. The ellipse $ax^{2} + 2bxy + cy^{2} = 1$ has its axes along the eigenvectors of $A$. Lengths $1/\sqrt{\lambda_{1}}$ and $1/\sqrt{\lambda_{2}}$
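
A short sketch of the positive definiteness tests in NumPy (my own example matrix; the Cholesky factorization stands in for the pivot test):

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])                    # symmetric candidate

print(np.linalg.eigvalsh(A))                   # eigenvalues 1 and 3: all positive

# Cholesky A = L L^T succeeds exactly when A is positive definite
L = np.linalg.cholesky(A)
print(np.allclose(L @ L.T, A))

x = np.random.randn(2)
print(x @ A @ x > 0)                           # x^T A x > 0 for any nonzero x
```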

6.6

  1. Let $M$ be any invertible matrix. Then $B = M^{-1}AM$ is similar to $A$
  2. Similar matrices have the same eigenvalues. Eigenvectors $x$ of $A$ are multiplied by $M^{-1}$ to give eigenvectors $M^{-1}x$ of $B$
  3. If $A$ has $n$ independent eigenvectors then $A$ is similar to $\Lambda$ (take $M = S$)

6.7

  1. in the SVD $A = U\Sigma V^{T}$, the numbers $\sigma_{1}^{2}, \ldots, \sigma_{r}^{2}$ are the nonzero eigenvalues of $AA^{T}$ and $A^{T}A$
  2. The orthonormal columns of $U$ and $V$ are eigenvectors of $AA^{T}$ and $A^{T}A$
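
A minimal SVD sketch in NumPy (the 2 by 2 example matrix is my own):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

U, sigma, Vt = np.linalg.svd(A)                          # A = U Sigma V^T
print(np.sort(sigma**2), np.linalg.eigvalsh(A.T @ A))    # sigma^2 are the eigenvalues of A^T A
print(np.allclose(A, U @ np.diag(sigma) @ Vt))
```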

7.1

  1. Combinations go to combinations : a linear transformation satisfies $T(cu + dv) = cT(u) + dT(v)$

7.2

  1. Suppose we know $T(v_{1}), \ldots, T(v_{n})$ for the basis vectors $v_{1}, \ldots, v_{n}$. Then linearity produces $T(v)$ for every other input vector $v$
  2. The $j$th column of $A$ is found by applying $T$ to the $j$th basis vector $v_{j}$ : $T(v_{j})$ = combination of basis vectors of the output space = $a_{1j}w_{1} + \cdots + a_{mj}w_{m}$
  3. If $A$ and $B$ represent $T$ and $S$, and the output basis for $S$ is the input basis for $T$, then the matrix $AB$ represents the transformation $T(S(u))$

7.3

  1. $A$ becomes the diagonal matrix $\Lambda$ when the input and output bases are eigenvectors of $A$
  2. $A$ becomes the diagonal matrix $\Sigma$ when those bases are eigenvectors of $A^{T}A$ and $AA^{T}$
  3. The SVD chooses an input basis of $v$'s and an output basis of $u$'s. Those orthonormal bases diagonalize $A$. This is $Av_{i} = \sigma_{i}u_{i}$, and $A = U\Sigma V^{T}$ in matrix form
  4. Every real square matrix can be factored into $A = QS$, where $Q$ is orthogonal and $S$ is symmetric positive semidefinite. If $A$ is invertible, $S$ is positive definite
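
A short sketch of this polar factorization using SciPy (my own example matrix):

```python
import numpy as np
from scipy.linalg import polar

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

Q, S = polar(A)                                          # polar decomposition A = Q S
print(np.allclose(Q.T @ Q, np.eye(2)))                   # Q is orthogonal
print(np.allclose(S, S.T), np.linalg.eigvalsh(S) > 0)    # S is symmetric with positive eigenvalues
print(np.allclose(A, Q @ S))
```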

Summary

  1. The first seven chapters mainly cover vectors, matrices, vector spaces, projections onto subspaces, the rank of a matrix, eigenvalues and eigenvectors, and linear transformations.
  2. Compared with the textbook issued by my school, the first four chapters cover largely the same material, only with fewer worked examples; the solution methods are simpler (for example, solving for the nullspace and the LU decomposition) and the wording is more precise.
  3. Chapters 5 to 7 contain a lot of material I had not studied before, so they were harder to read and my understanding of those topics is still fairly shallow.