[Final Review] Linear Algebra (Continued)

Linear Algebra and Its Applications, 3rd Edition

—— David C. Lay

 

Preface:

    This book builds up gradually from the basics; as a result its theorems can feel somewhat repetitive, and it does not go deep enough into the computer-graphics side of things.

(In this book, every vector is taken to start at the origin.)

Personal notes:

   (operation on the x_i)   (object operated on)   =   (result of the operation)
            A              ·          x            =            b

The number of rows of A limits the maximum possible dimension of the result; the column vectors of A determine the dimension actually reached.
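A minimal numpy sketch of this idea (the 3 x 2 matrix below is made up for illustration): the number of rows fixes the space the result lives in, while the rank of the columns gives the dimension that is actually reached.

```python
import numpy as np

# Made-up 3x2 matrix: results live in R^3 (3 rows), but the two columns
# only reach a 2-dimensional subspace of R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
x = np.array([2.0, 5.0])

b = A @ x                          # b is a vector in R^3
print(b)                           # [2. 5. 0.]
print(np.linalg.matrix_rank(A))    # 2: the dimension actually spanned by the columns
```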

 

Chapter 1 : Systems of Linear Equations

Definition:

Several equations of the form a1x1 + a2x2 + … + anxn = b taken together.

 

· Possible solution sets:

Inconsistent ->  1. no solution

Consistent   ->  2. exactly one solution

                 3. infinitely many solutions

 

Matrix Notation:

For Ax = b:   coefficient matrix   [A]

              augmented matrix     [A b]

 

Elementary Row Operations

1. Replacement: replace one row by the sum of itself and a multiple of another row.

2. Interchange: swap two rows.

3. Scaling: multiply every entry of a row by a nonzero constant.  (A numpy sketch of all three follows below.)
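A short numpy sketch of the three operations, applied in place to a made-up matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])

# 1. Replacement: row 1 <- row 1 + (-4) * row 0
A[1] = A[1] - 4 * A[0]

# 2. Interchange: swap rows 0 and 2
A[[0, 2]] = A[[2, 0]]

# 3. Scaling: multiply row 0 by a nonzero constant
A[0] = 0.5 * A[0]

print(A)
```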

Fundamental Problem: Existence and Uniqueness

 

Basic concepts:

      echelon form, reduced echelon form, leading entry, pivot position, pivot column, pivot, free variable, basic variable

 

Theorem 1:

   Each matrix is row equivalent to one and only one reduced echelon matrix.

 

Row Reduction Algorithm:

   Forward phase, backward phase

Result: an explicit general solution.

Parametric vector form of the solution set (by convention, the free variables serve as the parameters).
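As a sketch, sympy's rref() performs both phases and reports the pivot columns, from which the free variables (the parameters of the general solution) can be read off; the augmented matrix below is made up.

```python
from sympy import Matrix

# Made-up augmented matrix [A | b] of a consistent system with one free variable.
M = Matrix([[1, 3, 1, 9],
            [1, 1, -1, 1],
            [3, 11, 5, 35]])

R, pivot_cols = M.rref()     # reduced echelon form and the indices of the pivot columns
print(R)
print(pivot_cols)            # (0, 1): columns 0 and 1 are pivot columns,
                             # so x3 (column 2) is a free variable, i.e. a parameter
```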

 

Theorem 2:

   A linear system is consistent  <=>  the rightmost column of the augmented matrix is not a pivot column.

   If the system is consistent, then  (i) no free variables  =>  a unique solution;

                                      (ii) one or more free variables  =>  infinitely many solutions.

 

Column vector: a matrix with a single column, called simply a vector.

(Note: (a, b) and the column form are just two notations for the same vector.)

Two vectors are equal iff their corresponding entries are equal.

 

Linear combination:

y = c1v1 + … + cpvp is a linear combination of v1, …, vp with weights c1, …, cp.

 

Span{v1, …, vp}:

the set of all linear combinations of v1, …, vp, called the subset of Rn spanned (generated) by v1, …, vp.

 

Theorem 3:

       Let A be an m x n matrix. Then the following statements are logically equivalent.

a. For each b in Rm, Ax = b has a solution.

b. Each b in Rm is a linear combination of the columns of A.

c. The columns of A span Rm.

d. A has a pivot position in every row.  (A quick check of (d) is sketched below.)
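A small check of statement (d) with sympy (the matrix is made up): the columns span Rm exactly when row reduction puts a pivot in every row.

```python
from sympy import Matrix

A = Matrix([[1, 0, 2],
            [0, 1, 3]])      # made-up 2x3 matrix

_, pivot_cols = A.rref()
m = A.rows
print(len(pivot_cols) == m)  # True: a pivot in every row, so the columns span R^2
```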

 

Homogeneous Linear Systems:

   Ax = 0

   always has the trivial solution (the zero solution)

 

Nonhomogeneous linear systems:

   solution set  =  one particular solution  +  the general solution of the corresponding homogeneous system Ax = 0

 

Linear independence:

Ax = 0 has only the trivial solution;

otherwise the vectors are linearly dependent (linear dependence).

 

Theorem 4:

   Any set {v1, …, vp} in Rn is linearly dependent if p > n.

Theorem 5:

   If a set contains the zero vector, then the set is linearly dependent.

 

For T: Rm -> Rn, Rn is the codomain;

           the set of all T(v) is the range.

 

Matrix Transformation:

   Transforming the basis vectors transforms the whole figure.

Linear transformation:

1. T(cu) = cT(u)   (scalar multiplication)

2. T(u + v) = T(u) + T(v)   (additivity)

Superposition principle:

   T(c1v1 + … + cpvp) = c1T(v1) + … + cpT(vp)

Rotation Transformation:

   [cos α   -sin α]
   [sin α    cos α]     rotates a vector through the angle α
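A minimal numpy sketch of the rotation matrix acting on e1 (the angle is chosen arbitrarily):

```python
import numpy as np

alpha = np.pi / 2                       # rotate by 90 degrees
R = np.array([[np.cos(alpha), -np.sin(alpha)],
              [np.sin(alpha),  np.cos(alpha)]])

v = np.array([1.0, 0.0])                # e1
print(R @ v)                            # approximately [0, 1]: e1 rotated onto e2
```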

One-to-one (injective): each image in the codomain has at most one preimage,

i.e. Ax = b has at most one solution, i.e. Ax = 0 has only the trivial solution.

Onto (surjective): every element of the codomain is an image (codomain = range).

 

Chapter 3 : Determinants (square matrices)

Definition (omitted)

Cofactor expansion:     Cij = (-1)^(i+j) · det Aij

Theorem 1:

   If A is a triangular matrix, then det A is the product of the entries on the main diagonal of A.

Theorem 2: Row operations

   Suppose A is changed into B by a single row operation:

1.  Replacement:  det B = det A

2.  Interchange:  det B = -det A

3.  Scaling (by k):  det B = k · det A

Theorem 3:

   A is invertible iff det A ≠ 0

Theorem 4:

   det A^T = det A

Theorem 5:

   det(AB) = (det A)(det B)

Theorem 6 (Cramer's Rule):

   If A is an invertible n x n matrix, then for any b in Rn the unique solution of Ax = b has entries  xi = det Ai(b) / det A,  where Ai(b) is A with column i replaced by b.
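A small numpy sketch of Cramer's rule on a made-up 2 x 2 system, compared against the direct solver:

```python
import numpy as np

A = np.array([[3.0, -2.0],
              [-5.0, 4.0]])
b = np.array([6.0, 8.0])

det_A = np.linalg.det(A)
x = np.empty(2)
for i in range(2):
    Ai = A.copy()
    Ai[:, i] = b                   # A_i(b): A with column i replaced by b
    x[i] = np.linalg.det(Ai) / det_A

print(x)                           # approximately [20. 27.]
print(np.linalg.solve(A, b))       # agrees with Cramer's rule
```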

Theorem 7 (inverse formula):

   A^(-1) = (1 / det A) · adj A,  where adj A is the adjugate, whose (j, i) entry is the cofactor Cij.

Theorem 8:

   If A is a 2 x 2 matrix, |det A| is the area of the parallelogram determined by the columns of A.

   If A is a 3 x 3 matrix, |det A| is the volume of the parallelepiped determined by the columns of A.

Theorem 9:

   Let T: R2 -> R2 be the linear transformation determined by a 2 x 2 matrix A. Then for any region S,

   {area of T(S)} = |det A| · {area of S}

Likewise, for T: R3 -> R3 determined by a 3 x 3 matrix, volumes are scaled by |det A|.

 

Chapter 4 : Vector Space

Definition :

   A nonempty set V of vectors on which two operations are defined, addition and multiplication by scalars, and under which V is closed (together with the usual axioms).

 

Subspace :

a. The zero vector of V is in H.

b. H is closed under addition.

c. H is closed under multiplication by scalars.

If H is a subspace of V, then (1) H is itself a vector space and (2) dim H <= dim V.

Special case:   {0}, the zero subspace

Theorem: the span of any set of vectors in V is a subspace H of V.

 

Column Space :

   If A = [a1 … an], then Col A = Span{a1, …, an}

Null Space:

   Nul A = { x : x is in Rn and Ax = 0 }

 

Integration Theorem:

1、 The null space of an m*n matrix A is a subspace of Rn.

    The column space of an m*n matrix A is a subspace of Rm.

2、 dim Nul A = the number of free variables

    dim Col A = the number of pivot columns

3、 (Rank Theorem) If A is an m*n matrix, then

    rank A + dim Nul A = n    (dim Col A = rank A;  kernel == null space)

 

Imply ->

dim Nul A   <->  free variables   <->  linear dependence relations

dim Col A   <->  pivot columns    <->  linearly independent columns
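A quick numerical check of the Rank Theorem with sympy (the matrix is made up): rank A plus dim Nul A should equal the number of columns n.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [2, 4, 6, 8],
            [0, 1, 1, 1]])           # made-up 3x4 matrix

rank = A.rank()
nullity = len(A.nullspace())         # dim Nul A = number of basis vectors of Nul A
print(rank, nullity)                 # 2 2
print(rank + nullity == A.cols)      # True: rank A + dim Nul A = n
```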

 

Comment: Elementary row operations don't affect the linear dependence relations among the columns of the matrix.

 

Basis : ~Column Space

   A spanning set that is as small as possible;

   a linearly independent set that is as large as possible.

 

Coordinate Map:

Every x in V can be represented uniquely in terms of a basis. The coordinate mapping is a one-to-one linear transformation; in other words, an isomorphism.

 

Row-equivalent matrices have the same row space; the nonzero rows of an echelon form of a matrix form a basis for its row space.

 

Change of Basis:

   P_{C<-B} = [ [b1]_C … [bn]_C ]

Express each vector of the old basis B in terms of the new basis C; the weights are the new coordinates.

Algorithm (solve all columns simultaneously):

   [c1 … cn | b1 … bn]  ~  [I | P_{C<-B}]
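A sketch of this algorithm in sympy, with two made-up bases of R2: row reducing [c1 c2 | b1 b2] leaves P_{C<-B} in the right half.

```python
from sympy import Matrix

# Made-up bases B = {b1, b2} and C = {c1, c2} of R^2.
b1, b2 = Matrix([1, 2]), Matrix([3, 4])
c1, c2 = Matrix([1, 0]), Matrix([1, 1])

M = Matrix.hstack(c1, c2, b1, b2)    # [c1 c2 | b1 b2]
R, _ = M.rref()
P = R[:, 2:]                         # P_{C<-B}
print(P)

# Check: the first column of P is [b1]_C, so C * [b1]_C should give back b1.
print(Matrix.hstack(c1, c2) * P[:, 0])   # Matrix([[1], [2]]) == b1
```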

Application -> Markov Chain

 


Chapter 5 : Eigenvalues and Eigenvectors (square matrix)

Although A may move vectors in many different directions, there are usually some special vectors on which the action of A is particularly simple.

Definition :

      An eigenvector of an n*n matrix A is a nonzero vector x such that Ax = λx for some scalar λ; such a λ is called an eigenvalue of A, and x is an eigenvector corresponding to λ.

Ax = λx has a nontrivial solution; the space spanned by the eigenvectors for λ is the eigenspace corresponding to λ.

Theorem 1 :

   All the eigenvalues of a triangular matrix are the entries on its main diagonal.

Theorem 2 :

   If v1, …, vr are eigenvectors that correspond to distinct eigenvalues λ1, …, λr of an n*n matrix A, then the set {v1, …, vr} is linearly independent.

(Eigenspaces corresponding to distinct eigenvalues are distinct.)

 

IMT (Continued):

A is invertible iff 0 is not an eigenvalue of A.

 

The Characteristic Equation :

      det(A - λI) = 0

<=>  λ is an eigenvalue of the n*n matrix A
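A minimal numpy check on a made-up matrix: np.linalg.eig returns the roots of the characteristic equation together with corresponding eigenvectors.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                            # 5 and 2: the roots of det(A - λI) = 0

# Each column of eigvecs is an eigenvector: A v = λ v
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))    # True
```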

 

Algebraic Multiplicity:

   Its multiplicity as a root of the characteristic equation.

 

Similarity :

If A = PBP^(-1), then A and B are similar.

Changing A into B is called a similarity transformation.

 

Theorem 3 :

   If n*n matrices A and B are similar, then they have the same eigenvalues (with the same multiplicities).

 

Theorem 4: The Diagonalization Theorem

   An n*n matrix A is diagonalizable iff A has n linearly independent eigenvectors (which form an eigenvector basis). In this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P, and P must be invertible.

   Diagonalization: A has n linearly independent eigenvectors, which form an eigenvector basis.

Goal:  A = PDP^(-1),

 D is a diagonal matrix whose main-diagonal entries are the eigenvalues of A;

 the columns of P are the eigenvectors corresponding to the eigenvalues in the corresponding columns of D; check that they are linearly independent;

 verify  AP ?= PD
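A sketch of these verification steps with numpy on a made-up matrix: build P from the eigenvectors and D from the eigenvalues, then check that P is invertible and that AP = PD.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)        # columns of P are eigenvectors of A
D = np.diag(eigvals)                 # diagonal matrix of the eigenvalues

# P must be invertible, i.e. the eigenvectors are linearly independent.
print(np.linalg.matrix_rank(P) == A.shape[0])     # True

# Verify AP = PD, i.e. A = P D P^(-1).
print(np.allclose(A @ P, P @ D))                  # True
print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True
```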

 

Theorem 5: (combining Theorems 2 and 4)

   An n*n matrix with n distinct eigenvalues is diagonalizable.

 

Theorem 6: Diagonal Matrix Representation

   Suppose A = PDP^(-1), where D is a diagonal n*n matrix. If β is the basis for Rn formed from the columns of P, then D is the β-matrix for the transformation x |-> Ax.

 

Chapter 6 : Orthogonality and Least Squares

Definition

      u · v = u^T v

(Definition) If u · v = 0, then u and v are orthogonal.

Theorem 1:

            (Row A)^⊥ = Nul A

Orthogonal set: a set of vectors that are mutually orthogonal.

Theorem 2

   Let {u1, …, up} be an orthogonal basis for a subspace W of Rn. For each y in W,

         y = c1u1 + … + cpup ,  where  ci = (y · ui) / (ui · ui)

Orthogonal Projection:      y = ŷ + z,  with ŷ in W and z in W^⊥

the projection of y onto W is written proj_W y:

proj_W y = c1u1 + … + cpup ,  where  ci = (y · ui) / (ui · ui)
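A small numpy sketch of the projection formula, using a made-up orthogonal basis {u1, u2} of a plane W in R3:

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])       # made-up orthogonal basis of W
u2 = np.array([1.0, -1.0, 0.0])
print(u1 @ u2)                       # 0.0: the basis is orthogonal

y = np.array([2.0, 3.0, 4.0])

# proj_W y = sum_i ((y . u_i) / (u_i . u_i)) * u_i
proj = sum((y @ u) / (u @ u) * u for u in (u1, u2))
print(proj)                          # [2. 3. 0.]
print((y - proj) @ u1, (y - proj) @ u2)   # both 0: y - proj_W y is orthogonal to W
```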

Theorem 3:

      A matrix U has orthonormal columns iff U^T U = I

Property:  ||Ux|| = ||x||

   When a square matrix A has orthonormal columns (an orthogonal matrix), A^T = A^(-1).

The Best Approximation Theorem:

   ŷ = proj_W y is the closest point in W to y, i.e. ||y - ŷ|| < ||y - v|| for every v in W with v ≠ ŷ.

Theorem 5: QR Factorization

   If A is an m x n matrix with linearly independent columns, then A = QR, where Q is an m x n matrix whose columns form an orthonormal basis for Col A and R is an n x n upper triangular invertible matrix with positive diagonal entries.
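A quick numerical illustration with numpy's built-in QR routine (the matrix is made up): Q^T Q = I confirms the orthonormal columns, and QR reproduces A.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])           # made-up matrix with linearly independent columns

Q, R = np.linalg.qr(A)               # Q has orthonormal columns, R is upper triangular
print(np.allclose(Q.T @ Q, np.eye(2)))   # True
print(np.allclose(A, Q @ R))             # True
```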
