Chapter 6 (Orthogonality and Least Squares): Inner Product, Length, and Orthogonality

This post contains my reading notes for *Linear Algebra and Its Applications*.

The Inner Product

  • The number $\boldsymbol u^T \boldsymbol v$ is called the inner product of $\boldsymbol u$ and $\boldsymbol v$, and often it is written as $\boldsymbol u\cdot \boldsymbol v$. This inner product is also referred to as a dot product.
    If $\boldsymbol u$ and $\boldsymbol v$ are vectors in $\mathbb R^n$, then
    $$\boldsymbol u\cdot \boldsymbol v = \boldsymbol u^T \boldsymbol v = u_1v_1 + u_2v_2 + \cdots + u_nv_n$$

THEOREM 1  Let $\boldsymbol u$, $\boldsymbol v$, and $\boldsymbol w$ be vectors in $\mathbb R^n$, and let $c$ be a scalar. Then
  a. $\boldsymbol u\cdot\boldsymbol v = \boldsymbol v\cdot\boldsymbol u$
  b. $(\boldsymbol u + \boldsymbol v)\cdot\boldsymbol w = \boldsymbol u\cdot\boldsymbol w + \boldsymbol v\cdot\boldsymbol w$
  c. $(c\boldsymbol u)\cdot\boldsymbol v = c(\boldsymbol u\cdot\boldsymbol v) = \boldsymbol u\cdot(c\boldsymbol v)$
  d. $\boldsymbol u\cdot\boldsymbol u \ge 0$, and $\boldsymbol u\cdot\boldsymbol u = 0$ if and only if $\boldsymbol u = \boldsymbol 0$

  • Properties (b) and (c) can be combined several times to produce the following useful rule (a quick numerical check follows below):
    $$(c_1\boldsymbol u_1 + \cdots + c_p\boldsymbol u_p)\cdot\boldsymbol w = c_1(\boldsymbol u_1\cdot\boldsymbol w) + \cdots + c_p(\boldsymbol u_p\cdot\boldsymbol w)$$
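As a quick sanity check, here is a minimal NumPy sketch (not from the book; the vectors, scalars, and values are made up for illustration) verifying that $\boldsymbol u^T\boldsymbol v$ agrees with the dot product and that the rule above holds:

```python
import numpy as np

# Made-up vectors and scalars, used only to check the identities above.
u1 = np.array([2.0, -5.0, 1.0])
u2 = np.array([3.0, 2.0, -3.0])
w = np.array([1.0, 4.0, 2.0])
c1, c2 = 2.0, -3.0

# u^T w computed as a matrix-style product agrees with the dot product.
print(u1 @ w, np.dot(u1, w))  # both give 2*1 + (-5)*4 + 1*2 = -16

# (c1 u1 + c2 u2) . w == c1 (u1 . w) + c2 (u2 . w)
lhs = (c1 * u1 + c2 * u2) @ w
rhs = c1 * (u1 @ w) + c2 * (u2 @ w)
print(np.isclose(lhs, rhs))   # True
```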

The Length of a Vector

DEFINITION  The length (or norm) of $\boldsymbol v$ is the nonnegative scalar $\left\|\boldsymbol v\right\|$ defined by
$$\left\|\boldsymbol v\right\| = \sqrt{\boldsymbol v\cdot\boldsymbol v} = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2},\qquad \left\|\boldsymbol v\right\|^2 = \boldsymbol v\cdot\boldsymbol v$$

  • For any scalar $c$, the length of $c\boldsymbol v$ is $|c|$ times the length of $\boldsymbol v$. That is,
    $$\left\|c\boldsymbol v\right\| = |c|\left\|\boldsymbol v\right\|$$
  • A vector whose length is $1$ is called a unit vector. If we divide a nonzero vector $\boldsymbol v$ by its length (that is, multiply by $1/\left\|\boldsymbol v\right\|$), we obtain a unit vector $\boldsymbol u$. This process is sometimes called normalizing $\boldsymbol v$ (a small sketch follows below).
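A minimal NumPy sketch of normalization, assuming made-up values for $\boldsymbol v$ and $c$:

```python
import numpy as np

# Normalizing a vector and checking ||c v|| = |c| ||v||.
v = np.array([1.0, -2.0, 2.0, 0.0])
length = np.linalg.norm(v)      # sqrt(v . v) = sqrt(1 + 4 + 4 + 0) = 3
u = v / length                  # unit vector pointing in the same direction as v

print(length)                   # 3.0
print(np.linalg.norm(u))        # 1.0
c = -4.0
print(np.isclose(np.linalg.norm(c * v), abs(c) * length))  # True
```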

Distance in $\mathbb R^n$

  • Recall that if $a$ and $b$ are real numbers, the distance on the number line between $a$ and $b$ is the number $|a - b|$. This definition of distance in $\mathbb R$ has a direct analogue in $\mathbb R^n$.

DEFINITION  For $\boldsymbol u$ and $\boldsymbol v$ in $\mathbb R^n$, the distance between $\boldsymbol u$ and $\boldsymbol v$, written as $\mathrm{dist}(\boldsymbol u, \boldsymbol v)$, is the length of the vector $\boldsymbol u - \boldsymbol v$. That is,
$$\mathrm{dist}(\boldsymbol u, \boldsymbol v) = \left\|\boldsymbol u - \boldsymbol v\right\|$$

  • In $\mathbb R^2$ and $\mathbb R^3$, this definition of distance coincides with the usual formulas for the Euclidean distance between two points, as the check below illustrates.
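A small NumPy check (with made-up points) that $\mathrm{dist}(\boldsymbol u,\boldsymbol v)=\left\|\boldsymbol u-\boldsymbol v\right\|$ matches the coordinate formula in $\mathbb R^2$:

```python
import numpy as np

# dist(u, v) = ||u - v||; in R^2 this is the familiar Euclidean formula.
u = np.array([7.0, 1.0])
v = np.array([3.0, 2.0])

print(np.linalg.norm(u - v))                 # sqrt(17) ≈ 4.1231
print(np.sqrt((7 - 3) ** 2 + (1 - 2) ** 2))  # same value, computed coordinate-wise
```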

Orthogonal Vectors


  • The rest of this chapter depends on the fact that the concept of perpendicular lines in ordinary Euclidean geometry has an analogue in $\mathbb R^n$.
  • Consider $\mathbb R^2$ or $\mathbb R^3$ and two lines through the origin determined by vectors $\boldsymbol u$ and $\boldsymbol v$. The two lines shown in Figure 5 are geometrically perpendicular if and only if the distance from $\boldsymbol u$ to $\boldsymbol v$ is the same as the distance from $\boldsymbol u$ to $-\boldsymbol v$.
    [Figure 5: the lines through $\boldsymbol 0$ determined by $\boldsymbol u$ and $\boldsymbol v$, together with the points $\boldsymbol v$ and $-\boldsymbol v$]
    This is the same as requiring the squares of the distances to be equal. Now
    $$[\mathrm{dist}(\boldsymbol u, -\boldsymbol v)]^2 = \left\|\boldsymbol u + \boldsymbol v\right\|^2 = \left\|\boldsymbol u\right\|^2 + \left\|\boldsymbol v\right\|^2 + 2\boldsymbol u\cdot\boldsymbol v$$
    $$[\mathrm{dist}(\boldsymbol u, \boldsymbol v)]^2 = \left\|\boldsymbol u - \boldsymbol v\right\|^2 = \left\|\boldsymbol u\right\|^2 + \left\|\boldsymbol v\right\|^2 - 2\boldsymbol u\cdot\boldsymbol v$$
    The two squared distances are equal if and only if $2\boldsymbol u\cdot\boldsymbol v = -2\boldsymbol u\cdot\boldsymbol v$, which happens if and only if $\boldsymbol u\cdot\boldsymbol v = 0$.

  • The following definition generalizes to $\mathbb R^n$ this notion of perpendicularity (or orthogonality).
    DEFINITION  Two vectors $\boldsymbol u$ and $\boldsymbol v$ in $\mathbb R^n$ are orthogonal (to each other) if $\boldsymbol u\cdot\boldsymbol v = 0$.

Observe that the zero vector is orthogonal to every vector in $\mathbb R^n$, because $\boldsymbol 0^T\boldsymbol v = 0$ for all $\boldsymbol v$.
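A short NumPy sketch of the orthogonality test and the equal-distance equivalence above, using made-up vectors: an orthogonal pair is equidistant from $\boldsymbol v$ and $-\boldsymbol v$, a non-orthogonal pair is not.

```python
import numpy as np

u = np.array([3.0, 1.0])
v = np.array([-1.0, 3.0])       # u . v = 3*(-1) + 1*3 = 0

print(np.dot(u, v))                                              # 0.0: u and v are orthogonal
# dist(u, -v) = ||u + v||, so the test below compares the two distances.
print(np.isclose(np.linalg.norm(u - v), np.linalg.norm(u + v)))  # True

w = np.array([1.0, 1.0])        # not orthogonal to u
print(np.isclose(np.linalg.norm(u - w), np.linalg.norm(u + w)))  # False
```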


The Pythagorean Theorem

THEOREM 2  Two vectors $\boldsymbol u$ and $\boldsymbol v$ are orthogonal if and only if $\left\|\boldsymbol u + \boldsymbol v\right\|^2 = \left\|\boldsymbol u\right\|^2 + \left\|\boldsymbol v\right\|^2$.
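A one-pair numerical check of the Pythagorean Theorem, with vectors chosen only for illustration:

```python
import numpy as np

u = np.array([3.0, 1.0, 0.0])
v = np.array([-1.0, 3.0, 2.0])
assert np.isclose(np.dot(u, v), 0.0)   # u and v are orthogonal

lhs = np.linalg.norm(u + v) ** 2       # ||u + v||^2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))            # True
```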

Orthogonal Complements


  • If a vector z \boldsymbol z z is orthogonal to every vector in a subspace W \mathbb W W of R n \mathbb R^n Rn, then z \boldsymbol z z is said to be orthogonal to W \mathbb W W . The set of all vectors z \boldsymbol z z that are orthogonal to W \mathbb W W is called the orthogonal complement of W \mathbb W W and is denoted by W ⊥ \mathbb W^\perp W (and read as “ W \mathbb W W perpendicular” or simply “ W \mathbb W W perp”).

Two useful facts about $W^\perp$:
  1. A vector $\boldsymbol x$ is in $W^\perp$ if and only if $\boldsymbol x$ is orthogonal to every vector in a set that spans $W$.
  2. $W^\perp$ is a subspace of $\mathbb R^n$.

$$(\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A \qquad\text{and}\qquad (\mathrm{Col}\,A)^\perp = \mathrm{Nul}\,A^T$$

THEOREM 3  Let $A$ be an $m\times n$ matrix. The orthogonal complement of the row space of $A$ is the null space of $A$, and the orthogonal complement of the column space of $A$ is the null space of $A^T$:
$$(\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A \qquad\text{and}\qquad (\mathrm{Col}\,A)^\perp = \mathrm{Nul}\,A^T$$

In fact, Theorem 3 is sometimes called the Fundamental Theorem of Linear Algebra.

PROOF

  • The row–column rule for computing $A\boldsymbol x$ shows that if $\boldsymbol x$ is in $\mathrm{Nul}\,A$, then $\boldsymbol x$ is orthogonal to each row of $A$ (with the rows treated as vectors in $\mathbb R^n$), because each entry of $A\boldsymbol x = \boldsymbol 0$ is the inner product of a row of $A$ with $\boldsymbol x$. Since the rows of $A$ span the row space, $\boldsymbol x$ is orthogonal to every vector in $\mathrm{Row}\,A$. Conversely, if $\boldsymbol x$ is orthogonal to $\mathrm{Row}\,A$, then $\boldsymbol x$ is in particular orthogonal to each row of $A$, and hence $A\boldsymbol x = \boldsymbol 0$, so $\boldsymbol x$ is in $\mathrm{Nul}\,A$.
  • This proves the first statement of the theorem. Since this statement is true for any matrix, it is true for $A^T$. That is, the orthogonal complement of the row space of $A^T$ is the null space of $A^T$. This proves the second statement, because $\mathrm{Row}\,A^T = \mathrm{Col}\,A$.
    [Figure: the four fundamental subspaces, with $\mathrm{Nul}\,A \perp \mathrm{Row}\,A$ in $\mathbb R^n$ and $\mathrm{Nul}\,A^T \perp \mathrm{Col}\,A$ in $\mathbb R^m$]
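A numerical illustration of Theorem 3 (the matrix and the `null_space_basis` helper are my own, not from the text): null-space vectors computed from the SVD are orthogonal to the rows of $A$, and null-space vectors of $A^T$ are orthogonal to the columns of $A$.

```python
import numpy as np

def null_space_basis(M, tol=1e-10):
    """Columns form a basis for Nul M, computed from the SVD (helper for this sketch)."""
    _, s, vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

# A rank-2 matrix chosen only for illustration.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

N = null_space_basis(A)        # basis of Nul A
Nt = null_space_basis(A.T)     # basis of Nul A^T

# Each row of A is orthogonal to every basis vector of Nul A, so Row A ⟂ Nul A.
print(np.allclose(A @ N, 0.0))      # True
# Each column of A is orthogonal to every basis vector of Nul A^T, so Col A ⟂ Nul A^T.
print(np.allclose(A.T @ Nt, 0.0))   # True
```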

$$\dim W + \dim W^\perp = n$$

EXAMPLE

Let $W$ be a subspace of $\mathbb R^n$; then $W^\perp$ is also a subspace of $\mathbb R^n$. Prove that $\dim W + \dim W^\perp = n$.

SOLUTION

  • If $W \neq \{\boldsymbol 0\}$, let $\{\boldsymbol b_1,\dots,\boldsymbol b_p\}$ be a basis for $W$. Let $A$ be the $p \times n$ matrix having rows $\boldsymbol b_1^T,\dots,\boldsymbol b_p^T$. It follows that $W$ is the row space of $A$. Theorem 3 implies that $W^\perp = (\mathrm{Row}\,A)^\perp = \mathrm{Nul}\,A$ and hence $\dim W^\perp = \dim \mathrm{Nul}\,A$. Thus, by the Rank Theorem, $\dim W + \dim W^\perp = \dim \mathrm{Row}\,A + \dim \mathrm{Nul}\,A = n$.
  • If $W = \{\boldsymbol 0\}$, then $W^\perp = \mathbb R^n$, and the result follows because $\dim\{\boldsymbol 0\} = 0$ and $\dim\mathbb R^n = n$. (A numerical illustration of the first case is sketched below.)
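A numerical sketch of the argument above, with a made-up basis $\{\boldsymbol b_1,\boldsymbol b_2\}$ of a plane $W$ in $\mathbb R^5$: stacking the basis into the rows of $A$ and reading off $\mathrm{Nul}\,A$ from the SVD gives $\dim W + \dim W^\perp = n$.

```python
import numpy as np

# Put a basis of W into the rows of A, so that W = Row A and W^perp = Nul A.
b1 = np.array([1.0, 0.0, 2.0, -1.0, 3.0])
b2 = np.array([0.0, 1.0, 1.0, 1.0, 0.0])
A = np.vstack([b1, b2])                        # 2 x 5 matrix with rows b1^T, b2^T
n = A.shape[1]

_, s, vt = np.linalg.svd(A)
dim_W = int(np.sum(s > 1e-10))                 # dim Row A (the rank of A)
null_basis = vt[dim_W:]                        # rows spanning Nul A = W^perp
dim_W_perp = null_basis.shape[0]

print(np.allclose(A @ null_basis.T, 0.0))            # True: these vectors really lie in W^perp
print(dim_W, dim_W_perp, dim_W + dim_W_perp == n)    # 2 3 True
```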

EXERCISES

Explain why an equation $A\boldsymbol x = \boldsymbol b$ has a solution if and only if $\boldsymbol b$ is orthogonal to all solutions of the equation $A^T\boldsymbol x = \boldsymbol 0$.

SOLUTION

  • If $A\boldsymbol x = \boldsymbol b$ has a solution $\boldsymbol x$, then taking transposes gives $\boldsymbol x^T A^T = \boldsymbol b^T$. Let $\boldsymbol x'$ be any solution of $A^T\boldsymbol x = \boldsymbol 0$. Then $\boldsymbol b^T\boldsymbol x' = \boldsymbol x^T A^T\boldsymbol x' = \boldsymbol x^T(A^T\boldsymbol x') = 0$, that is, $\boldsymbol b\cdot\boldsymbol x' = 0$. Thus $\boldsymbol b$ is orthogonal to every solution of $A^T\boldsymbol x = \boldsymbol 0$.
  • Conversely, if $\boldsymbol b$ is orthogonal to all solutions of $A^T\boldsymbol x = \boldsymbol 0$, then $\boldsymbol b \in (\mathrm{Nul}\,A^T)^\perp = ((\mathrm{Col}\,A)^\perp)^\perp = \mathrm{Col}\,A$ by Theorem 3. Thus $A\boldsymbol x = \boldsymbol b$ has a solution. (A numerical illustration follows below.)
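A numerical illustration of this exercise (the matrix and vectors are made up): a right-hand side built from $\mathrm{Col}\,A$ is orthogonal to $\mathrm{Nul}\,A^T$, while one with a $\mathrm{Nul}\,A^T$ component is not.

```python
import numpy as np

# A x = b is consistent exactly when b is orthogonal to every solution of A^T x = 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])                 # rank 2, so Col A is a plane in R^3

# Basis of Nul A^T: the left singular vectors of A beyond its rank span (Col A)^perp.
u_svd, s, _ = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
Nt = u_svd[:, rank:]                       # columns span Nul A^T

b_good = A @ np.array([1.0, 1.0])          # lies in Col A, so A x = b_good is consistent
b_bad = b_good + Nt[:, 0]                  # pushed out of Col A by a Nul A^T component

print(np.allclose(Nt.T @ b_good, 0.0))     # True:  b_good ⟂ Nul A^T
print(np.allclose(Nt.T @ b_bad, 0.0))      # False: A x = b_bad has no solution
```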

Angles in $\mathbb R^2$ and $\mathbb R^3$

  • If $\boldsymbol u$ and $\boldsymbol v$ are nonzero vectors in either $\mathbb R^2$ or $\mathbb R^3$, then there is a nice connection between their inner product and the angle $\theta$ between the two line segments from the origin to the points identified with $\boldsymbol u$ and $\boldsymbol v$. The formula is
    $$\boldsymbol u\cdot\boldsymbol v = \left\|\boldsymbol u\right\|\left\|\boldsymbol v\right\|\cos\theta$$
  • To verify this formula for vectors in $\mathbb R^2$, consider the triangle shown in Figure 9, with sides of lengths $\left\|\boldsymbol u\right\|$, $\left\|\boldsymbol v\right\|$, and $\left\|\boldsymbol u - \boldsymbol v\right\|$.
    [Figure 9: the triangle determined by $\boldsymbol u$ and $\boldsymbol v$, with the angle $\theta$ at the origin]
    By the law of cosines,
    $$\left\|\boldsymbol u - \boldsymbol v\right\|^2 = \left\|\boldsymbol u\right\|^2 + \left\|\boldsymbol v\right\|^2 - 2\left\|\boldsymbol u\right\|\left\|\boldsymbol v\right\|\cos\theta$$
    which can be rearranged to produce
    $$\left\|\boldsymbol u\right\|\left\|\boldsymbol v\right\|\cos\theta = \frac{1}{2}\left[\left\|\boldsymbol u\right\|^2 + \left\|\boldsymbol v\right\|^2 - \left\|\boldsymbol u - \boldsymbol v\right\|^2\right] = \frac{1}{2}\left[u_1^2 + u_2^2 + v_1^2 + v_2^2 - (u_1 - v_1)^2 - (u_2 - v_2)^2\right] = u_1v_1 + u_2v_2 = \boldsymbol u\cdot\boldsymbol v$$
    The verification for $\mathbb R^3$ is similar. When $n > 3$, this formula may be used to define the angle between two vectors in $\mathbb R^n$.
  • In statistics, for instance, the value of $\cos\theta$ for suitable vectors $\boldsymbol u$ and $\boldsymbol v$ is what statisticians call a correlation coefficient (see the sketch below).
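A small NumPy sketch with made-up data: the angle computed from the formula above, and a check that for mean-centered vectors $\cos\theta$ coincides with the Pearson correlation coefficient returned by `np.corrcoef`.

```python
import numpy as np

u = np.array([1.0, 3.0, 2.0, 5.0])
v = np.array([2.0, 2.0, 4.0, 6.0])

# Angle between u and v from u . v = ||u|| ||v|| cos(theta).
cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
print(np.degrees(np.arccos(cos_theta)))        # angle in degrees

# For mean-centered data, cos(theta) equals the Pearson correlation coefficient.
uc, vc = u - u.mean(), v - v.mean()
cos_centered = np.dot(uc, vc) / (np.linalg.norm(uc) * np.linalg.norm(vc))
print(np.isclose(cos_centered, np.corrcoef(u, v)[0, 1]))   # True
```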