Linear Algebra - 1 - Fundamentals

A refresher on the fundamentals of linear algebra.

Definitions

  • Vector:

Vectors are column vectors by default:

$$\overrightarrow{\mathbf{x}}=\left(x_{1}, x_{2}, \cdots, x_{n}\right)^{T}=\left[\begin{array}{c} x_{1} \\ x_{2} \\ \vdots \\ x_{n} \end{array}\right]$$

  • Matrix $\mathbf{X} \in \mathbb{R}^{m \times n}$, written as:

$$\mathbf{X}=\left[\begin{array}{cccc} x_{1,1} & x_{1,2} & \cdots & x_{1, n} \\ x_{2,1} & x_{2,2} & \cdots & x_{2, n} \\ \vdots & \vdots & \ddots & \vdots \\ x_{m, 1} & x_{m, 2} & \cdots & x_{m, n} \end{array}\right]$$

Norms

Vector norms
1-norm

The sum of the absolute values of the elements:

$$\|X\|_{1}=\sum_{i=1}^{n}\left|x_{i}\right|$$

2-norm

The square root of the sum of the squared elements:

$$\|X\|_{2}=\left(\sum_{i=1}^{n} x_{i}^{2}\right)^{\frac{1}{2}}=\sqrt{\sum_{i=1}^{n} x_{i}^{2}}$$

p-norm

$$\|X\|_{p}=\left(\sum_{i=1}^{n}\left|x_{i}\right|^{p}\right)^{\frac{1}{p}}$$

  • Here $p \geq 1$, and $\lim_{p \rightarrow \infty}\|X\|_{p}=\max_{1 \leq i \leq n}\left|x_{i}\right|$.
Infinity norm

$$\|X\|_{\infty}=\max_{1 \leq i \leq n}\left|x_{i}\right|$$

The largest absolute value among the elements of the vector.
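As a quick sanity check, the vector norms above can be computed with NumPy (assumed available here); `np.linalg.norm` covers the 1-, 2-, p-, and infinity norms, and the example vector is arbitrary:

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])

l1 = np.linalg.norm(x, ord=1)         # sum of absolute values: 7.0
l2 = np.linalg.norm(x, ord=2)         # sqrt of sum of squares: 5.0
linf = np.linalg.norm(x, ord=np.inf)  # largest absolute entry: 4.0

# p-norm for p = 3, computed directly from the definition
p = 3
lp = np.sum(np.abs(x) ** p) ** (1.0 / p)
```

As p grows, `lp` decreases toward `linf`, matching the limit stated above.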

Matrix norms
1-norm (column norm)

Sum the absolute values of the elements in each column, then take the maximum over the columns (maximum column sum):

$$\|A\|_{1}=\max_{1 \leq j \leq n} \sum_{i=1}^{m}\left|a_{i j}\right|$$

2-norm (spectral norm):

The square root of the largest eigenvalue of $A^{T} A$:

$$\|A\|_{2}=\sqrt{\lambda_{\max}\left(A^{T} A\right)}=\sqrt{\max_{1 \leq i \leq n}\left|\lambda_{i}\right|}$$

where $\lambda_{i}$ are the eigenvalues of $A^{T} A$.

Infinity norm (row norm)

Sum the absolute values of the elements in each row, then take the maximum over the rows (maximum row sum):

$$\|A\|_{\infty}=\max_{1 \leq i \leq m} \sum_{j=1}^{n}\left|a_{i j}\right|$$

L0 norm

The number of nonzero elements of the matrix. It is commonly used to measure sparsity: the smaller the L0 norm, the more zero elements, and the sparser the matrix. (Despite the name, it is not a true norm.)

L1 norm:

The sum of the absolute values of all elements of the matrix. It is the tightest convex approximation of the L0 norm, so it can also be used as a proxy for sparsity.

F-norm:

The square root of the sum of the squared elements of the matrix, often also called the matrix L2 norm. Its advantage is that it is a convex, differentiable function, making it easy to optimize and compute:

$$\|\mathbf{A}\|_{F}=\sqrt{\sum_{i, j} a_{i, j}^{2}}$$
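The matrix norms in this section can likewise be sketched with NumPy (an illustration with an arbitrary 2×2 matrix, not part of the original text):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

n1 = np.linalg.norm(A, ord=1)         # max column sum: max(4, 6) = 6
ninf = np.linalg.norm(A, ord=np.inf)  # max row sum: max(3, 7) = 7
nf = np.linalg.norm(A, ord='fro')     # sqrt(1 + 4 + 9 + 16) = sqrt(30)

# spectral norm: sqrt of the largest eigenvalue of A^T A
n2 = np.linalg.norm(A, ord=2)
n2_check = np.sqrt(np.max(np.linalg.eigvalsh(A.T @ A)))

# "L0 norm": count of nonzero entries (a sparsity measure, not a true norm)
l0 = np.count_nonzero(A)
```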

Determinant

  • The determinant of a square matrix $A$ is written $\det(A)$ or $|A|$:

$$D=\left|\begin{array}{llll} a_{11} & a_{12} & \cdots & a_{1 n} \\ a_{21} & a_{22} & \cdots & a_{2 n} \\ \cdots & \cdots & \cdots & \cdots \\ a_{n 1} & a_{n 2} & \cdots & a_{n n} \end{array}\right|$$

  • Computation formula (Leibniz formula):

$$D=\sum(-1)^{k} a_{1 k_{1}} a_{2 k_{2}} \cdots a_{n k_{n}}$$

where the sum runs over all permutations $(k_{1}, k_{2}, \cdots, k_{n})$ of $(1, 2, \cdots, n)$, and $k$ is the number of inversions in the permutation.

  • The determinant of $n$ $n$-dimensional vectors is the signed volume of the $n$-dimensional parallelepiped they span; if the vectors are linearly dependent, the determinant is 0
  • Multiplying a single row (or column) of a determinant $A$ by a number $k$ multiplies the determinant by $k$, giving $kA$
  • A determinant $A$ equals its transpose $A^T$ (the $i$-th row of $A^T$ is the $i$-th column of $A$)
  • Swapping two rows (or columns) of a determinant $A$ changes its sign, giving $-A$
  • Adding a multiple of one row (or column) of a determinant $A$ to another row (or column) leaves the value $A$ unchanged
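The determinant properties listed above are easy to verify numerically; a small NumPy sketch (the 2×2 example values are my own):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
d = np.linalg.det(A)  # 2*3 - 1*5 = 1

# det(A) = det(A^T)
assert np.isclose(np.linalg.det(A.T), d)

# multiplying one row by k multiplies the determinant by k
B = A.copy(); B[0] *= 4.0
assert np.isclose(np.linalg.det(B), 4.0 * d)

# swapping two rows flips the sign
C = A[[1, 0], :]
assert np.isclose(np.linalg.det(C), -d)

# adding a multiple of one row to another leaves det unchanged
E = A.copy(); E[1] += 2.0 * E[0]
assert np.isclose(np.linalg.det(E), d)
```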

Trace of a square matrix

  • The trace of a square matrix $\mathbf{A}=\left(a_{i, j}\right)_{n \times n}$, written $\operatorname{tr}(\mathbf{A})$, is the sum of the diagonal elements, which also equals the sum of the eigenvalues:

$$\operatorname{tr}(\mathbf{A})=\sum_{i} a_{i, i}$$
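A quick numerical check of both characterizations of the trace (the example matrix is arbitrary; its eigenvalues are its diagonal entries since it is triangular):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

t = np.trace(A)                         # diagonal sum: 2 + 3 = 5
eig_sum = np.sum(np.linalg.eigvals(A))  # sum of eigenvalues, also 5
```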

Vector products

Dot product

The sum of the products of corresponding elements; the result is not a vector but a scalar:

$$\mathbf{A} \cdot \mathbf{B}=\sum_{i} a_{i} b_{i}=|A||B| \cos(\theta)$$
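Both sides of the identity can be checked with NumPy (the example vectors are arbitrary):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([2.0, 0.0, 1.0])

dot = np.dot(a, b)  # 1*2 + 2*0 + 2*1 = 4.0

# recover the angle from A·B = |A||B|cos(theta)
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))
theta = np.arccos(cos_theta)
```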

Cross product
Cross product of three-dimensional vectors:

Expressed as a third-order determinant. Let

$$\overrightarrow{\mathbf{u}}=u_{x} \overrightarrow{\mathbf{i}}+u_{y} \overrightarrow{\mathbf{j}}+u_{z} \overrightarrow{\mathbf{k}}, \quad \overrightarrow{\mathbf{v}}=v_{x} \overrightarrow{\mathbf{i}}+v_{y} \overrightarrow{\mathbf{j}}+v_{z} \overrightarrow{\mathbf{k}}$$

where $\overrightarrow{\mathbf{i}}, \overrightarrow{\mathbf{j}}, \overrightarrow{\mathbf{k}}$ are the unit vectors along the $x, y, z$ axes. Then:

$$\overrightarrow{\mathbf{w}}=\overrightarrow{\mathbf{u}} \times \overrightarrow{\mathbf{v}}=\left|\begin{array}{ccc} \overrightarrow{\mathbf{i}} & \overrightarrow{\mathbf{j}} & \overrightarrow{\mathbf{k}} \\ u_{x} & u_{y} & u_{z} \\ v_{x} & v_{y} & v_{z} \end{array}\right|$$

  • The cross product of $\overrightarrow{\mathbf{u}}, \overrightarrow{\mathbf{v}}$ is perpendicular to the plane spanned by $\overrightarrow{\mathbf{u}}, \overrightarrow{\mathbf{v}}$, with its direction given by the right-hand rule
  • The magnitude of the cross product equals the area of the parallelogram spanned by $\overrightarrow{\mathbf{u}}, \overrightarrow{\mathbf{v}}$
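Both bullet points can be verified with `np.cross` (a minimal sketch with hand-picked vectors):

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 1.0, 0.0])

w = np.cross(u, v)  # [0, 0, 1]

# w is perpendicular to both u and v
assert np.isclose(np.dot(w, u), 0.0)
assert np.isclose(np.dot(w, v), 0.0)

# |w| is the area of the parallelogram spanned by u and v
area = np.linalg.norm(w)  # 1.0
```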
Dyadic (outer) product of vectors

Given two vectors $\overrightarrow{\mathbf{x}}=\left(x_{1}, x_{2}, \cdots, x_{n}\right)^{T}, \overrightarrow{\mathbf{y}}=\left(y_{1}, y_{2}, \cdots, y_{m}\right)^{T}$, their dyadic product is written:

$$\overrightarrow{\mathbf{x}} \overrightarrow{\mathbf{y}}=\left[\begin{array}{cccc} x_{1} y_{1} & x_{1} y_{2} & \cdots & x_{1} y_{m} \\ x_{2} y_{1} & x_{2} y_{2} & \cdots & x_{2} y_{m} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n} y_{1} & x_{n} y_{2} & \cdots & x_{n} y_{m} \end{array}\right]$$

It is also written $\overrightarrow{\mathbf{x}} \otimes \overrightarrow{\mathbf{y}}$ or $\overrightarrow{\mathbf{x}} \overrightarrow{\mathbf{y}}^{T}$.
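In NumPy the dyadic product is `np.outer`, which matches the column-times-row form $\overrightarrow{\mathbf{x}} \overrightarrow{\mathbf{y}}^{T}$ (example vectors are arbitrary):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])  # n = 3
y = np.array([4.0, 5.0])       # m = 2

M = np.outer(x, y)                        # n-by-m matrix with entries x_i * y_j
M2 = x.reshape(-1, 1) @ y.reshape(1, -1)  # same result: column times row
```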

Matrix operations

Given two matrices $\mathbf{A}=\left(a_{i, j}\right) \in \mathbb{R}^{m \times n}, \mathbf{B}=\left(b_{i, j}\right) \in \mathbb{R}^{m \times n}$, define:

Hadamard product (also called the element-wise product)

$$\mathbf{A} \circ \mathbf{B}=\left[\begin{array}{cccc} a_{1,1} b_{1,1} & a_{1,2} b_{1,2} & \cdots & a_{1, n} b_{1, n} \\ a_{2,1} b_{2,1} & a_{2,2} b_{2,2} & \cdots & a_{2, n} b_{2, n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m, 1} b_{m, 1} & a_{m, 2} b_{m, 2} & \cdots & a_{m, n} b_{m, n} \end{array}\right]$$

Kronecker product

$$\mathbf{A} \otimes \mathbf{B}=\left[\begin{array}{cccc} a_{1,1} \mathbf{B} & a_{1,2} \mathbf{B} & \cdots & a_{1, n} \mathbf{B} \\ a_{2,1} \mathbf{B} & a_{2,2} \mathbf{B} & \cdots & a_{2, n} \mathbf{B} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m, 1} \mathbf{B} & a_{m, 2} \mathbf{B} & \cdots & a_{m, n} \mathbf{B} \end{array}\right]$$
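In NumPy the Hadamard product is just `*` and the Kronecker product is `np.kron` (illustrative matrices of my choosing):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

# Hadamard (element-wise) product: same shape as A and B
H = A * B          # [[0, 2], [3, 0]]

# Kronecker product: each a_ij is replaced by the block a_ij * B
K = np.kron(A, B)  # shape (4, 4)
```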

Partial derivatives

  • Partial derivative of a scalar with respect to a scalar: $\frac{\partial u}{\partial v}$
  • Partial derivative of a scalar with respect to a vector ($n$-dimensional): $\frac{\partial u}{\partial \overrightarrow{\mathbf{v}}}=\left(\frac{\partial u}{\partial v_{1}}, \frac{\partial u}{\partial v_{2}}, \cdots, \frac{\partial u}{\partial v_{n}}\right)^{T}$
  • Partial derivative of a scalar with respect to a matrix ($m \times n$):

$$\frac{\partial u}{\partial \mathbf{V}}=\left[\begin{array}{cccc} \frac{\partial u}{\partial V_{1,1}} & \frac{\partial u}{\partial V_{1,2}} & \cdots & \frac{\partial u}{\partial V_{1, n}} \\ \frac{\partial u}{\partial V_{2,1}} & \frac{\partial u}{\partial V_{2,2}} & \cdots & \frac{\partial u}{\partial V_{2, n}} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial u}{\partial V_{m, 1}} & \frac{\partial u}{\partial V_{m, 2}} & \cdots & \frac{\partial u}{\partial V_{m, n}} \end{array}\right]$$

  • Partial derivative of a vector ($m$-dimensional) with respect to a scalar: $\frac{\partial \overrightarrow{\mathbf{u}}}{\partial v}=\left(\frac{\partial u_{1}}{\partial v}, \frac{\partial u_{2}}{\partial v}, \cdots, \frac{\partial u_{m}}{\partial v}\right)^{T}$
  • Partial derivative of a vector ($m$-dimensional) with respect to a vector ($n$-dimensional): the Jacobian matrix, in row-major layout; the column-major layout is the transpose of this matrix.

$$\frac{\partial \overrightarrow{\mathbf{u}}}{\partial \overrightarrow{\mathbf{v}}}=\left[\begin{array}{cccc} \frac{\partial u_{1}}{\partial v_{1}} & \frac{\partial u_{1}}{\partial v_{2}} & \cdots & \frac{\partial u_{1}}{\partial v_{n}} \\ \frac{\partial u_{2}}{\partial v_{1}} & \frac{\partial u_{2}}{\partial v_{2}} & \cdots & \frac{\partial u_{2}}{\partial v_{n}} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial u_{m}}{\partial v_{1}} & \frac{\partial u_{m}}{\partial v_{2}} & \cdots & \frac{\partial u_{m}}{\partial v_{n}} \end{array}\right]$$

  • Partial derivative of a matrix ($m \times n$) with respect to a scalar:

$$\frac{\partial \mathbf{U}}{\partial v}=\left[\begin{array}{cccc} \frac{\partial U_{1,1}}{\partial v} & \frac{\partial U_{1,2}}{\partial v} & \cdots & \frac{\partial U_{1, n}}{\partial v} \\ \frac{\partial U_{2,1}}{\partial v} & \frac{\partial U_{2,2}}{\partial v} & \cdots & \frac{\partial U_{2, n}}{\partial v} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial U_{m, 1}}{\partial v} & \frac{\partial U_{m, 2}}{\partial v} & \cdots & \frac{\partial U_{m, n}}{\partial v} \end{array}\right]$$
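The vector-by-vector case (the Jacobian) can be approximated by forward differences; the `jacobian` helper below is a hypothetical sketch of my own, not a standard library routine:

```python
import numpy as np

def jacobian(f, v, eps=1e-6):
    """Approximate the row-major Jacobian du/dv numerically:
    entry (i, j) is the partial derivative of u_i with respect to v_j."""
    v = np.asarray(v, dtype=float)
    u0 = np.asarray(f(v))
    J = np.zeros((u0.size, v.size))
    for j in range(v.size):
        dv = np.zeros_like(v)
        dv[j] = eps
        J[:, j] = (np.asarray(f(v + dv)) - u0) / eps
    return J

# example: f(v) = (v1*v2, v1 + v2) has Jacobian [[v2, v1], [1, 1]]
f = lambda v: np.array([v[0] * v[1], v[0] + v[1]])
J = jacobian(f, np.array([2.0, 3.0]))  # approximately [[3, 2], [1, 1]]
```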

References

  • http://www.huaxiaozhuan.com/%E6%95%B0%E5%AD%A6%E5%9F%BA%E7%A1%80/chapters/1_algebra.html

  • https://blog.csdn.net/zaishuiyifangxym/article/details/81673491
