Nabla Operator

This is a personal note reflecting my own understanding.

Notations

Index notation is used in places.

Scalars are not bold ($a, b, c\cdots$), while vectors are bold ($\boldsymbol{a}, \boldsymbol{b}, \boldsymbol{c}\cdots$).

Sometimes for convenience,
$$\partial_x = \frac{\partial}{\partial x} \qquad \partial_y = \frac{\partial}{\partial y} \qquad \partial_z = \frac{\partial}{\partial z}$$

Inner product of row vectors: $\boldsymbol{a}\cdot\boldsymbol{b} = \boldsymbol{a}\boldsymbol{b}^T$

Definitions

Operators and vector functions can be expressed as row vectors.
$$\nabla = \Big(\frac{\partial}{\partial x}, \frac{\partial}{\partial y}, \frac{\partial}{\partial z}\Big)$$

The gradient of a scalar function is a vector:
$$\text{grad}\, f = \nabla f = \Big(\frac{\partial}{\partial x}, \frac{\partial}{\partial y}, \frac{\partial}{\partial z}\Big)f = \Big(\frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}, \frac{\partial f}{\partial z}\Big)$$
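As a quick sanity check with an arbitrary example (the specific $f$ below is illustrative, not from the original note), take $f(x, y, z) = x^2 y + z^3$; then
$$\nabla f = \big(2xy,\; x^2,\; 3z^2\big)$$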

The divergence of a vector function is a scalar:
$$\text{div}\,\boldsymbol{F} = \nabla\cdot\boldsymbol{F} = \frac{\partial F_i}{\partial x_i} = \frac{\partial F_x}{\partial x} + \frac{\partial F_y}{\partial y} + \frac{\partial F_z}{\partial z}$$
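With an arbitrary illustrative field $\boldsymbol{F} = (x^2, xy, yz)$ (my example, used again below), the divergence works out to
$$\nabla\cdot\boldsymbol{F} = \partial_x(x^2) + \partial_y(xy) + \partial_z(yz) = 2x + x + y = 3x + y$$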

The curl of a vector function is a vector:
$$\text{curl}\,\boldsymbol{F} = \nabla\times\boldsymbol{F} = \det\left|\begin{matrix}\hat{e}_1 & \hat{e}_2 & \hat{e}_3 \\ \partial_x & \partial_y & \partial_z \\ F_x & F_y & F_z \end{matrix}\right|$$
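Expanding the determinant along the first row gives the familiar component form
$$\nabla\times\boldsymbol{F} = \big(\partial_y F_z - \partial_z F_y,\; \partial_z F_x - \partial_x F_z,\; \partial_x F_y - \partial_y F_x\big)$$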

The Jacobian matrix of a vector function is a $3\times 3$ matrix:
$$\boldsymbol{J}_{\boldsymbol{F}} = D\boldsymbol{F} = \nabla\boldsymbol{F} = \left(\begin{matrix}\partial_x F_x & \partial_y F_x & \partial_z F_x \\ \partial_x F_y & \partial_y F_y & \partial_z F_y \\ \partial_x F_z & \partial_y F_z & \partial_z F_z \end{matrix}\right)$$
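For the same illustrative field $\boldsymbol{F} = (x^2, xy, yz)$, the Jacobian reads
$$\nabla\boldsymbol{F} = \left(\begin{matrix} 2x & 0 & 0 \\ y & x & 0 \\ 0 & z & y \end{matrix}\right)$$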

Derivative

Infinitesimal changes form the column vector $d\boldsymbol{r} = (dx, dy, dz)^T$.

For a scalar function $f$, the total derivative is
$$df = (\nabla f)\,d\boldsymbol{r} = \Big(\frac{\partial f}{\partial x}, \frac{\partial f}{\partial y}, \frac{\partial f}{\partial z}\Big)\left[\begin{matrix}dx \\ dy \\ dz\end{matrix}\right]$$
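Continuing the illustrative example $f = x^2 y + z^3$ from above, this gives
$$df = 2xy\,dx + x^2\,dy + 3z^2\,dz$$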

For a vector function $\boldsymbol{F}$,
$$d\boldsymbol{F} = (\nabla \boldsymbol{F})\,d\boldsymbol{r} = \left(\begin{matrix}\partial_x F_x & \partial_y F_x & \partial_z F_x \\ \partial_x F_y & \partial_y F_y & \partial_z F_y \\ \partial_x F_z & \partial_y F_z & \partial_z F_z \end{matrix}\right)\left[\begin{matrix}dx \\ dy \\ dz\end{matrix}\right]$$
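Likewise, for the illustrative $\boldsymbol{F} = (x^2, xy, yz)$, multiplying its Jacobian by $d\boldsymbol{r}$ gives
$$d\boldsymbol{F} = \left(\begin{matrix} 2x\,dx \\ y\,dx + x\,dy \\ z\,dy + y\,dz \end{matrix}\right)$$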

This can be applied to the gradient of a dot product:
$$\nabla(\boldsymbol{A}\cdot\boldsymbol{B}) = \boldsymbol{A}\cdot(\nabla\boldsymbol{B}) + \boldsymbol{B}\cdot(\nabla\boldsymbol{A})$$
Here $\boldsymbol{A}\cdot(\nabla\boldsymbol{B})$ means the row vector $\boldsymbol{A}$ multiplied by the Jacobian matrix $\nabla\boldsymbol{B}$, i.e. $A_i\,\partial_j B_i$ in index notation.
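As a symbolic check of this identity under the row-vector convention above, here is a minimal SymPy sketch; the fields $\boldsymbol{A}$ and $\boldsymbol{B}$ below are arbitrary assumed examples, not from the note.

```python
# Minimal SymPy sketch checking  grad(A·B) = A·(∇B) + B·(∇A)
# under the convention that "·" on the right is a row vector times a Jacobian.
# The fields A and B are arbitrary examples chosen for illustration.
import sympy as sp

x, y, z = sp.symbols('x y z')
r = sp.Matrix([x, y, z])

# Arbitrary smooth vector fields, stored as 3x1 column matrices.
A = sp.Matrix([x*y, y*z, z*x])
B = sp.Matrix([x**2, sp.sin(y), x + z])

# Jacobian matrices with entries J[i, j] = dF_i/dx_j.
JA = A.jacobian(r)
JB = B.jacobian(r)

# Left-hand side: gradient of the scalar A·B, as a 1x3 row vector.
lhs = sp.Matrix([[sp.diff(A.dot(B), v) for v in (x, y, z)]])

# Right-hand side: A^T J_B + B^T J_A, each term a 1x3 row vector.
rhs = A.T * JB + B.T * JA

print(sp.simplify(lhs - rhs))   # expected output: Matrix([[0, 0, 0]])
```

Running it should print a zero $1\times 3$ matrix; the row-vector-times-Jacobian products on the right reproduce $A_i\,\partial_j B_i$ and $B_i\,\partial_j A_i$.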
