Definition
With $m > 0$:
Markov’s inequality
$$X \geqslant 0,\quad \Pr(X > m) \leqslant \frac{\mathbb{E}(X)}{m}$$
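For completeness, a one-line proof sketch (the indicator notation $\mathbf{1}_{\{X > m\}}$ is introduced here and is not part of the statement above): since $X \geqslant 0$,

$$\mathbb{E}(X) \;\geqslant\; \mathbb{E}\!\left(X\,\mathbf{1}_{\{X > m\}}\right) \;\geqslant\; m\,\Pr(X > m),$$

and dividing by $m > 0$ gives the stated bound.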
Chebyshev’s inequality
$$X \in \mathbb{R},\quad \Pr(|X-\mathbb{E}(X)| > m) \leqslant \frac{\mathrm{Var}(X)}{m^2}$$
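One way to see Chebyshev's inequality as a corollary of Markov's inequality is to apply Markov's inequality to the nonnegative variable $(X-\mathbb{E}(X))^2$ at the threshold $m^2$ (this is exactly the substitution $\varphi(x)=x^2$ discussed in the Analysis section below):

$$\Pr(|X-\mathbb{E}(X)| > m) = \Pr\!\left((X-\mathbb{E}(X))^2 > m^2\right) \leqslant \frac{\mathbb{E}\!\left((X-\mathbb{E}(X))^2\right)}{m^2} = \frac{\mathrm{Var}(X)}{m^2}.$$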
one-sided Chebyshev's inequality
$$X \in \mathbb{R},\quad \Pr((X-\mathbb{E}(X)) > m) \leqslant \frac{\mathrm{Var}(X)}{\mathrm{Var}(X)+m^2}$$
$$X \in \mathbb{R},\quad \Pr((X-\mathbb{E}(X)) < -m) \leqslant \frac{\mathrm{Var}(X)}{\mathrm{Var}(X)+m^2}$$
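A derivation sketch for the upper-tail case (the lower-tail case follows by applying the same argument to $-X$); the shorthand $Y = X - \mathbb{E}(X)$ and the auxiliary shift $u \geqslant 0$ are notation introduced here. By Markov's inequality applied to $(Y+u)^2$, and using $\mathbb{E}(Y) = 0$,

$$\Pr(Y > m) \;\leqslant\; \Pr\!\left((Y+u)^2 > (m+u)^2\right) \;\leqslant\; \frac{\mathrm{Var}(X)+u^2}{(m+u)^2},$$

and choosing $u = \mathrm{Var}(X)/m$, which minimizes the right-hand side, gives the bound $\frac{\mathrm{Var}(X)}{\mathrm{Var}(X)+m^2}$.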
Analysis
Theoretically, both Chebyshev's inequality and the one-sided Chebyshev's inequality are corollaries of Markov's inequality; under certain conditions, Chebyshev's inequality and the one-sided Chebyshev's inequality can also be converted into each other.
Intuitively (and loosely speaking), "Markov's inequality is stronger than Chebyshev's inequality, which in turn is stronger than the one-sided Chebyshev's inequality", yet in actual computations the bounds tend to go the other way: the stronger the inequality, the looser its bound.
That A implies B (so A is stronger than B) while A's bound is looser than B's may look counterintuitive, but it is in fact reasonable: deriving B from A is not simply "$\Pr(X \geqslant m) \leqslant A(m) \leqslant B(m)$". Instead, one transforms $X$, squeezing the slack out of the distribution to obtain a tighter bound: "$\Pr(X \geqslant m) \leqslant A(m)$, $\Pr(\varphi(X) \geqslant \varphi(m)) \leqslant A(\varphi(m)) = B(m)$, and by choosing $\varphi$ appropriately one achieves $A(\varphi(m)) = B(m) \leqslant A(m)$".
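As a concrete (purely illustrative) check of this "stronger but looser" behaviour, the sketch below compares the three bounds for $X \sim \mathrm{Exponential}(1)$ at the threshold $X > 5$; the distribution, the threshold, and the exact tail formula are choices made here for illustration, not part of the original note.

```python
import math

# Illustrative comparison (assumption: X ~ Exponential(1), so E(X) = Var(X) = 1),
# bounding the tail Pr(X > 5) with each of the three inequalities.
mean, var = 1.0, 1.0
threshold = 5.0                 # we bound Pr(X > threshold)
m = threshold - mean            # deviation from the mean, used by the Chebyshev-type bounds

exact = math.exp(-threshold)    # exact tail probability of Exponential(1)
markov = mean / threshold       # Markov: E(X) / threshold
chebyshev = var / m**2          # Chebyshev: Var(X) / m^2 (two-sided tail, so also an upper bound here)
one_sided = var / (var + m**2)  # one-sided Chebyshev

for name, value in [("exact", exact), ("Markov", markov),
                    ("Chebyshev", chebyshev), ("one-sided Chebyshev", one_sided)]:
    print(f"{name:>20s}: {value:.4f}")
# Expected ordering: Markov (0.2000) > Chebyshev (0.0625) > one-sided (0.0588) > exact (0.0067)
```

The "strongest" inequality (Markov) yields the loosest bound here, while the "weakest" (one-sided Chebyshev) yields the tightest, matching the observation above.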
Appendix (mermaid flowchart)
%%{init: {'theme': 'base', 'themeVariables': { 'fontFamily': 'MS Serif', 'fontSize' : '12px'}}}%%
flowchart LR
subgraph corollary
direction TB
C[Chebyshev's inequality]
OC[one sided Chebyshev's inequality]
C -.->|"Var[𝒳] < 𝓂²<br/>Pr(𝒳 - 𝔼𝒳 < -𝓍) = Pr(𝒳 - 𝔼𝒳 > 𝓍)"| OC
OC -->|"Var[𝒳] > 𝓂²"| C
end
M[Markov's inequality]
M -->|"implies"| corollary