Discrete
Bernoulli distribution
- pmf
- $f_X(x) = P(X=x) = \left\{\begin{aligned}(1-p)^{1-x}p^x & \quad \text{for } x = 0 \text{ or } 1\\ 0 & \quad\text{otherwise}\end{aligned}\right.$
- expectation
- $E(X) = p$
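The Bernoulli pmf and expectation above can be verified numerically; a minimal sketch (the function name is my own):

```python
def bernoulli_pmf(x, p):
    """pmf of Bernoulli(p): (1-p)^(1-x) * p^x for x in {0, 1}, else 0."""
    if x in (0, 1):
        return (1 - p) ** (1 - x) * p ** x
    return 0.0

p = 0.3
# the two pmf values sum to 1
total = bernoulli_pmf(0, p) + bernoulli_pmf(1, p)
# E(X) = 0 * f(0) + 1 * f(1) = p
expectation = 0 * bernoulli_pmf(0, p) + 1 * bernoulli_pmf(1, p)
```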
Binomial distribution
- pmf
- $f_X(k) = P(X=k) = \left\{\begin{aligned}C_n^k p^k(1-p)^{n-k} & \quad \text{for } k = 0, 1, \dots, n\\ 0 & \quad\text{otherwise}\end{aligned}\right.$
- expectation
- $E(X) = np$
- variance
- $\operatorname{var}(X) = np(1-p)$
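Both binomial moments can be checked against the pmf directly; a minimal sketch using the standard library (function name is my own):

```python
from math import comb

def binomial_pmf(k, n, p):
    """pmf of Binomial(n, p): C(n, k) p^k (1-p)^(n-k) for k = 0..n, else 0."""
    if 0 <= k <= n:
        return comb(n, k) * p ** k * (1 - p) ** (n - k)
    return 0.0

n, p = 10, 0.4
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
# mean should come out as np = 4.0 and variance as np(1-p) = 2.4
mean = sum(k * f for k, f in enumerate(pmf))
var = sum((k - mean) ** 2 * f for k, f in enumerate(pmf))
```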
Geometric distribution
- pmf
- $f_X(k) = P(X=k) = \left\{\begin{aligned}p(1-p)^{k-1} & \quad \text{for } k = 1, 2, 3, \dots\\ 0 & \quad\text{otherwise}\end{aligned}\right.$
- expectation
- $E(X) = \frac{1}{p}$
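The expectation $1/p$ can be checked by summing $k \cdot p(1-p)^{k-1}$ over a truncated range; a minimal sketch (function name is my own):

```python
def geometric_pmf(k, p):
    """pmf of Geometric(p): p (1-p)^(k-1) for k = 1, 2, 3, ... (trial of first success)."""
    return p * (1 - p) ** (k - 1) if k >= 1 else 0.0

p = 0.25
# truncate the infinite sum; the tail beyond 1000 terms is negligible here
mean = sum(k * geometric_pmf(k, p) for k in range(1, 1001))
# mean should be close to 1/p = 4.0
```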
Negative binomial distribution
- The negative binomial distribution arises as a generalization of the geometric distribution.
- Suppose that a sequence of independent trials, each with probability of success $p$, is performed until there are $r$ successes in all.
- For the $r$-th success to occur on trial $k$, the first $k-1$ trials must contain exactly $r-1$ successes and trial $k$ must be a success, so the probability can be written as $C_{k-1}^{r-1} p^{r-1}(1-p)^{(k-1)-(r-1)} \cdot p$.
- pmf
- $f_X(k) = P(X=k) = \left\{\begin{aligned}C_{k-1}^{r-1} p^r (1-p)^{k-r} & \quad \text{for } k = r, r+1, r+2, \dots\\ 0 & \quad\text{otherwise}\end{aligned}\right.$
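The derivation above (choose $r-1$ successes among the first $k-1$ trials, then succeed on trial $k$) can be sketched numerically; function name is my own, and setting $r = 1$ should recover the geometric pmf:

```python
from math import comb

def neg_binomial_pmf(k, r, p):
    """pmf of NegBinomial(r, p): C(k-1, r-1) p^r (1-p)^(k-r) for k >= r, else 0.
    X is the trial on which the r-th success occurs."""
    if k >= r:
        return comb(k - 1, r - 1) * p ** r * (1 - p) ** (k - r)
    return 0.0

r, p = 3, 0.5
# probabilities over k = r, r+1, ... sum to 1 (up to truncation)
total = sum(neg_binomial_pmf(k, r, p) for k in range(r, 200))
# sanity check: r = 1 gives the geometric pmf p (1-p)^(k-1)
geom_check = neg_binomial_pmf(5, 1, 0.25)
```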