1 Random Variable
1.1 Definition:
- A random variable R on a probability space is a total function whose domain is the sample space.
- Notice that the name “random variable” is a misnomer: random variables are actually functions, not variables.
1.2 Indicator Random Variables
- An indicator random variable is a random variable that maps every outcome to either 0 or 1. Indicator random variables are also called Bernoulli variables.
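As a quick sketch (the three-coin-flip sample space here is an illustrative assumption, not from the notes), an indicator random variable is just a 0/1-valued total function on the sample space:

```python
# Sample space for three fair coin flips: each outcome is a tuple like
# ('H', 'T', 'H'). (Illustrative setup.)
outcomes = [(a, b, c) for a in 'HT' for b in 'HT' for c in 'HT']

# An indicator (Bernoulli) random variable: M = 1 iff all three flips match.
def M(outcome):
    return 1 if len(set(outcome)) == 1 else 0

# Pr[M = 1] under the uniform distribution on the 8 outcomes: 2/8 = 0.25.
p_match = sum(M(w) for w in outcomes) / len(outcomes)
```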
1.3 Random Variables and Events
- Any assertion about the values of random variables defines an event.
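For example (the two-dice sample space below is an assumption for illustration), the assertion “S = 7” about a random variable S picks out exactly the set of outcomes satisfying it, which is an event:

```python
from fractions import Fraction

# Sample space: ordered pairs of fair six-sided die rolls (illustrative).
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

# The random variable S maps each outcome to the sum of the two dice.
def S(w):
    return w[0] + w[1]

# The assertion "S = 7" defines an event: the set of outcomes where it holds.
event = {w for w in outcomes if S(w) == 7}

# Its probability under the uniform distribution on the 36 outcomes: 6/36.
pr = Fraction(len(event), len(outcomes))
```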
2 Independence
Definition:
- Random variables R1 and R2 are independent iff for all x1, x2, the two events
[R1 = x1] and [R2 = x2]
are independent.
- Two events are independent iff their indicator variables are independent.
- Random variables R1, R2, ..., Rn are mutually independent iff for all x1, x2, ..., xn, the n events
[R1 = x1], [R2 = x2], ..., [Rn = xn]
are mutually independent. They are k-way independent iff every subset of k of them is mutually independent.
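On a small finite sample space the definition can be checked directly by enumeration. A minimal sketch, assuming two fair coin flips with R1 and R2 reading off the first and second flip (an illustrative setup, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips, uniform on 4 outcomes (illustrative).
outcomes = [(a, b) for a in (0, 1) for b in (0, 1)]

def Pr(pred):
    # Probability of the event defined by the predicate, uniform distribution.
    return Fraction(sum(1 for w in outcomes if pred(w)), len(outcomes))

R1 = lambda w: w[0]  # value of the first flip
R2 = lambda w: w[1]  # value of the second flip

# R1 and R2 are independent iff for every pair (x1, x2):
#   Pr[R1 = x1 and R2 = x2] = Pr[R1 = x1] * Pr[R2 = x2].
independent = all(
    Pr(lambda w: R1(w) == x1 and R2(w) == x2)
    == Pr(lambda w: R1(w) == x1) * Pr(lambda w: R2(w) == x2)
    for x1, x2 in product((0, 1), repeat=2)
)
```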
3 Distribution Functions
Definition:
- Let R be a random variable with codomain V. The probability density function of R is the function PDFR : V → [0,1] defined by:
PDFR(x) ::= Pr[R = x] if x ∈ range(R), and 0 if x ∉ range(R)
- The cumulative distribution function is the function CDFR : R → [0,1] defined by:
CDFR(x) ::= Pr[R ≤ x]
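These definitions translate directly to code. A sketch, assuming R is the sum of two fair dice (an illustrative choice):

```python
from fractions import Fraction
from collections import Counter

# Illustrative: R = sum of two fair dice, uniform on 36 outcomes.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]
counts = Counter(a + b for a, b in outcomes)

def PDF_R(x):
    # PDF_R(x) ::= Pr[R = x] if x is in range(R), and 0 otherwise.
    return Fraction(counts.get(x, 0), len(outcomes))

def CDF_R(x):
    # CDF_R(x) ::= Pr[R <= x], the accumulated density up to x.
    return sum(PDF_R(v) for v in counts if v <= x)
```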
Common Distributions:
- Bernoulli Distributions
- Uniform Distributions
- Binomial Distributions
- general binomial density function
- unbiased binomial distribution
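The last two items above have short closed forms: the general binomial density gives the probability of exactly k successes in n independent trials with success probability p, and the unbiased case is the special case p = 1/2. A sketch:

```python
from math import comb

def binomial_pdf(n, p, k):
    # General binomial density: f_{n,p}(k) = C(n, k) * p^k * (1-p)^(n-k),
    # the probability of exactly k successes in n independent p-biased trials.
    return comb(n, k) * p**k * (1 - p)**(n - k)

def unbiased_binomial_pdf(n, k):
    # Unbiased case (p = 1/2): f_n(k) = C(n, k) / 2^n.
    return comb(n, k) / 2**n
```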