Derivation of the Online Incremental Algorithm for First- and Second-Order Statistics

Statistics such as the mean, variance, standard deviation, and covariance are used constantly in statistical machine learning. Recomputing them from scratch over large batches of data consumes a lot of memory and is very time-consuming. A commonly used online, incremental computation method updates these statistics as each new sample arrives, which effectively eliminates both the memory cost and the long running time of batch computation.

By deriving the algorithm from first principles, we can establish the relation between the current value of a statistic, $M_{\cdot,k}$, and its previous value, $M_{\cdot,k-1}$ (here $k$ indexes the $k$-th element of the sequence being summarized), which yields the online, incremental algorithm.

1. Mean

$$\bar x=\frac{1}{n}\sum_{i=1}^n x_i$$

2. Variance

$$s^2=\frac{\sum_{i=1}^n (x_i-\bar x)^2}{n-1}= \frac{1}{n(n-1)} \Big[n \sum_{i=1}^n x_i^2 -\Big(\sum_{i=1}^n x_i\Big)^2\Big]$$

Derivation of formula 1:

$$s^2=\frac{\sum (x_i-\bar x)^2}{n-1} =\frac{\sum x_i^2 -2\bar x \sum x_i + \sum \bar x^2}{n-1}$$

$$=\frac{\sum x_i^2 -2 \bar x \cdot n\bar x + n\bar x^2}{n-1} =\frac{\sum x_i^2 - n\bar x^2}{n-1}$$

$$=\frac{\sum_{i=1}^n x_i^2 - \frac{1}{n} \big(\sum_{i=1}^n x_i\big)^2}{n-1} =\frac{1}{n(n-1)} \Big[n \sum x_i^2 -\Big(\sum x_i\Big)^2\Big]$$
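
As a quick numerical sanity check of this identity, the sketch below compares the two-pass definition with the single-pass form (the data values are arbitrary illustrations):

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
n = len(x)

# Definition: sum of squared deviations from the mean, divided by n - 1.
s2_def = np.sum((x - x.mean()) ** 2) / (n - 1)

# Single-pass form: only sum(x) and sum(x^2) are needed.
s2_onepass = (n * np.sum(x ** 2) - np.sum(x) ** 2) / (n * (n - 1))

assert np.isclose(s2_def, s2_onepass)
print(s2_def, np.var(x, ddof=1))  # both ≈ 4.5714; NumPy's sample variance agrees
```

Note that the single-pass form, while algebraically equivalent, can suffer from catastrophic cancellation when the mean is large relative to the spread of the data; this is one motivation for the incremental update derived later.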

3. Standard deviation

$$s=\sqrt{\frac{\sum_{i=1}^n (x_i-\bar x)^2}{n-1}}$$

4. Covariance

$$\mathrm{cov}(x,y)=\frac{\sum_{i=1}^n (x_i-\bar x)(y_i-\bar y)}{n-1}$$

Derivation of formula 2:

$$(n-1)\,\mathrm{cov}(x,y)=\sum(x_iy_i - \bar xy_i - x_i\bar y + \bar x \bar y) =\sum x_iy_i - \sum \bar xy_i - \sum x_i\bar y + \sum \bar x \bar y$$

$$=\sum x_iy_i - n\bar x \bar y - n \bar x \bar y + n \bar x \bar y =\sum x_iy_i - n\bar x \bar y$$

$$=\sum(x_iy_i - \bar x y_i) =\sum_{i=1}^n y_i(x_i- \bar x)$$

Therefore, $\mathrm{cov}(x,y) = \frac{1}{n-1}\sum_{i=1}^n x_i(y_i- \bar y) = \frac{1}{n-1}\sum_{i=1}^n y_i(x_i- \bar x)$.
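
The equivalent forms of the sample covariance can be checked the same way; the snippet below is only an illustrative sketch:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 6.0])
n = len(x)

cov_def = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)   # definition
cov_form1 = np.sum(x * (y - y.mean())) / (n - 1)              # first derived form
cov_form2 = np.sum(y * (x - x.mean())) / (n - 1)              # second derived form

assert np.isclose(cov_def, cov_form1) and np.isclose(cov_def, cov_form2)
print(cov_def, np.cov(x, y)[0, 1])  # agrees with NumPy's sample covariance
```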

$M_1$ denotes the first-order cumulative statistic (the mean of the first $k$ elements): $M_1 = \frac{1}{k}\sum_{i=1}^k x_i$

$M_2$ denotes the second-order cumulative statistic (the sum of squared deviations from that mean): $M_2 = \sum_{i=1}^k (x_i - \bar x)^2$

5. Online incremental algorithm for first- and second-order statistics: conclusion

$$M_{1,k} = M_{1,k-1} + (x_k - M_{1,k-1})/k$$

$$M_{2,k} = M_{2,k-1} + (x_k - M_{1,k-1})(x_k - M_{1,k}) = M_{2,k-1} + \Big(1-\frac{1}{k}\Big)(x_k-M_{1,k-1})^2$$

Initial conditions: $M_{1,1} = x_1$, $M_{2,1} = 0$.
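
A minimal sketch of these recurrences in Python (class and method names here are my own, not from the original post):

```python
import math


class RunningStats:
    """Single-pass mean and variance via the M1/M2 recurrences above."""

    def __init__(self):
        self.k = 0      # number of samples seen so far
        self.m1 = 0.0   # M_{1,k}: running mean
        self.m2 = 0.0   # M_{2,k}: running sum of squared deviations

    def push(self, x: float) -> None:
        self.k += 1
        delta = x - self.m1               # x_k - M_{1,k-1}
        self.m1 += delta / self.k         # M_{1,k} = M_{1,k-1} + (x_k - M_{1,k-1}) / k
        self.m2 += delta * (x - self.m1)  # M_{2,k} = M_{2,k-1} + (x_k - M_{1,k-1})(x_k - M_{1,k})

    def mean(self) -> float:
        return self.m1

    def variance(self) -> float:
        # Sample variance s^2 = M_{2,k} / (k - 1).
        return self.m2 / (self.k - 1) if self.k > 1 else 0.0

    def std(self) -> float:
        return math.sqrt(self.variance())


rs = RunningStats()
for value in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    rs.push(value)
print(rs.mean(), rs.variance())  # 5.0 and ≈ 4.5714
```

Pushing the first value reproduces the initial conditions automatically: after one sample, $M_{1,1} = x_1$ and $M_{2,1} = 0$.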

Derivation 1:

$$M_{1,k} = \frac{1}{k}\sum_{i=1}^k x_i=\frac{1}{k} \Big(\sum_{i=1}^{k-1} x_i + x_k\Big) = \frac{1}{k}\Big(\frac{k-1}{k-1}\sum_{i=1}^{k-1} x_i + x_k\Big)$$

$$=\frac{1}{k}\big((k-1)M_{1,k-1}+x_k\big) =\frac{k-1}{k}M_{1,k-1}+ \frac{1}{k}x_k$$

$$=M_{1,k-1} - \frac{1}{k}M_{1,k-1}+ \frac{1}{k}x_k=M_{1,k-1} + (x_k - M_{1,k-1})/k$$

Two auxiliary identities that will be needed in the next derivation:

$$x_k-M_{1,k-1} = k(M_{1,k} - M_{1,k-1})$$

$$M_{1,k-1}=\Big(M_{1,k}-\frac{1}{k}x_k\Big)\frac{1}{1-\frac{1}{k}}$$

Derivation 2:

$$M_{2,k} = \sum_{i=1}^k (x_i - M_{1,k})^2=\sum_{i=1}^k\big(x_i-M_{1,k-1} - (x_k - M_{1,k-1})/k\big)^2$$

$$= \sum_{i=1}^k\Big[(x_i-M_{1,k-1})^2+\frac{1}{k^2}(x_k-M_{1,k-1})^2-\frac{2}{k}(x_i-M_{1,k-1})(x_k-M_{1,k-1})\Big]$$

$$= \sum_{i=1}^k(x_i-M_{1,k-1})^2+ \sum_{i=1}^k\frac{1}{k^2}(x_k-M_{1,k-1})^2 - \frac{2}{k} \sum_{i=1}^k(x_i-M_{1,k-1})(x_k-M_{1,k-1})$$

$$= \sum_{i=1}^{k-1}(x_i-M_{1,k-1})^2 +(x_k-M_{1,k-1})^2+ \frac{1}{k}(x_k-M_{1,k-1})^2 - \frac{2}{k}(kM_{1,k} - k M_{1,k-1})(x_k-M_{1,k-1})$$

$$= M_{2,k-1} + (x_k-M_{1,k-1})^2 + \frac{1}{k}(x_k-M_{1,k-1})^2 - 2(M_{1,k} - M_{1,k-1})(x_k-M_{1,k-1})$$

$$= M_{2,k-1} + (x_k-M_{1,k-1})^2 + \frac{1}{k}(x_k-M_{1,k-1})^2 - \frac{2}{k}(x_k - M_{1,k-1})(x_k-M_{1,k-1})$$

$$= M_{2,k-1} + \Big(1-\frac{1}{k}\Big)(x_k-M_{1,k-1})^2$$

$$= M_{2,k-1} + (x_k-M_{1,k-1})\Big(1-\frac{1}{k}\Big)(x_k-M_{1,k-1})$$

$$= M_{2,k-1} + (x_k-M_{1,k-1})\Big[x_k - \frac{1}{k} x_k - \Big(1-\frac{1}{k}\Big)M_{1,k-1}\Big]$$

$$= M_{2,k-1} + (x_k-M_{1,k-1})\Big[x_k - \frac{1}{k} x_k - \Big(M_{1,k}-\frac{1}{k}x_k\Big)\Big]$$

$$= M_{2,k-1} + (x_k-M_{1,k-1})(x_k - M_{1,k})$$
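
To confirm the result numerically, the two recurrences can be run over random data and compared with the batch definitions; the sketch below assumes NumPy is available:

```python
import random

import numpy as np

random.seed(0)
xs = [random.gauss(0.0, 3.0) for _ in range(1000)]

m1, m2 = xs[0], 0.0           # initial conditions: M_{1,1} = x_1, M_{2,1} = 0
for k, x in enumerate(xs[1:], start=2):
    delta = x - m1            # x_k - M_{1,k-1}
    m1 += delta / k           # M_{1,k}
    m2 += delta * (x - m1)    # M_{2,k}

arr = np.array(xs)
assert np.isclose(m1, arr.mean())                       # M_{1,n} equals the batch mean
assert np.isclose(m2, np.sum((arr - arr.mean()) ** 2))  # M_{2,n} equals the sum of squared deviations
assert np.isclose(m2 / (len(xs) - 1), np.var(arr, ddof=1))
```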
