Fisher Information / Expected Information: Definition

What is Fisher Information?

Fisher information tells us how much information about an unknown parameter we can get from a sample. In other words, it tells us how well we can estimate a parameter, given a certain amount of data. More formally, it measures the expected amount of information that a random variable (X) carries about a parameter (θ) of interest. The concept is related to entropy, as both are ways of measuring disorder in a system (Friedan, 1998).

Uses include:

  • describing the asymptotic behavior of maximum likelihood estimates (the inverse of the Fisher information gives the asymptotic variance of the MLE);
  • putting a lower bound on the variance of an unbiased estimator (the Cramér–Rao bound);
  • finding default (Jeffreys) priors in Bayesian inference (see “Other Uses” below).

Finding the Fisher Information

Finding the expected amount of information requires calculus. Specifically, you need to be able to differentiate a log-likelihood function and take expectations (i.e., integrate) with respect to the distribution if you want to derive the information for a particular model.

There are three equivalent ways to calculate the amount of information contained in a random variable X:

  1. I(θ) = E[ (∂/∂θ ln f(X; θ))² ]
     (the expected square of the score, i.e. of the derivative of the log-likelihood of X with respect to θ).

  2. This can be rewritten (if you change the order of integration and differentiation) as:
     I(θ) = Var[ ∂/∂θ ln f(X; θ) ]

  3. Or, put another way:
     I(θ) = −E[ ∂²/∂θ² ln f(X; θ) ]
     

The bottom equation (the negative expected second derivative of the log-likelihood) is usually the most practical one to work with.
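For instance, here is a minimal symbolic sketch of that bottom form in Python using sympy. The exponential distribution and all variable names are illustrative choices, not anything from the original article:

```python
import sympy as sp

# Illustrative example: Exponential(rate = lam) density, x > 0.
x, lam = sp.symbols('x lam', positive=True)
f = lam * sp.exp(-lam * x)

log_f = sp.log(f)              # log-likelihood of a single observation
d2 = sp.diff(log_f, lam, 2)    # second derivative w.r.t. the parameter

# I(lam) = -E[ d^2/dlam^2 ln f(X; lam) ]; the expectation is an integral
# of the second derivative against the density over the support (0, oo).
fisher = sp.simplify(-sp.integrate(d2 * f, (x, 0, sp.oo)))
print(fisher)                  # -> 1/lam**2, i.e. I(lambda) = 1/lambda^2
```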

However, you may not have to use calculus at all, because the expected information has already been calculated for a wide range of distributions. For example:

  • Ly et al. (and many others) state that the expected amount of information in a Bernoulli distribution is
    I(θ) = 1 / [θ(1 − θ)]
    (verified numerically in the sketch below).
  • For mixture distributions, trying to find the information can “become quite difficult” (Wallace, 2005). If you have a mixture model, Wallace’s book Statistical and Inductive Inference by Minimum Message Length gives an excellent rundown of the problems you might expect.
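As a quick sanity check on that Bernoulli result, and on the equivalence of the three formulas above, here is a small Monte Carlo sketch; θ = 0.3 and the sample size are arbitrary illustrative choices, not values from the original article:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.3                                   # illustrative parameter value
x = rng.binomial(1, theta, size=1_000_000)    # Bernoulli(theta) sample

# log f(x; theta) = x*ln(theta) + (1 - x)*ln(1 - theta), so:
score = x / theta - (1 - x) / (1 - theta)            # first derivative
second = -x / theta**2 - (1 - x) / (1 - theta)**2    # second derivative

print("E[score^2]:             ", np.mean(score**2))   # first form
print("Var(score):             ", score.var())         # second form
print("-E[second derivative]:  ", -np.mean(second))    # third form
print("1 / (theta*(1 - theta)):", 1 / (theta * (1 - theta)))
```

All four numbers come out close to 1 / (0.3 × 0.7) ≈ 4.76.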

If you’re trying to find expected information, try an Internet or scholarly database search first: the solution for many common distributions (and many uncommon ones) is probably out there.

Example

Find the Fisher information for X ~ N(μ, σ²). The parameter μ is unknown (σ² is treated as known).
Solution:
For −∞ < x < ∞, the density and log-density are:

    f(x; μ) = (1 / √(2πσ²)) exp( −(x − μ)² / (2σ²) )
    ln f(x; μ) = −½ ln(2πσ²) − (x − μ)² / (2σ²)


First and second derivatives (with respect to μ) are:

    ∂/∂μ ln f(x; μ) = (x − μ) / σ²
    ∂²/∂μ² ln f(x; μ) = −1 / σ²


So the Fisher information is:

    I(μ) = −E[ ∂²/∂μ² ln f(X; μ) ] = −E[ −1/σ² ] = 1 / σ²
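A quick numerical check of this result (a sketch only; the particular values of μ and σ below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 2.0, 1.5                      # illustrative values only
x = rng.normal(mu, sigma, size=1_000_000)

# Score for the normal mean: d/dmu ln f(x; mu) = (x - mu) / sigma^2.
score = (x - mu) / sigma**2

# Both the squared-score and variance-of-score forms should be close
# to 1/sigma^2 = 1/2.25 ~ 0.444.
print(np.mean(score**2), score.var(), 1 / sigma**2)
```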
 

Other Uses

Fisher information is used for slightly different purposes in Bayesian statistics and Minimum Description Length (MDL):

  1. Bayesian statistics: to find a default prior for a parameter (see the sketch after this list).
  2. Minimum Description Length (MDL): to measure the complexity of different models.
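For the Bayesian use, the usual “default prior” built from Fisher information is the Jeffreys prior, which is proportional to the square root of the information. Continuing the Bernoulli example above as a sketch:

    p(θ) ∝ √I(θ) = √( 1 / [θ(1 − θ)] ) = θ^(−1/2) (1 − θ)^(−1/2),   0 < θ < 1,

which is the kernel of a Beta(½, ½) distribution.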

 

 
