Machine Learning (3): Bayesian Learning (1)

Conditional Probability

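For events $A$ and $B$ with $P(B) > 0$, the conditional probability of $A$ given $B$ is defined as

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$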

Bayes Theorem

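Rearranging the definition of conditional probability gives Bayes' theorem:

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}.$$

In Bayesian learning, $A$ is typically a hypothesis or class label and $B$ is the observed data.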

Conditional probability under independence

If $A$ and $B$ are independent, the joint probability factorises into the product of the individual probabilities; in particular, conditioning on $B$ does not change the probability of $A$.
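In symbols:

$$P(A \cap B) = P(A)\,P(B), \qquad \text{and in particular} \qquad P(A \mid B) = P(A).$$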

Conditional independence

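Events $A$ and $B$ are conditionally independent given $C$ when, once $C$ is known, $B$ carries no further information about $A$:

$$P(A \cap B \mid C) = P(A \mid C)\,P(B \mid C), \qquad \text{equivalently} \qquad P(A \mid B, C) = P(A \mid C).$$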

Bayesian Learning

Estimating probabilities is essentially a matter of counting the
occurrences of particular combinations of values in the
training data set.
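In the simplest case this gives the relative-frequency estimate. Writing $n_c$ for the number of training examples of class $c$, and $n_{x,c}$ for the number of those in which attribute $X$ takes the value $x$:

$$\hat{P}(X = x \mid c) = \frac{n_{x,c}}{n_c}.$$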
That is the basic way of estimating probabilities.
But when some combinations have low counts, the estimated probability can be zero or very close to it, so we need a way of smoothing these estimates.
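A common fix, sketched here assuming the standard m-estimate / Laplace correction, is to mix the observed counts with a prior estimate $p$ and an equivalent sample size $m$:

$$\hat{P}(X = x \mid c) = \frac{n_{x,c} + m\,p}{n_c + m}.$$

With $p = 1/k$ (a uniform prior over the $k$ possible values of $X$) and $m = k$, this is Laplace (add-one) smoothing, and no probability is ever estimated as exactly zero.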

Complete Bayes Classifiers

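In its usual formulation, the complete Bayes classifier picks the most probable class given the full vector of attribute values, with the class-conditional joint probability estimated directly from the training data:

$$c^{*} = \arg\max_{c} P(c \mid a_1, \dots, a_n) = \arg\max_{c} P(a_1, \dots, a_n \mid c)\,P(c).$$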
But the complete Bayes classifier needs a very large number of training examples to support it.
We can estimate how many examples are required: every combination of attribute values must appear often enough in the training set for its conditional probability to be estimated reliably, and the number of combinations grows exponentially with the number of attributes (see the rough count below).
So even for fairly simple problems the number of examples needed quickly becomes huge, and this method is not useful in most practical situations.
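As a rough illustration (with $|C|$ classes and $n$ attributes, each taking $k$ possible values; the numbers are chosen only to show the scale):

$$\#\{\text{probabilities to estimate}\} = |C| \cdot k^{\,n}, \qquad \text{e.g. } |C| = 2,\ k = 2,\ n = 20 \ \Rightarrow\ 2 \cdot 2^{20} \approx 2.1 \times 10^{6}.$$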

Naive Bayes Classifiers

The complete Bayes classifier is impractical because so much
data is required to estimate the conditional probabilities.
Can we get round this problem by finding a much more
economical way to estimate them?
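The naive Bayes classifier does exactly that: it assumes the attributes are conditionally independent given the class, so the class-conditional joint probability factorises into one term per attribute:

$$c_{NB} = \arg\max_{c}\; P(c) \prod_{i=1}^{n} P(a_i \mid c).$$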
As you can see, it needs far fewer training examples: only the individual conditional probabilities $P(a_i \mid c)$ have to be estimated, one attribute at a time, rather than the full joint distribution.
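As a minimal sketch of the idea (the class name, attribute values, and toy data below are made up for illustration; none of it comes from the original figures), a categorical naive Bayes classifier with Laplace smoothing fits in a few dozen lines of Python:

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Categorical naive Bayes with Laplace (add-one) smoothing."""

    def fit(self, X, y):
        # X: list of attribute-value tuples, y: list of class labels
        self.classes = sorted(set(y))
        self.class_counts = Counter(y)
        self.n = len(y)
        self.n_attrs = len(X[0])
        # value_counts[(class, attr_index)][value] = how often that value
        # occurs for that attribute among examples of that class
        self.value_counts = defaultdict(Counter)
        # distinct values seen per attribute (for the smoothing denominator)
        self.attr_values = [set() for _ in range(self.n_attrs)]
        for xs, c in zip(X, y):
            for i, v in enumerate(xs):
                self.value_counts[(c, i)][v] += 1
                self.attr_values[i].add(v)
        return self

    def predict(self, xs):
        best_class, best_score = None, float("-inf")
        for c in self.classes:
            # log P(c): class prior from relative frequency
            score = math.log(self.class_counts[c] / self.n)
            for i, v in enumerate(xs):
                k = len(self.attr_values[i])          # number of values of attribute i
                count = self.value_counts[(c, i)][v]  # n_{x,c}
                # Laplace-smoothed estimate of P(a_i = v | c)
                score += math.log((count + 1) / (self.class_counts[c] + k))
            if score > best_score:
                best_class, best_score = c, score
        return best_class


# Toy weather-style data, made up purely for illustration
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"),
     ("rain", "cool"), ("overcast", "hot"), ("rain", "hot")]
y = ["no", "no", "yes", "yes", "yes", "no"]

model = NaiveBayes().fit(X, y)
print(model.predict(("rain", "mild")))   # -> "yes"
```

Each conditional probability is estimated from per-attribute counts, so a handful of examples per class is already enough to make a prediction.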

Numeric Attributes

There are two types of solution:
<1> Discretization: divide the range of the numeric attribute into intervals and treat each interval as a discrete value.
<2> Assume a distribution: model the attribute with a parametric distribution whose parameters are estimated from the training data, as in the formula below.
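For option <2>, the usual assumption is a normal distribution (Gaussian naive Bayes): the mean $\mu_{i,c}$ and standard deviation $\sigma_{i,c}$ of attribute $i$ are estimated from the training examples of class $c$, and then

$$P(a_i \mid c) = \frac{1}{\sqrt{2\pi}\,\sigma_{i,c}} \exp\!\left(-\frac{(a_i - \mu_{i,c})^{2}}{2\sigma_{i,c}^{2}}\right).$$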

Bayesian Belief Networks

A Bayesian belief network is a model that does two things:
<1> it specifies which conditional independence assumptions are valid;
<2> it provides sets of conditional probabilities to specify the joint probability distribution wherever dependencies exist.
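Concretely, if the network has nodes $X_1, \dots, X_n$ and $\mathrm{Parents}(X_i)$ denotes the parents of $X_i$ in the graph, these two components together define the joint distribution:

$$P(X_1, \dots, X_n) = \prod_{i=1}^{n} P\big(X_i \mid \mathrm{Parents}(X_i)\big).$$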
