Next Generation of Neural Networks

1. What is wrong with back-propagation

  • It requires labeled training data, yet almost all data is unlabeled.
  • Unless the weights are highly redundant, the labels cannot possibly provide enough information to learn them.
  • The learning time does not scale well: it is very slow in networks with multiple hidden layers.
  • The neurons need to send two different kinds of signal (forward pass: the activation a; backward pass: the error derivative δ = ∂Cost/∂z), as in the sketch after this list.
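
To make the two signals concrete, here is a minimal numpy sketch of a single sigmoid unit; the weights, input, and squared-error cost are illustrative assumptions, not from the lecture. The forward pass sends the activation a, while the backward pass sends δ = ∂Cost/∂z.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical single unit: weights w, input x, target y (all made up).
w = np.array([0.5, -0.3])
x = np.array([1.0, 2.0])
y = 1.0

# Forward pass: the neuron sends its activation a.
z = w @ x
a = sigmoid(z)

# Backward pass: the neuron sends a different signal,
# delta = dCost/dz, here for squared error Cost = 0.5 * (a - y)**2:
# dCost/da = (a - y) and da/dz = a * (1 - a).
delta = (a - y) * a * (1 - a)
grad_w = delta * x  # gradient of the cost w.r.t. the weights

print(a, delta, grad_w)
```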

2. How to overcome the limitations of back-propagation

One promising approach is to keep the efficiency of using a gradient method for adjusting the weights, but to use it for modeling the structure of the sensory input at the same time.

Adjust the weights to maximize the probability that a generative model would have produced the sensory input: try to learn p(image), not p(label | image).
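
Stated as an objective, this means doing gradient ascent on the log-likelihood of the images themselves rather than of the labels. A schematic form (the symbols v⁽ⁿ⁾ for the n-th training image and y⁽ⁿ⁾ for its label are notational assumptions):

```latex
\Delta w_{ij} \;\propto\; \frac{\partial}{\partial w_{ij}} \sum_{n} \log p\!\left(\mathbf{v}^{(n)}\right)
\qquad \text{instead of} \qquad
\frac{\partial}{\partial w_{ij}} \sum_{n} \log p\!\left(y^{(n)} \mid \mathbf{v}^{(n)}\right)
```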


3. Weights → Energies → Probabilities

Each possible joint configuration of the visible and hidden units has a Hopfield “energy”. The energy is determined by the weights and biases.

The energy of a joint configuration of the visible and hidden units determines the probability that the network will choose that configuration.
4. Restricted Boltzmann Machine
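
A minimal numpy sketch of the weights → energies → probabilities chain for a toy RBM. The layer sizes, random weights, and zero biases are arbitrary assumptions; the energy is the standard RBM form E(v, h) = −aᵀv − bᵀh − vᵀWh, and probabilities follow as p(v, h) = e^{−E(v,h)} / Z.

```python
import numpy as np
from itertools import product

# Toy RBM: 3 visible units, 2 hidden units (sizes are arbitrary).
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(3, 2))  # visible-hidden weights
a = np.zeros(3)                          # visible biases
b = np.zeros(2)                          # hidden biases

def energy(v, h):
    # Hopfield-style energy of a joint configuration (v, h):
    # E(v, h) = -a.v - b.h - v.W.h
    return -(a @ v) - (b @ h) - (v @ W @ h)

# Enumerate every joint binary configuration and turn energies
# into probabilities: p(v, h) = exp(-E(v, h)) / Z.
configs = [(np.array(v), np.array(h))
           for v in product([0, 1], repeat=3)
           for h in product([0, 1], repeat=2)]
energies = np.array([energy(v, h) for v, h in configs])
Z = np.exp(-energies).sum()              # partition function
probs = np.exp(-energies) / Z

print(probs.sum())     # 1.0: a proper distribution over configurations
print(probs.argmax())  # the lowest-energy configuration is the most probable
```

Because Z sums over every joint configuration, exact probabilities are only tractable for toy networks like this one; that intractability is what makes learning in such models non-trivial.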
