Resources (13)


ReSume学习规则详解polluck博士论文10.1.1.95.5982.pdf

A classic paper in brain-inspired computing (a detailed treatment of the ReSuMe learning rule; Ponulak's doctoral thesis).

2021-11-02

Geometric Deep Learning Grids, Groups, Graphs, Geodesics, and Gauges.pdf

Geometric deep learning.

2021-05-08

neuronal-growth-cones.pdf

The biological mechanisms behind spiking neural networks; very helpful for understanding biological neural mechanisms.

2021-04-02

coherent-behavior-in-neuronal-networks_compress.pdf

Spiking neural networks.

2021-04-01

卷积神经网络转到脉冲神经网络matlab代码.zip

No prior knowledge of spiking neural networks is needed: this MATLAB program converts a convolutional neural network into a spiking neural network (a rough Python sketch of the conversion idea follows this entry).

2021-03-23
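The MATLAB program in the archive is not reproduced here. As a rough, hypothetical illustration of the rate-coding idea behind such conversions (not the code in the zip), the NumPy sketch below reuses the weights of a single made-up ReLU layer and replaces each unit with an integrate-and-fire neuron whose firing rate approximates the original activation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "trained" weights of one fully connected ReLU layer.
W = rng.normal(scale=0.3, size=(10, 4))    # 4 inputs -> 10 units
x = rng.uniform(0.0, 1.0, size=4)          # one input sample in [0, 1]

relu_act = np.maximum(0.0, W @ x)          # reference ANN activation

# Integrate-and-fire simulation with a constant input current W @ x per step.
T = 1000            # simulation timesteps
v_thresh = 1.0      # firing threshold
v = np.zeros(10)    # membrane potentials
spikes = np.zeros(10)

for _ in range(T):
    v += W @ x                  # integrate the input current
    fired = v >= v_thresh       # units crossing the threshold emit a spike
    spikes += fired
    v[fired] -= v_thresh        # "reset by subtraction" keeps the remainder

# The firing rate approximates the ReLU activation divided by the threshold
# (it saturates at one spike per step, clipping activations above v_thresh).
print(np.round(relu_act, 3))
print(np.round(spikes / T * v_thresh, 3))
```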

SpiNNaker Design and Implementation.pdf

A classic paper on spiking neural networks.

2021-03-23

Algorithmic.Thinking.2020.11.pdf

A detailed treatment of algorithms.

2021-01-18

多层卷积脉冲神经网络.pdf

Spiking neural networks (SNNs) have advantages over traditional, non-spiking networks with respect to bio-realism, potential for low-power hardware implementations, and theoretical computing power. However, in practice, spiking networks with multi-layer learning have proven difficult to train. This paper explores a novel, bio-inspired spiking convolutional neural network (CNN) that is trained in a greedy, layer-wise fashion. The spiking CNN consists of a convolutional/pooling layer followed by a feature discovery layer, both of which undergo bio-inspired learning. Kernels for the convolutional layer are trained using a sparse, spiking auto-encoder representing primary visual features. The feature discovery layer uses a probabilistic spike-timing-dependent plasticity (STDP) learning rule. This layer represents complex visual features using WTA-thresholded, leaky, integrate-and-fire (LIF) neurons. The new model is evaluated on the MNIST digit dataset using clean and noisy images. Intermediate results show that the convolutional layer is stack-admissible, enabling it to support a multi-layer learning architecture. The recognition performance for clean images is above 98%. This performance is accounted for by the independent and informative visual features extracted in a hierarchy of convolutional and feature discovery layers. The performance loss for recognizing the noisy images is in the range 0.1% to 8.5%. This level of performance loss indicates that the network is robust to additive noise. (A short Python sketch of generic LIF dynamics and pair-based STDP follows this entry.)

2020-07-27
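As a reading aid for the abstract above, the sketch below shows generic, textbook-style versions of two ingredients it names: leaky integrate-and-fire (LIF) membrane dynamics and a pair-based STDP weight update. All sizes and constants are illustrative assumptions; this is not the paper's probabilistic STDP rule or its WTA thresholding.

```python
import numpy as np

rng = np.random.default_rng(1)

n_in, n_out = 20, 5
W = rng.uniform(0.0, 0.5, size=(n_out, n_in))   # synaptic weights

dt, tau_m = 1.0, 20.0            # timestep and membrane time constant (ms)
v_thresh, v_reset = 1.0, 0.0     # threshold and reset potential
a_plus, a_minus = 0.01, 0.012    # STDP potentiation / depression rates
tau_trace = 20.0                 # decay constant of the spike traces (ms)

v = np.zeros(n_out)              # membrane potentials
pre_trace = np.zeros(n_in)       # low-pass filtered presynaptic spikes
post_trace = np.zeros(n_out)     # low-pass filtered postsynaptic spikes

for t in range(200):
    pre_spikes = (rng.random(n_in) < 0.05).astype(float)   # Poisson-like input

    # LIF update: leak toward rest, then integrate the weighted input spikes.
    v += (dt / tau_m) * (-v) + W @ pre_spikes
    post_spikes = (v >= v_thresh).astype(float)
    v[post_spikes == 1.0] = v_reset

    # Exponentially decaying spike traces used as eligibility signals.
    pre_trace += -(dt / tau_trace) * pre_trace + pre_spikes
    post_trace += -(dt / tau_trace) * post_trace + post_spikes

    # Pair-based STDP: potentiate on a postsynaptic spike (pre-before-post),
    # depress on a presynaptic spike (post-before-pre).
    W += a_plus * np.outer(post_spikes, pre_trace)
    W -= a_minus * np.outer(post_trace, pre_spikes)
    np.clip(W, 0.0, 1.0, out=W)
```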

逼近精度梯度误差界 for structured convex optimization.pdf

Convex optimization problems arising in applications, possibly as approximations of intractable problems, are often structured and large scale. When the data are noisy, it is of interest to bound the solution error relative to the (unknown) solution of the original noiseless problem. Related to this is an error bound for the linear convergence analysis of first-order gradient methods for solving these problems. Example applications include compressed sensing, variable selection in regression, TV-regularized image denoising, and sensor network localization. (A generic form of such an error bound is sketched after this entry.)

2020-07-27
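For orientation only, the display below gives a generic Luo-Tseng-type error bound of the kind the abstract alludes to, for a structured problem min_x f(x) + P(x) with smooth f and a simple nonsmooth term P; the paper's precise assumptions, constants, and statement differ.

```latex
% Illustrative Luo-Tseng-type error bound (not the paper's exact statement):
% the distance from x to the solution set X^* is controlled by the
% proximal-gradient residual, which is what drives linear convergence of
% first-order methods on  min_x f(x) + P(x).
\[
  \operatorname{dist}(x, X^{*})
  \;\le\;
  \kappa \,\bigl\| x - \operatorname{prox}_{P}\bigl(x - \nabla f(x)\bigr) \bigr\|
\]
% assumed to hold for all x in a level set of f + P with a sufficiently
% small residual on the right-hand side.
```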

元学习 概要.pdf

Meta-learning, or learning to learn, is the science of systematically observing how different machine learning approaches perform on a wide range of learning tasks, and then learning from this experience, or meta-data, to learn new tasks much faster than otherwise possible. Not only does this dramatically speed up and improve the design of machine learning pipelines or neural architectures, it also allows us to replace hand-engineered algorithms with novel approaches learned in a data-driven way. In this chapter, we provide an overview of the state of the art in this fascinating and continuously evolving field.

2020-07-27

pytorch教程ppt.rar

Sung Kim's complete tutorial on machine learning with PyTorch. Detailed and concise, with the concepts explained very well; the PPT format also makes it convenient to study.

2020-03-24

胶囊图神经网络20190823.pdf

A paper that combines capsule neural networks with graph theory; useful for seeing how others approach this.

2019-08-23

Deep Learning – Past, Present, and Future.pdf

An excellent survey of the past and present of artificial intelligence and deep learning, with an outlook on the future.

2019-08-23
