Artificial Intelligence Resources: Issue 43 (20170311)

Author: chen_h
WeChat & QQ: 862251340
WeChat public account: coderpai


1. [Paper] DEEP REINFORCEMENT LEARNING: AN OVERVIEW

Summary:

We give an overview of recent exciting achievements of deep reinforcement learning (RL). We start with background on deep learning and reinforcement learning, as well as an introduction to testbeds. Next we discuss Deep Q-Network (DQN) and its extensions, asynchronous methods, policy optimization, reward, and planning. After that, we talk about attention and memory, unsupervised learning, and learning to learn. Then we discuss various applications of RL, including games (in particular AlphaGo), robotics, spoken dialogue systems (a.k.a. chatbots), machine translation, text sequence prediction, neural architecture design, personalized web services, healthcare, finance, and music generation. We mention topics/papers not yet reviewed. After listing a collection of RL resources, we close with discussions.
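
As a concrete anchor for the DQN material the survey reviews: DQN learns Q-values by regressing toward a bootstrapped target computed from a separate target network. The sketch below is only a minimal illustration of that target computation (the function name, shapes, and toy numbers are my own, not from the paper):

```python
import numpy as np

def dqn_targets(rewards, next_q_values, dones, gamma=0.99):
    """Standard DQN target: y = r + gamma * max_a' Q_target(s', a'),
    with the bootstrap term dropped at terminal states.
    Illustrative only; not code from the survey."""
    return rewards + gamma * (1.0 - dones) * next_q_values.max(axis=1)

# Toy batch: 3 transitions, 4 actions, third transition is terminal.
rewards = np.array([1.0, 0.0, -1.0])
next_q = np.array([[0.2, 0.5, 0.1, 0.3],
                   [0.0, 0.0, 0.0, 0.0],
                   [1.0, 0.4, 0.9, 0.2]])
dones = np.array([0.0, 0.0, 1.0])
print(dqn_targets(rewards, next_q, dones))  # [1.495, 0.0, -1.0]
```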

Original link: https://arxiv.org/pdf/1701.07274.pdf


2. [Blog] Notes on the implementation of DenseNet in TensorFlow

Summary:

DenseNet (Densely Connected Convolutional Networks) is one of the latest neural networks for visual object recognition, obtaining state-of-the-art results on many datasets. It is quite similar to ResNet but has some fundamental differences.
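
The key difference is connectivity: in a dense block, every layer receives the concatenation of all preceding feature maps, whereas ResNet sums a layer's output with its input. Below is a minimal sketch of a dense block using the Keras API in current TensorFlow (not the 2017-era code from the post; the layer sizes are arbitrary):

```python
import tensorflow as tf

def dense_block(x, num_layers=4, growth_rate=12):
    """DenseNet-style block: each layer sees the concatenation of all
    previous feature maps (ResNet would add them instead)."""
    for _ in range(num_layers):
        y = tf.keras.layers.BatchNormalization()(x)
        y = tf.keras.layers.ReLU()(y)
        y = tf.keras.layers.Conv2D(growth_rate, 3, padding="same")(y)
        x = tf.keras.layers.Concatenate()([x, y])  # concatenate, don't add
    return x

inputs = tf.keras.Input(shape=(32, 32, 16))
outputs = dense_block(inputs)
model = tf.keras.Model(inputs, outputs)
print(model.output_shape)  # channels grow by growth_rate per layer: 16 + 4*12 = 64
```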

This post assumes previous knowledge of neural networks (NN) and convolutions (convs). Here I will not explain how NNs or convs work, but will mainly focus on two topics:

  • How DenseNet differs from other convolutional networks.
  • What difficulties I encountered while implementing DenseNet in TensorFlow.

Original link: https://medium.com/@illarionkhlestov/notes-on-the-implementation-densenet-in-tensorflow-beeda9dd1504#.d9bfvf8ng


3. [Code] A KITTI road segmentation model implemented in TensorFlow

Summary:

The model is designed to perform well on small datasets. Training is done using just 250 densely labelled images. Despite this, a state-of-the-art MaxF1 score of over 96% is achieved. The model is usable for real-time applications: inference can be performed at the impressive speed of 95 ms per image.
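
For readers unfamiliar with the metric: MaxF1 on the KITTI road benchmark is the pixel-wise F1 score maximized over the decision threshold applied to the predicted road probability. The snippet below is only an illustrative computation of that idea (the official benchmark and this repository use their own evaluation code):

```python
import numpy as np

def max_f1(probs, labels, thresholds=np.linspace(0.0, 1.0, 101)):
    """F1 score maximized over the probability threshold.
    probs: per-pixel road probabilities, labels: boolean road mask.
    Illustrative only, not the official KITTI evaluation."""
    best = 0.0
    for t in thresholds:
        pred = probs >= t
        tp = np.sum(pred & labels)
        fp = np.sum(pred & ~labels)
        fn = np.sum(~pred & labels)
        if tp == 0:
            continue
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        best = max(best, 2 * precision * recall / (precision + recall))
    return best

# Toy usage: random probabilities against a random road mask.
probs = np.random.rand(64, 64)
labels = np.random.rand(64, 64) > 0.5
print(max_f1(probs, labels))
```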

The repository contains code for training, evaluating, and visualizing semantic segmentation in TensorFlow. It is built to be compatible with the TensorVision backend, which allows experiments to be organized in a very clean way. Also check out KittiBox, a similar project that performs state-of-the-art detection. Finally, the MultiNet repository contains code to jointly train segmentation, classification, and detection. KittiSeg and KittiBox are utilized as submodules in MultiNet.

Original link: https://github.com/MarvinTeichmann/KittiSeg


4. [Blog] Diversity and International Fellowships for Deep Learning Part 2

Summary:

Applications are now open for Deep Learning Part 2, to be offered at the University of San Francisco Data Institute on Monday evenings, Feb 27-April 10. The course will cover integrating multiple cutting-edge deep learning techniques, as well as combining classic machine learning techniques with deep learning.

In part 1, we worked hard to curate a diverse group of participants, because we'd observed that artificial intelligence is missing out due to its lack of diversity. A study of 366 companies found that ethnically diverse companies are 35% more likely to perform well financially, and teams with more women perform better on collective intelligence tests. Scientific papers written by diverse teams receive more citations and have higher impact factors.

Original link: http://www.fast.ai/2017/01/28/diversity-international-part-ii/


5. [Blog] Building better neural networks

Summary:

A group of professors and researchers at the Technical University of Berlin, the University of Vienna, and ETH Zurich have recently been working on understanding deep neural networks (computer systems that are modelled after the human brain) in “a mathematically sound way”, as Dr. Phillip Petersen refers to it. Although the official paper for this research, “Optimal Approximation with Sparse Deep Neural Networks,” will not be published until next week, Professor Gitta Kutyniok graciously presented a preview of their work for the Center’s Math & Data Seminar group this past Thursday.

Original link: https://medium.com/@NYUDataScience/building-better-neural-networks-6346eeeebbb4#.4jh644usl

