Deep Learning and Computer Vision Resources (continuously growing)

LSTM: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
A curated list of deep-learning PyTorch projects with Git source: https://blog.csdn.net/u012969412/article/details/77479269?utm_source=blogxgwz0
The Incredible PyTorch (assorted PyTorch projects): https://www.ritchieng.com/the-incredible-pytorch/
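Colah's LSTM post above derives the four gate equations (forget, input, candidate, output); the scalar sketch below, in plain Python with made-up toy weights rather than a trained model, shows how the gates combine into the cell and hidden state updates:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_cell(x, h_prev, c_prev, w):
    # Scalar toy LSTM cell: each entry of `w` is a (w_x, w_h, b) triple.
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])  # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])  # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])  # output gate
    c = f * c_prev + i * g        # new cell state
    h = o * math.tanh(c)          # new hidden state
    return h, c

# Run a short sequence through the cell with arbitrary weights.
w = {k: (0.5, 0.5, 0.0) for k in ("f", "i", "g", "o")}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_cell(x, h, c, w)
```

Because the hidden state is an output gate times a tanh of the cell state, it always stays in (-1, 1), while the cell state itself can grow over time.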

Deep Learning Overview tutorial: http://www.cnblogs.com/liuyihai/p/8321299.html
Deep understanding and application of deep learning (essential knowledge for AI practitioners): https://www.cnblogs.com/liuyihai/p/8449058.html
Understand deep learning in one day (Hung-yi Lee's tutorial): http://www.cnblogs.com/liuyihai/p/8448977.html

Artificial Intelligence (gitbook): https://leonardoaraujosantos.gitbooks.io/artificial-inteligence/

The Unreasonable Effectiveness of Recurrent Neural Networks http://karpathy.github.io/2015/05/21/rnn-effectiveness/ 

Awesome PyTorch list (a large collection of PyTorch resources): https://github.com/bharathgs/Awesome-pytorch-list

CVPR 2018 image caption generation paper guide (including workshops): https://blog.csdn.net/m0_37052320/article/details/80947049

Timeline of top computer vision conferences in 2019: https://blog.csdn.net/hitzijiyingcai/article/details/81709755

Image Captioning paper collection: https://github.com/tangzhenyu/Image_Captioning_DL

The Annotated Transformer: http://nlp.seas.harvard.edu/2018/04/03/attention.html

Image captioning task overview: a roundup of deep-learning image description generation methods: https://blog.csdn.net/hanss2/article/details/80732318

Andrew Ng's DeepLearning.ai "Deep Learning" course notes: https://blog.csdn.net/Koala_Tree/article/details/79913655

Cheatsheets for Stanford's CS 230 Deep Learning:https://github.com/afshinea/stanford-cs-230-deep-learning

Cheatsheets for Stanford's CS 229 Machine Learning: https://github.com/afshinea/stanford-cs-229-machine-learning

Animated walkthrough of attention (using machine translation as an example): https://towardsdatascience.com/attn-illustrated-attention-5ec4ad276ee3
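The attention walkthrough above reduces to three steps: score the query against each key, softmax the scores into weights, and take a weighted sum of the values. A stdlib-only sketch with made-up 2-d vectors (purely illustrative, not any specific model's implementation):

```python
import math

def attention(query, keys, values):
    # 1. Score each key against the query with a dot product.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    # 2. Softmax the scores into attention weights (max-subtracted for stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # 3. Context vector = weighted sum of the value vectors.
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return weights, context

keys = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
values = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
weights, context = attention([1.0, 0.0], keys, values)
```

With query [1, 0], the first and third keys score equally and the second scores lowest, so its value contributes the least to the context vector.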

Tensor derivatives and computation graphs: https://mp.weixin.qq.com/s/HQpaAg00j-teybSQJAb-ZQ

TorchGAN: https://github.com/torchgan/torchgan

TorchText: https://github.com/pytorch/text

FastNLP: https://fastnlp.readthedocs.io/en/latest/

NLP-tutorial: https://github.com/graykode/nlp-tutorial

PyTorch Cookbook (a collection of commonly used code snippets): https://zhuanlan.zhihu.com/p/59205847

PyTorch LSTM: https://towardsdatascience.com/taming-lstms-variable-sized-mini-batches-and-why-pytorch-is-good-for-your-health-61d35642972e

PyTorch Basics: https://medium.com/@aakashns/pytorch-basics-tensors-and-gradients-eb2f6e8a6eee

Text Summarization using Deep Learning: https://towardsdatascience.com/text-summarization-using-deep-learning-6e379ed2e89c

Convolutional Neural Networks from the ground up: https://towardsdatascience.com/convolutional-neural-networks-from-the-ground-up-c67bb41454e1

Attention in RNNs: https://medium.com/datadriveninvestor/attention-in-rnns-321fbcd64f05

Introduction to Computer Vision: https://medium.com/overture-ai/part-1-introduction-to-computer-vision-9a02a393d86d

Evolution of Natural Language Generation: https://medium.com/sfu-big-data/evolution-of-natural-language-generation-c5d7295d6517

The Real Reason behind all the Craze for Deep Learning: https://towardsdatascience.com/decoding-deep-learning-a-big-lie-or-the-next-big-thing-b924298f26d4

Everything About Python — Beginner To Advance: https://medium.com/fintechexplained/everything-about-python-from-beginner-to-advance-level-227d52ef32d2

Python Decorators: https://pouannes.github.io/blog/decorators/

10 Python Pandas tricks that make your work more efficient: https://towardsdatascience.com/10-python-pandas-tricks-that-make-your-work-more-efficient-2e8e483808ba

The Gumbel-Softmax trick and the Gumbel distribution: https://www.cnblogs.com/initial-h/p/9468974.html
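The core of the Gumbel-Softmax trick linked above is perturbing the logits with Gumbel(0, 1) noise, g = -log(-log(U)), and pushing the result through a temperature-scaled softmax. A stdlib-only sketch of that sampling step (the logits and temperature here are arbitrary illustration values):

```python
import math
import random

def gumbel_softmax(logits, tau=1.0, rng=random):
    # Sample Gumbel(0, 1) noise: g = -log(-log(U)), U ~ Uniform(0, 1).
    gumbels = [-math.log(-math.log(rng.random())) for _ in logits]
    # Perturb the logits and apply a temperature-scaled softmax.
    z = [(l + g) / tau for l, g in zip(logits, gumbels)]
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

random.seed(0)
sample = gumbel_softmax([2.0, 1.0, 0.1], tau=0.5)
```

As tau goes to zero the output approaches a one-hot sample from the softmax distribution over the logits, while staying differentiable with respect to them for any tau > 0.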

Deep Learning Techniques Applied to Natural Language Processing: https://nlpoverview.com

Open Questions about Generative Adversarial Networks: https://distill.pub/2019/gan-open-problems/#advx

Reinforcement learning and sequence generation: https://freeman.one/2019/04/13/reinforced-generation/

Estimators, Loss Functions, Optimizers — Core of ML Algorithms: https://towardsdatascience.com/estimators-loss-functions-optimizers-core-of-ml-algorithms-d603f6b0161a
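The article above frames every ML algorithm as estimator + loss function + optimizer. A minimal stdlib-only illustration of that decomposition, using a toy linear estimator y = w*x, MSE loss, and plain gradient descent on made-up data:

```python
# Toy data generated by the true relationship y = 2 * x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

w = 0.0      # estimator parameter (starts wrong on purpose)
lr = 0.02    # optimizer step size
for _ in range(500):
    # Gradient of the MSE loss L = mean((w*x - y)^2):
    # dL/dw = mean(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad   # gradient descent update
```

After training, w converges to the true slope 2.0; swapping in a different loss or optimizer changes only the two marked lines, which is exactly the modularity the article describes.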

One LEGO at a Time: Explaining the Math of how Neural Networks Learn with Implementation from Scratch: https://medium.com/towards-artificial-intelligence/one-lego-at-a-time-explaining-the-math-of-how-neural-networks-learn-with-implementation-from-scratch-39144a1cf80

Advanced Topics in Deep Convolutional Neural Networks: https://towardsdatascience.com/advanced-topics-in-deep-convolutional-neural-networks-71ef1190522d

Understanding Tensor Processing Units: https://medium.com/sciforce/understanding-tensor-processing-units-10ff41f50e78

Must-Read Papers on GANs: https://towardsdatascience.com/must-read-papers-on-gans-b665bbae3317

https://www.qichacha.com/postnews_552e6fa019bcae0d9a629a6e4758e8e9.html

http://www.percent.cn/Case.html

https://data.aliyun.com/product/product_index

http://www.apusic.com/bigdata/analysics

https://www.leiphone.com/news/201706/oSbBAbYPoDrdzsIY.html

https://www.xdatainsight.com/portal/html/anli.html?type=1

https://tech.antfin.com/articles/98

Tianyun Big Data (Beagle Data): http://www.beagledata.com/?page_id=3918

Baidu data science platform: https://cloud.baidu.com/product/jarvis.html

http://www.esensoft.com/products/wonderdm.html

Guoyun Big Data Magic Mirror (Moojnn): http://www.moojnn.com/

https://www.jiqizhixin.com/articles/2019-01-07-8

https://zihaocode.com/2019/03/10/gnn-review-2/

Reposted from: https://www.cnblogs.com/czhwust/p/resource.html
