Paper Notes by 周三九
Average article quality score: 94
Levi_Ackerman__
CV系列经典论文(1) -- ResNet: Deep Residual Learning for Image Recognition
Abstract: Deeper neural networks are more difficult to train. We present a residual learning framework to ease the training of networks that are substantially deeper than those used previously. We explicitly reformulate the layers as learning residual…
Original · 2021-12-27 15:57:51 · 2459 views · 0 comments
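The preview above states ResNet's central idea: instead of learning a direct mapping, each block learns a residual F(x) and adds it back to its input, y = F(x) + x. A minimal pure-Python sketch of that skip connection (illustrative only, not the paper's implementation, which uses convolutional layers):

```python
def residual_block(x, weight_layers):
    """Toy residual connection: y = F(x) + x, element-wise.
    `weight_layers` is a hypothetical stand-in for the block's learned
    transformation F (two conv layers in the original paper)."""
    fx = x
    for layer in weight_layers:
        fx = layer(fx)
    # The skip connection: add the unmodified input back in.
    return [f + xi for f, xi in zip(fx, x)]

# If F collapses to zero, the block reduces to the identity y = x,
# which is why extra depth is easier to train with residuals.
double = lambda v: [2.0 * e for e in v]
print(residual_block([1.0, 2.0], [double]))  # [3.0, 6.0]
```

The design point the abstract hints at: because the identity mapping is trivially representable (F = 0), stacking more residual blocks never has to hurt, which eases optimization of very deep networks.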
面向单目深度估计的基于几何的预训练方式 -- Geometric Pretraining for Monocular Depth Estimation
Background concepts. Monocular depth estimation: predicting the depth of every pixel from a single image, effectively inferring 3D structure from a 2D image. ImageNet pretraining: ImageNet is a large labeled dataset covering 1,000 image classes; in computer vision, models are commonly pretrained on ImageNet before a downstream task so they learn semantic image features that transfer well. Optical flow: algorithms for image alignment, divided into sparse optical flow (usually computed at corner points) and dense optical flow…
Original · 2021-12-22 00:06:12 · 3343 views · 1 comment
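The preview describes monocular depth estimation as recovering 3D structure from a 2D image. Once per-pixel depth is known, the standard pinhole camera model back-projects a pixel into a 3D camera-frame point. A small sketch with made-up intrinsics (fx, fy, cx, cy are illustrative values, not from the paper):

```python
def unproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with estimated depth Z into a 3D point
    (X, Y, Z) in the camera frame via the pinhole model:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy."""
    X = (u - cx) * depth / fx
    Y = (v - cy) * depth / fy
    return (X, Y, depth)

# A pixel at the principal point maps straight down the optical axis.
print(unproject(320, 240, 2.0, 500.0, 500.0, 320.0, 240.0))  # (0.0, 0.0, 2.0)
```

This is why depth prediction is the only missing ingredient for 3D reconstruction from a single image: the intrinsics give direction, the network supplies the scale along each ray.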
NLP系列经典论文(2) -- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Original paper: https://arxiv.org/pdf/1810.04805.pdf
Abstract: We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language…
Original · 2021-12-16 20:57:33 · 1006 views · 1 comment
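BERT's bidirectional pretraining rests on the masked-language-model objective: corrupt roughly 15% of input tokens and train the model to recover them, replacing 80% of chosen positions with [MASK], 10% with a random token, and leaving 10% unchanged. The rates follow the paper; the function below is an illustrative sketch, not the authors' code:

```python
import random

def mask_tokens(tokens, vocab, mask_rate=0.15, seed=0):
    """Sketch of BERT's masked-LM corruption (80/10/10 rule).
    Returns the corrupted sequence and a {position: original_token}
    map of prediction targets."""
    rng = random.Random(seed)
    out, targets = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[i] = tok  # the model must predict the original token
            r = rng.random()
            if r < 0.8:
                out[i] = "[MASK]"      # 80%: replace with mask token
            elif r < 0.9:
                out[i] = rng.choice(vocab)  # 10%: random token
            # else: 10%: keep the original token unchanged
    return out, targets
```

Keeping some selected tokens unchanged (and randomizing others) means the model cannot rely on [MASK] ever appearing at fine-tuning time, which is the mismatch the 80/10/10 split mitigates.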
NLP系列经典论文(1)-- Attention Is All You Need
Original paper: https://arxiv.org/pdf/1706.03762.pdf
Abstract: The dominant sequence transduction models are based on complex recurrent or convolutional neural networks that include an encoder and a decoder. The best perf…
Original · 2021-12-13 17:49:28 · 3649 views · 2 comments
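The Transformer this entry covers is built on scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A self-contained pure-Python sketch (plain lists stand in for the tensors a real model would use):

```python
import math

def attention(Q, K, V):
    """Scaled dot-product attention from 'Attention Is All You Need'.
    Q, K, V are lists of vectors; each query attends over all keys and
    returns a weighted average of the values."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        # Numerically stable softmax over the scores.
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # Output is the attention-weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

With a zero query every key scores equally, so the output is the plain average of the values; as a query aligns with one key, the softmax concentrates the weight there. The scaling by sqrt(d_k) keeps the dot products from pushing the softmax into vanishing-gradient territory as dimensionality grows.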