Paper Interpretations (论文解读)
Average article quality score: 91
ZQSZXY
Papers on the various ways of combining Transformers and CNNs (an incomplete survey)
NLP-related: Conformer: Convolution-augmented Transformer for Speech Recognition ([2005.08100], arxiv.org); Lite Transformer with Long-Short Range Attention ([200… (excerpt truncated)
Original post · 2021-11-30 15:17:49 · 1687 views · 0 comments
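The common thread in these papers is wiring a convolution module into a transformer layer so the model gets both global context (attention) and local context (convolution). Below is a minimal PyTorch-style sketch of such a convolution-augmented block; all dimensions and the exact layer ordering are illustrative assumptions, and the real Conformer additionally uses macaron feed-forward modules and relative positional self-attention.

```python
# Minimal sketch of a convolution-augmented transformer block
# (simplified; the real Conformer also uses macaron FFNs and
# relative positional self-attention). All sizes are illustrative.
import torch
import torch.nn as nn

class ConvAugmentedBlock(nn.Module):
    def __init__(self, d_model=256, n_heads=4, kernel_size=15):
        super().__init__()
        self.attn_norm = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.conv_norm = nn.LayerNorm(d_model)
        # Depthwise 1-D convolution over time captures local context
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2, groups=d_model)
        self.ffn = nn.Sequential(nn.LayerNorm(d_model),
                                 nn.Linear(d_model, 4 * d_model),
                                 nn.SiLU(),
                                 nn.Linear(4 * d_model, d_model))

    def forward(self, x):                      # x: (batch, time, d_model)
        h = self.attn_norm(x)
        x = x + self.attn(h, h, h)[0]          # global context via attention
        h = self.conv_norm(x).transpose(1, 2)  # (batch, d_model, time)
        x = x + self.conv(h).transpose(1, 2)   # local context via convolution
        return x + self.ffn(x)

x = torch.randn(2, 100, 256)                   # 2 utterances, 100 frames
print(ConvAugmentedBlock()(x).shape)           # torch.Size([2, 100, 256])
```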
Improving noise robustness of contrastive speech representation learning with speech reconstruction
Research institutions: The Ohio State University, Microsoft Corporation. Source: [2110.15430] Improving Noise Robustness of Contrastive Speech Representation Learning with Speech… (excerpt truncated)
Original post · 2021-11-25 17:22:25 · 1758 views · 0 comments
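Judging from the title, this paper pairs a contrastive representation objective with a reconstruction objective on the speech signal. A hypothetical sketch of such a combined loss is below; the InfoNCE formulation, the L1 reconstruction term, and the weight `alpha` are my assumptions, not details taken from the paper.

```python
# Hypothetical sketch: a wav2vec-2.0-style contrastive loss combined
# with a reconstruction loss on the clean speech, as the title suggests.
# The weighting (alpha) and all shapes are illustrative assumptions.
import torch
import torch.nn.functional as F

def contrastive_loss(pred, target, negatives, temperature=0.1):
    """InfoNCE: pull pred toward target, push away from sampled negatives."""
    candidates = torch.cat([target.unsqueeze(1), negatives], dim=1)
    logits = F.cosine_similarity(pred.unsqueeze(1), candidates, dim=-1)
    labels = torch.zeros(pred.size(0), dtype=torch.long)  # index 0 = positive
    return F.cross_entropy(logits / temperature, labels)

def joint_loss(pred, target, negatives, recon, clean, alpha=0.5):
    """Contrastive term on masked features + L1 reconstruction of clean speech."""
    return contrastive_loss(pred, target, negatives) + alpha * F.l1_loss(recon, clean)

# Toy shapes: batch of 8 masked frames, 10 negatives, 160-sample waveform chunks.
pred, target = torch.randn(8, 256), torch.randn(8, 256)
negatives = torch.randn(8, 10, 256)
recon, clean = torch.randn(8, 160), torch.randn(8, 160)
print(joint_loss(pred, target, negatives, recon, clean))
```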
JOINT UNSUPERVISED AND SUPERVISED TRAINING FOR MULTILINGUAL ASR
Research institution: Google. Source: [2111.08137] Joint Unsupervised and Supervised Training for Multilingual ASR (arxiv.org). Research background: self-supervised speech pre-trained models perform very well on downstream tasks such as ASR; current pre-training methods are two-stage, consisting of pre-train and fine-tune, where the pre-train stage mainly optimizes… (excerpt truncated)
Original post · 2021-11-21 17:25:40 · 1371 views · 0 comments
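The entry contrasts the usual two-stage pipeline with joint training, where the unsupervised and supervised objectives are optimized together in a single stage. A toy sketch of such a combined objective follows; the specific loss choices (cross-entropy over quantized-code targets for the unsupervised term, CTC for the supervised term) and the weight `beta` are illustrative assumptions, not the paper's exact recipe.

```python
# Illustrative sketch of a single-stage joint objective: a self-supervised
# loss plus a supervised ASR loss, trained together instead of the usual
# pre-train -> fine-tune pipeline. Loss choices and weights are assumptions.
import torch
import torch.nn.functional as F

def joint_training_loss(mlm_logits, code_targets,
                        asr_log_probs, transcripts,
                        input_lengths, target_lengths, beta=1.0):
    # Unsupervised term: predict quantized codes at masked positions
    unsup = F.cross_entropy(mlm_logits, code_targets)
    # Supervised term: CTC loss against the reference transcripts
    sup = F.ctc_loss(asr_log_probs, transcripts, input_lengths, target_lengths)
    return unsup + beta * sup

# Toy shapes: 8 masked frames over a 320-entry codebook;
# CTC over 50 frames, vocab of 30, 2 utterances, 12-token targets.
mlm_logits = torch.randn(8, 320)
code_targets = torch.randint(0, 320, (8,))
asr_log_probs = torch.randn(50, 2, 30).log_softmax(-1)
transcripts = torch.randint(1, 30, (2, 12))    # 0 is the CTC blank
input_lengths = torch.full((2,), 50)
target_lengths = torch.full((2,), 12)
print(joint_training_loss(mlm_logits, code_targets, asr_log_probs,
                          transcripts, input_lengths, target_lengths))
```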
Paper interpretation: HUBERT: HOW MUCH CAN A BAD TEACHER BENEFIT ASR PRE-TRAINING
Source: ICASSP 2021. [Figure: https://raw.githubusercontent.com/zqs01/figurebed/main/img/image-20210801162550871.png] Research background: compared with the CV and NLP fields, self-supervised pre-trai… (excerpt truncated)
Original post · 2021-08-01 22:38:03 · 1168 views · 0 comments
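The "bad teacher" in the title refers to the HuBERT-style recipe: cluster acoustic features offline to obtain noisy pseudo-labels, then pre-train by predicting those labels at masked positions. A minimal sketch under that reading follows; the k-means-on-MFCC teacher, the toy classifier, and frame-level (rather than span) masking are all simplifying assumptions.

```python
# Minimal sketch of HuBERT-style pre-training: cluster acoustic features
# offline (the "teacher", here plain k-means on random stand-in features),
# then train a model to predict the cluster ids at masked positions.
# Shapes, the masking rate, and the toy model are illustrative assumptions.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

frames = torch.randn(1000, 39)                       # e.g. 39-dim MFCC frames
teacher = KMeans(n_clusters=100, n_init=10).fit(frames.numpy())
targets = torch.tensor(teacher.labels_)              # noisy pseudo-labels

model = nn.Sequential(nn.Linear(39, 256), nn.ReLU(), nn.Linear(256, 100))
mask = torch.rand(1000) < 0.08                       # mask ~8% of frames
masked_in = frames.clone()
masked_in[mask] = 0.0                                # zero out masked frames

logits = model(masked_in)
# Loss only on masked positions: the claim is that even a weak teacher's
# targets are useful, as long as they are consistent across masked spans.
loss = nn.functional.cross_entropy(logits[mask], targets[mask])
print(loss)
```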