Original  ByteDance Algorithm Internship: Passed Three Rounds (Interview Experience)
ByteDance algorithm internship, passed after three rounds. Contents: self-introduction, round one, round two, round three, HR round. I've lost count of how many times I've interviewed at ByteDance; before this I had rarely even made it to round two. This time HR probably fished my resume back out of the resume pool. Round one was with the international community product team and I failed it; I was moved to Douyin e-commerce, did two more rounds, and passed. That team is reportedly expanding. The HR was great: after I failed round one she got me picked up by another team, and reminded me to review my weak spots. Self-introduction: third-year data science undergraduate at a domestic top-2 university; started doing NLP in my second year and have worked in two labs. In summer 2020 I joined ByteCamp and built an NLP project, NER on house-purchase queries, using BERT+CRF. Currently working on multiple-choice machine reading comprehension in the lab, no publications yet; platform R&D intern at 4Paradigm. GP
2021-01-23 20:03:54
1849
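The ByteCamp project above pairs BERT with a CRF tag layer for NER; at inference time, the CRF's best tag sequence is recovered with Viterbi decoding. A minimal sketch of that decode step in plain Python — the tag set, emission scores, and transition scores below are made-up toy values for illustration, not from the post or any trained model:

```python
# Toy Viterbi decode for a linear-chain CRF tag layer (the kind used on
# top of BERT for NER).  All scores here are illustrative, not trained.

def viterbi_decode(emissions, transitions, tags):
    """emissions: [seq_len][n_tags] scores; transitions[i][j]: score of tag i -> tag j."""
    n = len(tags)
    score = list(emissions[0])  # best score ending in each tag at position 0
    back = []                   # backpointers per position
    for t in range(1, len(emissions)):
        new_score, ptr = [], []
        for j in range(n):
            cand = [score[i] + transitions[i][j] for i in range(n)]
            best_i = max(range(n), key=lambda i: cand[i])
            new_score.append(cand[best_i] + emissions[t][j])
            ptr.append(best_i)
        score, back = new_score, back + [ptr]
    # trace the best path backwards from the highest-scoring final tag
    best = max(range(n), key=lambda j: score[j])
    path = [best]
    for ptr in reversed(back):
        best = ptr[best]
        path.append(best)
    path.reverse()
    return [tags[i] for i in path]

tags = ["O", "B-LOC", "I-LOC"]
emissions = [[0.1, 2.0, 0.0],      # token 1: leans B-LOC
             [0.2, 0.1, 1.5],      # token 2: leans I-LOC
             [1.8, 0.1, 0.2]]      # token 3: leans O
transitions = [[0.5, 0.5, -2.0],   # O -> {O, B-LOC, I-LOC}: O->I-LOC is penalized
               [-1.0, -1.0, 1.0],  # B-LOC -> I-LOC is rewarded
               [0.0, 0.0, 0.5]]
print(viterbi_decode(emissions, transitions, tags))  # prints ['B-LOC', 'I-LOC', 'O']
```

The transition scores are what the CRF adds over plain per-token softmax tagging: they let the decoder reject invalid sequences such as `O` followed directly by `I-LOC`.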
Original  2020.10.4 Paper Notes
2020.10.4 paper notes: PRETRAINED ENCYCLOPEDIA: WEAKLY SUPERVISED KNOWLEDGE-PRETRAINED LANGUAGE MODEL; NON-AUTOREGRESSIVE NEURAL MACHINE TRANSLATION; Insertion Transformer: Flexible Sequence Generation via Insertion Operations; Levenshtein Transformer; Non-Autoregressive Ne
2020-10-04 21:21:54
801
Original  2020.9.25 Paper Notes
2020.9.25 paper notes: ELECTRA: PRE-TRAINING TEXT ENCODERS AS DISCRIMINATORS RATHER THAN GENERATORS; POINTER: Constrained Text Generation via Insertion-based Generative Pre-training; PLUG AND PLAY LANGUAGE MODELS: A SIMPLE APPROACH TO CONTROLLED TEXT GENERATION; CoCon: A
2020-09-27 19:16:10
584
Original  2020.9.12 Paper Notes
2020.9.12 paper notes: PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization; Discriminative Adversarial Search for Abstractive Summarization. Two ICML 2020 papers on abstractive summarization. PEGASUS: Pre-training with Extracted Gap-sentences for A
2020-09-13 17:32:58
274
Original  2020.7.13 Study Notes
2020.7.13 paper notes: Transformer; BERT; NLP FROM SCRATCH: CLASSIFYING NAMES WITH A CHARACTER-LEVEL RNN. Transformer: read a very detailed explanation on Zhihu (link) and the Harvard annotated version (link); some of the code there is dated and needs minor fixes; got a rough grasp of the original paper's ideas. BERT. NLP FROM SCRATCH: CLASSIFYING NAMES WITH A CHARACTER-LEVEL RNN: built a model following the PyTorch tutorial (link) and wrote some
2020-07-19 20:38:13
172