1. Summary
Original article: "Exploring BERT Distillation for Spam Public-Opinion Detection": https://mp.weixin.qq.com/s/ljYPSK20ce9EoPbfGlaCrw
2. Other Resources
- Notes on the paper "Distilling Task-Specific Knowledge from BERT into Simple Neural Networks": https://blog.csdn.net/qq_16949707/article/details/115300853
    - Covers distilling a text-matching model into a BiLSTM, plus data augmentation; includes code
- Notes on the paper "Distilling the Knowledge in a Neural Network":
    - https://blog.csdn.net/qq_16949707/article/details/114258149
    - The foundational paper on model distillation
- A roundup of NLP/BERT model distillation methods and tools
    - BERT distillation
    - https://blog.csdn.net/qq_16949707/article/details/112982348
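The core idea shared by the papers above is Hinton's distillation loss: train the student on the teacher's temperature-softened output distribution in addition to the hard labels. A minimal NumPy sketch of that loss (function names, the temperature `T=2.0`, and the mixing weight `alpha=0.5` are illustrative choices, not values from any of the linked articles):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T gives a softer distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hinton-style distillation objective:
    alpha * T^2 * CE(teacher_soft || student_soft) + (1 - alpha) * CE(labels, student).
    The T^2 factor keeps the soft-target gradient scale comparable to the hard loss."""
    # Soft-target cross-entropy at temperature T
    p_teacher = softmax(teacher_logits, T)
    log_p_student_T = np.log(softmax(student_logits, T) + 1e-12)
    soft_loss = -(p_teacher * log_p_student_T).sum(axis=-1).mean()
    # Ordinary cross-entropy against the hard labels (T = 1)
    log_p_student = np.log(softmax(student_logits) + 1e-12)
    hard_loss = -log_p_student[np.arange(len(labels)), labels].mean()
    return alpha * (T ** 2) * soft_loss + (1 - alpha) * hard_loss
```

In the BERT-to-BiLSTM setting from the task-specific distillation paper, `teacher_logits` would come from the fine-tuned BERT and `student_logits` from the small student; a student whose logits track the teacher's incurs a lower loss than one that contradicts it.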