[Original] 《Preservation of the Global Knowledge by Not-True Self Knowledge Distillation in Federated Learning》
Abstract: (1) Federated Learning's convergence suffers from data heterogeneity. (2) Forgetting could be the bottleneck of global convergence. (3) Continual…
2021-12-19 20:25:46 799
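The title refers to distilling the global model's predictions on the classes other than each sample's ground-truth label (the "not-true" classes) during local training, so that global knowledge is preserved without fighting the local supervised signal. A minimal PyTorch sketch of that idea, assuming a temperature `tau` and a weighting factor `beta` (both names are illustrative, not taken from the post):

```python
import torch
import torch.nn.functional as F

def not_true_distillation_loss(local_logits, global_logits, labels, tau=1.0):
    """KL divergence between local (student) and global (teacher) predictions,
    restricted to the not-true classes of each sample."""
    num_classes = local_logits.size(1)
    # Boolean mask: False at the ground-truth class, True everywhere else.
    not_true = (torch.arange(num_classes, device=labels.device).unsqueeze(0)
                != labels.unsqueeze(1))
    # Keep only the not-true logits: shape (batch, num_classes - 1).
    local_nt = local_logits[not_true].view(-1, num_classes - 1)
    global_nt = global_logits[not_true].view(-1, num_classes - 1)
    # Temperature-softened distributions; the global model acts as teacher.
    log_p_local = F.log_softmax(local_nt / tau, dim=1)
    p_global = F.softmax(global_nt / tau, dim=1)
    return F.kl_div(log_p_local, p_global, reduction="batchmean") * (tau ** 2)

def local_objective(local_logits, global_logits, labels, tau=1.0, beta=1.0):
    # Ordinary cross-entropy on local data plus the not-true distillation term.
    ce = F.cross_entropy(local_logits, labels)
    ntd = not_true_distillation_loss(local_logits, global_logits, labels, tau)
    return ce + beta * ntd

# Usage sketch (local_model / global_model are hypothetical client-side models):
#   loss = local_objective(local_model(x), global_model(x).detach(), y)
```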