Large Models - Fine-Tuning Techniques (6): MAM Adapter (A Unified Framework) (Unifying Adapter-Tuning, Prefix-Tuning, and LoRA) [Freeze the large model's parameters; fine-tune only the newly inserted parameter layers]
"Towards a Unified View of Parameter-Efficient Transfer Learning", "Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models", "Sparse Structure Search for Parameter-Efficient Tuning" - A Survey of Parameter-Efficient Fine-Tuning Techniques for Large Models (6): MAM Adapter, UniP
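To make the "freeze the base model, train only the newly inserted layers" idea concrete, here is a minimal NumPy sketch of a parallel adapter branch, one of the designs that MAM Adapter's unified view covers. All sizes, names, and the scaling factor are hypothetical, not taken from the papers above.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 16, 4  # hidden size and adapter bottleneck (hypothetical sizes)

# "Frozen" pretrained FFN weight: kept fixed during fine-tuning.
W_ffn = rng.standard_normal((d, d))

# Newly inserted, trainable adapter parameters (down/up projections).
W_down = rng.standard_normal((d, r)) * 0.01
W_up = np.zeros((r, d))   # zero-init so the adapter starts as a no-op
scale = 4.0               # scaling factor on the adapter branch

def layer(x):
    """Frozen FFN output plus a parallel, trainable adapter branch."""
    base = x @ W_ffn                              # frozen computation
    delta = np.maximum(x @ W_down, 0.0) @ W_up    # trainable branch
    return base + scale * delta

x = rng.standard_normal((2, d))
# With W_up zero-initialized, the layer initially equals the frozen FFN,
# so fine-tuning starts from the pretrained model's behavior.
assert np.allclose(layer(x), x @ W_ffn)
```

Only `W_down` and `W_up` (d*r + r*d parameters each layer) would receive gradients; the full d*d pretrained weight stays untouched, which is the parameter-efficiency shared by Adapter-Tuning, Prefix-Tuning, and LoRA.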