Generative AI with Large Language Models Study Notes - 1.3.1 Week 1 Resources

This post covers the key generative AI concepts discussed in this week's videos, including the model lifecycle, the "Attention is All You Need" paper behind the Transformer architecture, the large-scale pretrained model BLOOM, and recent developments in LLMs. It also touches on scaling laws, model optimization, and BloombergGPT, a domain-specific large language model for finance.


Week 1 resources

Below you'll find links to the research papers discussed in this week's videos. You don't need to understand all the technical details discussed in these papers - the lecture videos already cover the most important points you'll need to answer the quizzes.

However, if you'd like to take a closer look at the original research, you can read the papers and articles via the links below.

  1. Generative AI Lifecycle
  2. Transformer Architecture
  • Attention is All You Need - This paper introduced the Transformer architecture, with its core "self-attention" mechanism. This paper laid the foundation for LLMs.
  • BLOOM: BigScience 176B Model - BLOOM is an open-source LLM with 176B parameters, trained in an open and transparent way. In this paper, the authors present a detailed discussion of the dataset and process used to train the model. You can also see a high-level overview of the model.
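The "self-attention" mechanism mentioned above can be illustrated with a minimal sketch of scaled dot-product attention, the building block described in "Attention is All You Need". This is a toy NumPy illustration, not the paper's full multi-head implementation; the array sizes and random inputs here are arbitrary choices for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted average of the values

# Self-attention: Q, K, and V all come from the same token sequence.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))  # 4 tokens, embedding dimension 8
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (4, 8): one contextualized vector per token
```

In the full Transformer, Q, K, and V are separate learned linear projections of X, and several such attention "heads" run in parallel; this sketch keeps only the core weighting-and-averaging step.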