1年增长10倍,开源大模型时代高速发展

//编者:     Meta官方宣布Llama模型最新下载量超过3.5亿次,体现了大模型底座的进化速度和渗透率正以10倍速提升。开发者需要充分关注这一趋势带来的影响:未来的应用都将由大模型驱动。

Key Takeaways: 关键要点:
  • Llama models are approaching 350 million downloads to date (more than 10x the downloads compared to this time last year), and they were downloaded more than 20 million times in the last month alone, making Llama the leading open source model family.
    到目前为止,Llama模型的下载量已接近3.5亿次(是去年同期的10倍以上),仅上个月的下载量就超过2000万次,使Llama成为领先的开源模型家族。
  • Llama usage by token volume across our major cloud service provider partners has more than doubled in just three months from May through July 2024 when we released Llama 3.1.
    自2024年5月至7月我们发布Llama 3.1期间,主要云服务提供商合作伙伴的Llama使用量(按令牌量计)在短短三个月内增长了一倍以上。
  • Monthly usage (token volume) of Llama grew 10x from January to July 2024 for some of our largest cloud service providers.
    从2024年1月到7月,我们一些最大的云服务提供商的Llama月使用量(令牌量)增长了10倍。

It’s been just over a month since we released Llama 3.1, expanding context length to 128K, adding support across eight languages, and introducing the first frontier-level open source AI model with our Llama 3.1 405B. As we did with our Llama 3 and Llama 2 releases, today we’re sharing an update on the momentum and adoption we’re seeing across the board.
我们发布Llama 3.1已经一个多月了,它将上下文长度扩展到128K,增加了对八种语言的支持,并通过Llama 3.1 405B推出了第一个前沿级开源AI模型。正如我们在Llama 3和Llama 2发布时所做的那样,今天我们将分享我们在各方面看到的发展势头和采用情况的最新进展。

The success of Llama is made possible through the power of open source. By making our Llama models openly available we’ve seen a vibrant and diverse AI ecosystem come to life where developers have more choice and capability than ever before. The innovation has been broad and rapid, from start-ups pushing new boundaries to enterprises of all sizes using Llama to build on-premises or through a cloud service provider. Industry is building and innovating with Llama, and we’re even more excited for what’s to come.
Llama的成功得益于开源的力量。通过公开提供我们的Llama模型,我们看到一个充满活力、多元化的AI生态系统正在形成,开发者拥有比以往任何时候都更多的选择和能力。从不断突破边界的初创公司,到借助Llama进行本地部署或通过云服务提供商进行构建的各种规模的企业,创新既广泛又迅速。整个行业正在基于Llama进行构建和创新,我们对未来更加充满期待。

Alongside the release of Llama 3.1, Mark Zuckerberg shared an open letter on the benefits of open source AI—further cementing our vision and commitment to an open approach. Open source is in our company’s DNA, and Llama both embodies and reinforces our commitment to sharing our work in a responsible way. Open source promotes a more competitive ecosystem that’s good for consumers, good for companies (including Meta), and ultimately good for the world.
在发布Llama 3.1的同时,Mark Zuckerberg分享了一封关于开源AI益处的公开信,进一步巩固了我们对开放路线的愿景和承诺。开源深植于我们公司的DNA之中,Llama既体现也强化了我们以负责任的方式分享工作成果的承诺。开源促进了一个更具竞争力的生态系统,这对消费者有利,对企业(包括Meta)有利,最终对整个世界有利。

In just 18 months since our initial launch, Llama has evolved from a single state-of-the-art foundation model to a robust system for developers. With Llama 3.1, we now offer developers a complete reference system to more easily create their own custom agents along with a new set of security and safety tools to help build responsibly.
在最初发布后的短短18个月内,Llama已经从单一的最先进基础模型发展成为一个面向开发者的强大系统。通过Llama 3.1,我们现在为开发者提供了一个完整的参考系统,使其可以更轻松地创建自己的自定义智能体,同时还提供了一套新的安全与保障工具,以帮助负责任地进行构建。

The leading open source model
领先的开源模型

The Llama ecosystem is growing rapidly. Llama models are approaching 350 million downloads on Hugging Face to date—an over 10x increase from where we were about a year ago. Llama models were downloaded more than 20 million times on Hugging Face in the last month alone. And this is just one piece of the Llama success story with these models also being downloaded on services from our partners across the industry.
Llama生态系统正在迅速发展。迄今为止,Llama模型在Hugging Face上的下载量已接近3.5亿次,比约一年前增长了10倍以上。仅上个月,Llama模型在Hugging Face上的下载量就超过2000万次。而这只是Llama成功故事的一部分:这些模型还通过我们行业合作伙伴的各种服务被下载。

In addition to Amazon Web Services (AWS) and Microsoft’s Azure, we’ve partnered with Databricks, Dell, Google Cloud, Groq, NVIDIA, IBM watsonx, Scale AI, Snowflake, and others to better help developers unlock the full potential of our models. Hosted Llama usage by token volume across our major cloud service provider partners more than doubled May through July 2024 when we released Llama 3.1.
除了Amazon Web Services(AWS)和Microsoft Azure之外,我们还与Databricks、Dell、Google Cloud、Groq、NVIDIA、IBM watsonx、Scale AI、Snowflake等合作,以更好地帮助开发者释放我们模型的全部潜力。自2024年5月至7月我们发布Llama 3.1期间,主要云服务提供商合作伙伴托管的Llama使用量(按令牌量计)增长了一倍以上。

Monthly usage of Llama grew 10x from January to July 2024 for some of our largest cloud service providers. And in the month of August, the highest number of unique users of Llama 3.1 on one of our major cloud service provider partners was the 405B variant, which shows that our largest foundation model is gaining traction.
从2024年1月到7月,我们一些最大的云服务提供商的Llama月使用量增长了10倍。而在8月份,在我们一家主要云服务提供商合作伙伴上,Llama 3.1各版本中独立用户数最多的是405B版本,这表明我们最大的基础模型正日益受到青睐。

We’ve grown the number of partners in our Llama early access program by 5x with Llama 3.1 and will do more to meet the surging demand from partners. We’ve heard from a number of companies that want to be future LEAP and integration Llama partners, including Wipro, Cerebras, and Lambda.
通过Llama 3.1,我们的Llama抢先体验计划(LEAP)的合作伙伴数量增加了5倍,我们还将采取更多措施来满足合作伙伴激增的需求。多家公司已向我们表达了成为未来LEAP及Llama集成合作伙伴的意愿,其中包括Wipro、Cerebras和Lambda。

Swami Sivasubramanian, VP, AI and Data, AWS: “Customers want access to the latest state-of-the-art models for building AI applications in the cloud, which is why we were the first to offer Llama 2 as a managed API and have continued to work closely with Meta as they released new models. We’ve been excited to see the uptake for Llama 3.1 from customers across both Amazon SageMaker and Amazon Bedrock, and we look forward to seeing how customers use this model to solve their most complex use cases.”
AWS人工智能和数据副总裁Swami Sivasubramanian表示:“客户希望获得最新的最先进模型,以便在云中构建人工智能应用程序,这就是为什么我们率先将Llama 2作为托管API提供,并在Meta发布新模型时继续与他们密切合作。我们很高兴看到Amazon SageMaker和Amazon Bedrock的客户对Llama 3.1的采用,我们期待看到客户如何使用此模型来解决他们最复杂的用例。”

Ali Ghodsi, CEO & Co-Founder, Databricks: “In the weeks since launch, thousands of Databricks customers have adopted Llama 3.1, making it our fastest adopted and best selling open source model ever. This generation of Llama models finally bridges the gap between OSS and commercial models on quality. Llama 3.1 is a breakthrough for customers wanting to build high quality AI applications, while retaining full control, customizability, and portability over their base LLM.”
Databricks首席执行官兼联合创始人Ali Ghodsi:“自发布以来的几周内,数千名Databricks客户采用了Llama 3.1,使其成为我们有史以来采用最快、最畅销的开源模型。这一代Llama模型终于弥合了开源模型与商业模型在质量上的差距。对于希望在保留对基础LLM的完全控制权、可定制性和可移植性的同时构建高质量AI应用的客户而言,Llama 3.1是一个突破。”

Jonathan Ross, Founder & CEO, Groq: “Open-source wins. Meta is building the foundation of an open ecosystem that rivals the top closed models and at Groq we put them directly into the hands of the developers—a shared value that’s been fundamental at Groq since our beginning. To date Groq has provided over 400,000 developers with 5 billion free tokens daily, using the Llama suite of models and our LPU Inference. It’s a very exciting time and we’re proud to be a part of that momentum. We can’t add capacity fast enough for Llama. If we 10x’d the deployed capacity it would be consumed in under 36 hours.”
Groq创始人兼首席执行官Jonathan Ross:“开源必胜。Meta正在构建一个足以媲美顶级封闭模型的开放生态系统的基础,而在Groq,我们将这些模型直接交到开发者手中,这一共同的价值观从创立之初就是Groq的根基。迄今为止,Groq已借助Llama系列模型和我们的LPU推理,每天为超过40万名开发者提供50亿个免费令牌。这是一个非常激动人心的时刻,我们很自豪能成为这一势头的一部分。我们为Llama扩充容量的速度远远赶不上需求:即使将已部署的容量扩大10倍,也会在36小时内被消耗殆尽。”

Jensen Huang, Founder & CEO of NVIDIA: “Llama has profoundly impacted the advancement of state-of-the-art AI. The floodgates are now open for every enterprise and industry to build and deploy custom Llama supermodels using NVIDIA AI Foundry, which offers the broadest support for Llama 3.1 models across training, optimization, and inference. It’s incredible to witness the rapid pace of adoption in just the past month.”
NVIDIA创始人兼首席执行官黄仁勋(Jensen Huang)表示:“Llama对最先进AI的发展产生了深远影响。如今,各个企业和行业都可以使用NVIDIA AI Foundry构建和部署定制的Llama超级模型,它为Llama 3.1模型的训练、优化和推理提供了最广泛的支持。仅在过去一个月里就见证如此快速的采用,令人难以置信。”

What's even more encouraging than how many people are using Llama is who is using Llama and how they’re using Llama.
比有多少人在使用Llama更令人鼓舞的,是谁在使用Llama,以及他们如何使用Llama。

We’re seeing growing preference in the developer community for Llama and strong indicators for continued growth. According to a survey from Artificial Analysis, an independent site for AI benchmarking, Llama was the number two most considered model and the industry leader in open source.
我们看到开发者社区对Llama的偏好日益增长,并且有强劲的指标预示着持续增长。根据独立AI基准测试网站Artificial Analysis的一项调查,Llama是考虑使用度排名第二的模型,也是开源领域的行业领导者。

With more than 60,000 derivative models on Hugging Face, there’s a vibrant community of developers fine-tuning Llama for their own use cases. Large enterprises like AT&T, DoorDash, Goldman Sachs, Niantic, Nomura, Shopify, Spotify, and Zoom are just a few success stories, and both Infosys and KPMG are using Llama internally.
Hugging Face上已有超过60,000个衍生模型,一个充满活力的开发者社区正在针对自己的用例微调Llama。AT&T、DoorDash、Goldman Sachs、Niantic、Nomura、Shopify、Spotify和Zoom等大型企业只是其中的几个成功案例,Infosys和KPMG也都在内部使用Llama。

Let’s take a closer look.
让我们仔细看看。

A snapshot of Llama case studies
Llama案例研究快照

Accenture is using Llama 3.1 to build a custom LLM for ESG reporting that they expect to improve productivity by 70% and quality by 20 – 30%, compared with the company’s existing way of generating Accenture’s annual ESG report. With its exciting advancements in multilingual capabilities, Accenture is able to extend AI models across regions, for example to help a global organization make chatbots more culturally conscious and relevant. Accenture believes companies will need to leverage many different AI models from different providers. Open source models like Llama 3.1 expand options, accelerate innovation, and will have a positive ripple effect across business and society.
埃森哲正在使用Llama 3.1为ESG报告构建自定义LLM,与公司现有的生成埃森哲年度ESG报告的方式相比,他们预计生产力将提高70%,质量将提高20%至30%。凭借其在多语言能力方面令人兴奋的进步,埃森哲能够将AI模型扩展到各个地区,例如帮助一家全球性组织使聊天机器人更具文化意识和相关性。埃森哲认为,企业将需要利用来自不同提供商的多种不同AI模型。像Llama 3.1这样的开源模型扩大了选择范围,加速了创新,并将在整个商业和社会中产生积极的涟漪效应。

Customer care is an area of focus for AI-powered innovation at AT&T. Through fine-tuning Llama models, they’ve been able to cost effectively improve customer care by better understanding key trends, needs and opportunities to enhance the experience moving forward. Overall, Llama and GenAI have driven a nearly 33% improvement in search-related responses for AT&T customer care engagements while reducing costs and speeding up response times.
客户服务是AT&T以AI驱动创新的重点领域。通过微调Llama模型,他们能够以具有成本效益的方式改善客户服务:更好地把握关键趋势、需求和机会,从而提升未来的体验。总体而言,Llama和GenAI使AT&T客户服务的搜索相关响应提升了近33%,同时降低了成本并缩短了响应时间。

DoorDash uses Llama to streamline and accelerate daily tasks for its software engineers, such as leveraging its internal knowledge base to answer complex questions for the team and delivering actionable pull request reviews to improve its codebase.
DoorDash使用Llama来简化和加速其软件工程师的日常任务,例如利用其内部知识库为团队回答复杂的问题,并提供可操作的拉取请求审查以改进其代码库。

Goldman Sachs AI platform, known as the GS AI Platform, allows Goldman engineers to use Llama models for various use cases in a safe and responsible way, including information extraction from documents.
高盛人工智能平台(GS AI Platform)允许高盛工程师以安全和负责任的方式将Llama模型用于各种用例,包括从文档中提取信息。

To drive the virtual world of its first-of-its-kind AR game Peridot, Niantic integrated Llama, transforming its adorable creatures, called “Dots,” into responsive AR pets that now exhibit smart behaviors to simulate the unpredictable nature of physical animals. Llama generates each Dot’s reaction in real time, making every interaction dynamic and unique.
为了驱动其开创性AR游戏Peridot的虚拟世界,Niantic集成了Llama,将其可爱的生物“Dots”转变为反应灵敏的AR宠物。这些宠物如今展现出智能行为,以模拟真实动物的不可预测性。Llama实时生成每个Dot的反应,使每次互动都动态而独特。

Leading Japanese financial institution Nomura uses Llama on AWS for key benefits, including faster innovation, transparency, bias guardrails, and robust performance across text summarization, code generation, log analysis, and document processing.
领先的日本金融机构野村证券(Nomura)在AWS上使用Llama,以获得关键优势,包括更快的创新、透明度、偏见护栏以及跨文本摘要、代码生成、日志分析和文档处理的强大性能。

Shopify is continuing to experiment with best-in-class open source models, including LLaVA, which is built on the foundations of Llama. They use finetunes of LLaVA for multiple specialized tasks and are currently doing 40M – 60M Llava inferences per day supporting the company’s work on product metadata and enrichment.
Shopify正在继续试验一流的开源模型,包括基于Llama构建的LLaVA。他们将LLaVA的微调版本用于多项专门任务,目前每天进行4000万至6000万次LLaVA推理,支持公司在产品元数据和内容丰富方面的工作。

Zoom uses its own models as well as closed- and open-source LLMs—including Llama—to power its AI Companion, a generative AI assistant that helps workers avoid repetitive, mundane tasks. AI Companion serves up meeting summaries, smart recordings, and next steps to Zoom users, freeing up more of their time to collaborate, make connections, and get things done.
Zoom使用自己的模型以及包括Llama在内的封闭和开源LLMs来为其AI Companion提供动力,这是一种生成式AI助手,可帮助员工避免重复的平凡任务。AI Companion为Zoom用户提供会议摘要、智能录音和后续步骤,让他们有更多的时间进行协作、建立联系和完成任务。

A thriving open system 一个繁荣的开放系统

Llama is leading the way on openness, modifiability, and cost efficiency. We’re committed to building in the open and helping ensure that the benefits of AI extend to everyone. And a growing number of academics and entrepreneurs alike agree that open source AI is the right path forward.
Llama在开放性、可修改性和成本效益方面处于领先地位。我们致力于在开放中构建,帮助确保AI的益处惠及每个人。越来越多的学者和企业家一致认为,开源AI是正确的前进道路。

LLMs can help us answer tough questions, improve our productivity, and spark our creativity. As the Llama ecosystem expands, so, too, do the capabilities and accessibility of Meta AI. Our smart assistant is available across Instagram, WhatsApp, Messenger, and Facebook, as well as via the web. We’ve also brought it to Meta Quest and the Ray-Ban Meta collection—bringing us a step closer to our vision of a future where an always-available contextual AI assistant in a convenient, wearable form factor will proactively help you as you go about your day.
LLM可以帮助我们解答棘手的问题,提高生产力,激发创造力。随着Llama生态系统的扩展,Meta AI的能力和可及性也在不断提升。我们的智能助手已可在Instagram、WhatsApp、Messenger和Facebook以及网页端使用。我们还将其引入了Meta Quest和Ray-Ban Meta系列,使我们距离这样的未来愿景更近了一步:一个随时可用的情境AI助手,以便捷的可穿戴形态,在您的日常生活中主动为您提供帮助。

We’re excited by the growth of the Llama community and encouraged knowing we're building the most advanced large language models, open sourced for the world today. Stay tuned to the blog in the weeks and months ahead as we continue spotlighting all the incredible ways developers and companies are finding value with Llama.
我们为Llama社区的成长感到兴奋,也因深知我们正在构建当今世界最先进的开源大型语言模型而备受鼓舞。在未来数周和数月里,请继续关注本博客,我们将持续展示开发者和企业借助Llama创造价值的各种精彩方式。
