Meta’s Llama models get 350 million downloads


Meta’s open-weights Llama family of large language models (LLMs) has seen a ten-fold rise in downloads year over year, underscoring the models’ popularity since the family first launched 18 months ago, the company announced in a blog post.

“Llama models are approaching 350 million downloads to date (more than 10x the downloads compared to this time last year), and they were downloaded more than 20 million times in the last month alone, making Llama the leading open source model family,” Ahmad Al-Dahle, vice president of generative AI at Meta, wrote in a blog post.

The download figure comes from Hugging Face, a platform that hosts and distributes LLMs for enterprises and developers across a wide range of use cases.
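For context, a download counted on Hugging Face typically corresponds to a checkpoint being pulled through the hub’s tooling. The snippet below is a minimal sketch of how a developer might fetch and run a Llama checkpoint with the `transformers` library; the repository ID is illustrative, Meta’s Llama repositories on Hugging Face are gated (license acceptance plus an access token are required), and hardware sized for the chosen variant is assumed.

```python
# Minimal sketch: pulling a Llama checkpoint from Hugging Face with `transformers`.
# Assumes the gated Meta repo has been accepted and `huggingface-cli login` was run.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # illustrative repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Summarize the benefits of open-weight models in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```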

Llama models are described as open or open-weights rather than open source because there is no settled definition of what an open source LLM means, and Meta’s license places restrictions on commercial use of the models.

Meta is citing download counts to demonstrate the models’ popularity because it has no other metric, such as monthly or weekly active users, with which to track adoption.

The rationale is that providers of open or open-weight models cannot track individual users, since the model weights are downloaded and run by enterprises and users themselves, in contrast to closed or proprietary rivals such as OpenAI.

OpenAI, for its part, has reportedly shared its latest user figures: according to The Information, ChatGPT has 200 million weekly active users.

The surge in downloads can be attributed to several factors, including newer model releases and the company’s efforts to expand the number of partners that host or distribute the models.

Since the original launch, Meta has released at least three major versions of Llama: Llama 2 in July last year, Llama 3 in April this year, and Llama 3.1 this July.

The recent 20 million downloads can be seen as an effect of the Llama 3.1 update, which included a 405-billion-parameter model alongside 70-billion- and 8-billion-parameter variants, all of which performed better on benchmarks such as MATH and HumanEval.

“Hosted Llama usage by token volume across our major cloud service provider partners more than doubled May through July 2024 when we released Llama 3.1,” Al-Dahle wrote, adding that the company’s largest model, the 405-billion-parameter variant, was also gaining traction.

Separately, Meta has been actively working to increase the number of partners that host or distribute the Llama family of models. These partners include AWS, Microsoft Azure, Google Cloud, Databricks, Dell, Groq, NVIDIA, IBM watsonx, Scale AI, and Snowflake, among others.

“We’ve grown the number of partners in our Llama early access program by 5x with Llama 3.1 and will do more to meet the surging demand from partners,” Al-Dahle wrote, adding that new partners such as Wipro, Cerebras, and Lambda would come on board soon. He also said that companies such as Accenture, DoorDash, Goldman Sachs, Shopify, and Zoom were actively using Llama models for their generative AI use cases.
