Research and commercial organizations can use Meta's upgraded, open-source large language model for free.
Meta is making its Llama 2 large language model open source, the Facebook parent company announced on July 18. Llama 2 succeeds the first-generation LLaMA (also stylized as Llama 1), which was released in February 2023, and was first revealed at the Microsoft Inspire event. Microsoft will be Meta's preferred partner on Llama 2.
What is Llama 2?
Llama 2 is a large language model that can be used to create generative and conversational AI models. Put simply, Llama 2, like GPT-4, can be used to build chatbots and AI assistants for commercial or research purposes.
Llama 2 is a collection of pre-trained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters, pre-trained on 2 trillion tokens of data from publicly available sources. That is roughly 40% more tokens than were used to train the original LLaMA.
Where is Llama 2 available?
Llama 2 can be downloaded for research and commercial use from Meta's website. The open-source resources include model weights and starting code for both the pre-trained models and the fine-tuned conversational versions.
“Opening access to today’s AI models means a generation of developers and researchers can stress test them, identifying and solving problems fast, as a community,” Meta wrote in a blog post about Llama 2. “By seeing how these tools are used by others, our own teams can learn from them, improve those tools, and fix vulnerabilities.”
Developers who already have accounts with Microsoft's Azure AI model catalog will be able to access Llama 2 there. It can also be found on Amazon Web Services, Hugging Face and other AI marketplaces; AWS customers can find it through Amazon SageMaker JumpStart.
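For developers who go the Hugging Face route, a minimal sketch of loading one of the chat-tuned checkpoints with the transformers library might look like the following. The model ID and hardware settings shown are assumptions for illustration, and access to the Llama 2 repositories on Hugging Face is gated until Meta's license terms are accepted.

```python
# Minimal sketch: loading a Llama 2 chat checkpoint with Hugging Face transformers.
# Assumes the transformers and torch packages are installed and that your Hugging Face
# account has been granted access to Meta's gated Llama 2 repositories.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed repo ID for the 7B chat model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit on a single modern GPU
    device_map="auto",          # place layers across available GPU(s)/CPU automatically
)

prompt = "Explain what a large language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same weights can instead be fine-tuned or deployed through Azure's model catalog or SageMaker JumpStart rather than loaded locally.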
“Meta’s announcement of the model being available in AWS and Microsoft Azure is a huge step for them, showing an ambition to be an enterprise player in the generative AI space,” Gartner analyst Arun Chandrasekaran commented in an email to TechRepublic.
Meta partners with Qualcomm for on-device AI
Qualcomm plans to bring Llama 2 to select devices in 2024. The exact device models have not yet been revealed, but Qualcomm has said they will be powered by Snapdragon processors. The aim is to run the language model directly on some devices rather than relying entirely on the cloud.
“We applaud Meta’s approach to open and responsible AI and are committed to driving innovation and reducing barriers-to-entry for developers of any size by bringing generative AI on-device,” said Durga Malladi, senior vice president and general manager of technology, planning and edge solutions at Qualcomm, in a press release. “To effectively scale generative AI into the mainstream, AI will need to run on both the cloud and devices at the edge, such as smartphones, laptops, vehicles, and IoT devices.”
What does Llama 2 say about competition in the generative AI business space?
Opening up Llama 2 and locking in a partnership with Microsoft could be a sign that Meta is trying to remain competitive with GPT-4. OpenAI’s GPT-4 is the model behind ChatGPT, which Microsoft has bet on in a big way. Google also has a horse in the race with the PaLM model behind Bard.
“This is going to change the landscape of the LLM market,” Meta’s Chief AI Scientist Yann LeCun said on Twitter.
How the connection with Meta might shift Microsoft’s deals with OpenAI is unclear as of now, but “the partnership with Meta could open newer opportunities for them,” Chandrasekaran said. “The Llama 2 models can potentially drive demand for Azure’s IaaS and operational tools as customers seek to fine-tune these models and build business applications on top of them,” he added.
Making the model open source could be a sea change, too. “By releasing Llama 2 and licensing it for commercial use, Meta might be providing a huge boost to the open-source community,” Chandrasekaran said. “Today, arguably the closed source models have a performance advantage over open-source models, but Llama has the potential to narrow that gap in the mid to long term.”