We are officially entering the era of artificial intelligence (AI), and Meta has now joined the game with the launch of Large Language Model Meta AI (LLaMA), a foundational, 65-billion-parameter large language model, according to a statement by the tech company. LLaMA joins the ranks of the world-famous ChatGPT, Google’s Bard and Microsoft’s updated AI-powered Bing as the world moves towards smarter technology.
Below, MARKETING-INTERACTIVE breaks down what LLaMA is, what it can be used for and how it could significantly improve AI-chatbots as we know them.
What is a large language model (LLM)?
Before we go into what Meta’s LLaMA platform is, it is crucial to understand what an LLM actually is. LLMs are AI systems that consume large volumes of digital text from internet sources such as articles, news reports, and social media posts.
These texts are then used to train software such as ChatGPT to predict and produce content based only on a prompt from the user. LLMs form the backbone of many of the AI-powered chatbots now popping up.
What is LLaMA?
LLaMA is a state-of-the-art foundational large language model designed to help researchers advance their work in the subfield of AI, according to Meta. Essentially, this means that it is not exactly a chatbot. Rather, it is a research tool that will help solve issues regarding AI language models.
“Smaller, more performant models such as LLaMA enable others in the research community who don’t have access to large amounts of infrastructure to study these models, further democratizing access in this important, fast-changing field,” wrote Meta in a release.
Meta then noted that even with all the recent advancements in large language models, full research access to them remains limited because of the resources that are required to train and run such large models. “This restricted access has limited researchers’ ability to understand how and why these large language models work, hindering progress on efforts to improve their robustness and mitigate known issues, such as bias, toxicity, and the potential for generating misinformation,” wrote Meta.
To solve that, Meta revealed that it would train its models, which range from 7B to 65B parameters, on trillions of tokens drawn from publicly available data sets. In theory, this removes the reliance on proprietary and inaccessible data sets.
It added that, like other large language models, LLaMA works by taking a sequence of words as input and predicting the next word to recursively generate text. “To train our model, we chose text from the 20 languages with the most speakers, focusing on those with Latin and Cyrillic alphabets,” it said.
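The next-word prediction loop Meta describes can be illustrated with a deliberately tiny sketch. The snippet below is not LLaMA or any real model: it uses a made-up bigram table (`BIGRAMS`, a hypothetical stand-in for the probabilities a trained model would learn) purely to show the recursive generate-one-word-at-a-time mechanism.

```python
import random

# Toy "language model": maps a word to plausible next words.
# A real LLM like LLaMA learns such continuations from trillions
# of tokens; this hand-written table is purely illustrative.
BIGRAMS = {
    "the": ["model", "text"],
    "model": ["predicts", "generates"],
    "predicts": ["the"],
    "generates": ["text"],
    "text": ["endlessly"],
}

def generate(prompt: str, max_words: int = 6, seed: int = 0) -> str:
    """Recursively extend the prompt one predicted word at a time."""
    random.seed(seed)
    words = prompt.split()
    for _ in range(max_words):
        candidates = BIGRAMS.get(words[-1])
        if not candidates:
            break  # no known continuation; stop generating
        words.append(random.choice(candidates))
    return " ".join(words)

print(generate("the"))
```

Each loop iteration conditions only on the text produced so far, which is the same autoregressive pattern large language models use, just with learned probabilities over tens of thousands of tokens instead of a five-entry table.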
Why is LLaMA so crucial in an AI-powered space?
Training smaller foundation models such as LLaMA can be extremely helpful in the large language model space because it requires far less computing power and resources to test new approaches, validate others’ work, and explore new use cases, according to Meta.
Foundational language models are typically trained on large amounts of unlabelled data, which makes them particularly well suited to being adapted for a variety of tasks.
Does it have any limitations?
While Meta admits that more research is needed to address the risks of bias, toxic comments, and hallucinations in large language models, including LLaMA, the model appears to have been built to let researchers test new approaches to limiting or eliminating these problems.
“As a foundation model, LLaMA is designed to be versatile and can be applied to many different use cases, versus a fine-tuned model that is designed for a specific task,” Meta explained.
When will it be available?
In order to prevent misuse and to maintain the system’s integrity, Meta is releasing its LLaMA model under a non-commercial license focused on research use cases.
Access to the model will then be granted on a case-by-case basis to academic researchers; those affiliated with organizations in government, civil society, and academia; and industry research laboratories around the world, according to Meta.
The system, however, is not currently used in any of Meta’s products.