Zuckerberg: Llama 4 Needs 10x More Compute

Mark Zuckerberg revealed that Meta's next-gen AI model, Llama 4, will require ten times more computing power than Llama 3.
August 2, 2024

Meta is ramping up its ambitions in the AI space, signaling a major leap in its computing needs to train future models. During Meta’s second-quarter earnings call, Mark Zuckerberg revealed that training the next-generation Llama 4 model will demand a staggering tenfold increase in computing power compared to its predecessor, Llama 3. This move underscores Meta’s commitment to staying ahead in the rapidly evolving AI landscape.

Zuckerberg emphasized the importance of proactive capacity building. “The computing power required for Llama 4 will likely be nearly ten times more than what was needed for Llama 3,” he stated. “Future models will only increase in their requirements. I'd rather build capacity now and avoid falling behind.”

In April, Meta unveiled Llama 3 in 8-billion and 70-billion parameter versions, followed by the release of Llama 3.1 405B, boasting a massive 405 billion parameters. This latest model represents Meta’s most advanced open-source offering to date. To support such advancements, Meta’s CFO, Susan Li, highlighted the company's plans to expand its data center projects and increase its capital expenditures in 2025, aiming to enhance its AI training infrastructure.

Training large language models comes with a hefty price tag. Meta’s capital expenditures surged by nearly 33% to $8.5 billion in Q2 2024, driven by substantial investments in servers, data centers, and network infrastructure. For comparison, OpenAI reportedly spends $3 billion on model training and an additional $4 billion on server rentals.

Li also touched on Meta’s strategy to balance AI training with core operations. “As we scale our generative AI capabilities, we’re building infrastructure that offers flexibility, allowing us to allocate resources effectively between AI training and other key areas like ranking and recommendations,” she said.

Despite these significant investments, Li tempered expectations regarding immediate revenue impacts from Meta’s AI products, noting that while India remains a key market for Meta AI, generative AI products are not expected to contribute significantly to revenue in the near term. 

Meta's bold move to scale its AI capabilities illustrates its determination to lead in the AI space, preparing for a future where computing power will be crucial to maintaining a competitive edge.
