Meta’s AI Ambitions: Llama 4, DeepSeek’s Challenge, and Infrastructure Investments

Meta’s Llama 4 family of large language models (LLMs) is making “great progress in training,” according to CEO Mark Zuckerberg, who announced that Llama 4 Mini has completed pretraining. He believes this development will unlock new AI use cases, but acknowledged that meaningful monetization of AI at Meta is unlikely to materialize until 2026.

Open-Source Standards and DeepSeek’s Entry

Zuckerberg addressed the emergence of DeepSeek, a new competitor that released its models under an MIT license. He stated that there will be a global open-source standard and emphasized the importance of ensuring that “it’s an American standard” for national advantage.

“We want to build the AI system that people around the world are using, and some of the recent news has only strengthened our conviction that open-source is the right thing for us to be focused on,” he added.

Zuckerberg confirmed that Llama 4 will be natively multimodal, describing it as an “omni-model” with agentic capabilities, which points to new functionality such as handling multiple input types natively and acting on tasks on a user’s behalf. While he didn’t specify a release date, his comments suggest a full-fledged rollout is unlikely before Q2 2025.

Meta’s Capital Expenditures (CapEx) Outlook

Meta’s projected CapEx for 2025 is set between $60 billion and $65 billion. CFO Susan Li confirmed that servers will be the largest growth driver within the CapEx budget, particularly to support AI investments, though she noted that spending on non-AI infrastructure will also expand.

One notable accounting change is an increase in the estimated useful life of servers and network assets to 5.5 years, effective fiscal year 2025. This adjustment is expected to reduce depreciation expenses by approximately $2.9 billion.
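Under straight-line depreciation, the mechanics behind that figure are simple: the same asset cost is spread over more years, so the annual expense falls. The sketch below illustrates the effect with assumed inputs; the asset base and prior useful-life figure are illustrative only, not numbers Meta has disclosed.

```python
# Illustrative straight-line depreciation: extending an asset's useful life
# spreads the same cost over more years, lowering the annual expense.
# The asset base and prior useful life are assumptions for illustration,
# not figures disclosed by Meta.

def annual_depreciation(asset_base: float, useful_life_years: float) -> float:
    """Straight-line depreciation: cost spread evenly over the useful life."""
    return asset_base / useful_life_years

asset_base = 70e9   # assumed depreciable server/network asset base, in dollars
prior_life = 4.5    # assumed previous useful-life estimate, in years
new_life = 5.5      # updated estimate effective fiscal year 2025, per the call

savings = (annual_depreciation(asset_base, prior_life)
           - annual_depreciation(asset_base, new_life))
print(f"Illustrative annual depreciation reduction: ${savings / 1e9:.1f}B")
# With these assumed inputs the reduction is about $2.8B, the same order of
# magnitude as the ~$2.9B Meta cited.
```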


Data Center Expansion and Custom Silicon Development

Meta is ramping up data center spending in 2025 to support large AI training clusters and high-power density data centers.

According to Meta, its latest 2GW capacity data center—sufficient to power approximately five million homes—is in development.
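Taken at face value, those figures imply an average draw of roughly 400 W per household. The snippet below simply makes that back-of-the-envelope arithmetic explicit, using the article’s own numbers.

```python
# Back-of-the-envelope check of the capacity comparison in the article:
# 2 GW of data center capacity spread across roughly 5 million homes.

capacity_watts = 2e9   # 2 GW, per the article
homes = 5e6            # ~5 million homes, per the article

implied_avg_draw = capacity_watts / homes
print(f"Implied average draw per home: {implied_avg_draw:.0f} W")  # 400 W
```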

The company is also making strides in custom silicon development for AI workloads. Zuckerberg highlighted Meta’s goal of optimizing compute efficiency by designing custom hardware tailored to its specific needs, including memory, network bandwidth, and compute requirements.

DeepSeek’s Impact on CapEx Strategy

When asked whether DeepSeek’s innovations could render Meta’s extensive CapEx plans unnecessary, Zuckerberg admitted that Meta is still analyzing DeepSeek’s advancements. He acknowledged that some novel techniques from competitors might be integrated into Meta’s own infrastructure.

However, he maintained that significant CapEx investments in infrastructure remain a strategic advantage. “There are a bunch of trends happening all at once. I continue to think that investing heavily in CapEx and infrastructure is going to be a long-term advantage,” Zuckerberg said.

Additionally, he noted that the largest compute investments won’t necessarily go into pre-training but rather into inference, where additional compute translates into smarter models and higher-quality services.

Meta’s Financial Performance

Meta reported a Q4 net income of $20.8 billion, reflecting a 49% year-on-year increase. Revenue for the quarter stood at $48.3 billion, up 21%. Meta’s headcount also grew 10% year-on-year, reaching 74,067 employees by December 31, 2024.

Meta remains committed to leading the AI space with Llama 4 while emphasizing the importance of American-led open-source standards. With multi-billion-dollar CapEx investments, custom silicon development, and massive data center expansions, Meta is positioning itself for long-term AI dominance. However, new competitors like DeepSeek could challenge the efficiency of this strategy, making 2025 a crucial year for the tech giant’s AI and infrastructure ambitions.
