Meta Begins Testing In-House AI Chip To Cut Nvidia Dependence
Meta has begun testing its first in-house chip for training artificial intelligence systems, marking a strategic shift aimed at reducing its reliance on Nvidia and cutting infrastructure costs, according to an exclusive report from Reuters.
In a Rush? Here are the Quick Facts!
- The chip is a dedicated accelerator designed for AI-specific tasks, improving efficiency.
- Meta is collaborating with TSMC to manufacture the chip after a successful “tape-out.”
- The chip may support AI training for recommendations and generative AI like Meta AI.
The chip, part of Meta’s long-term push toward custom silicon, is currently in a small-scale deployment. If successful, the company plans to expand its use, sources told Reuters.
Meta, the parent company of Facebook, Instagram, and WhatsApp, has been investing heavily in AI, with projected 2025 expenses between $114 billion and $119 billion, including up to $65 billion in AI-related capital expenditures, as reported by Reuters.
The new AI training chip is a dedicated accelerator, optimized for AI-specific tasks, making it potentially more power-efficient than traditional graphics processing units (GPUs).
Meta is reportedly working with Taiwan Semiconductor Manufacturing Company (TSMC) to produce the chip, according to Reuters. The development follows the company’s first successful “tape-out,” a key milestone in chip design in which a completed design is sent to a manufacturer to produce an initial batch of chips.
This process can take months and cost tens of millions of dollars, with no guarantee of success. Reuters notes that Meta’s previous attempts at custom AI chips have seen mixed results, including the scrapping of an earlier training chip.
Meta began using an in-house inference chip last year to optimize content recommendations on Facebook and Instagram. The company now aims to extend its custom chip capabilities to AI training, starting with recommendation systems and eventually expanding to generative AI products like Meta AI, as reported by Reuters.
While Meta remains one of Nvidia’s largest customers, recent shifts in AI research have raised questions about the long-term scalability of large language models, potentially influencing the demand for high-powered GPUs.