(Bloomberg) -- Meta Platforms Inc. is deploying a new homegrown chip to help power its artificial intelligence services, aiming to decrease its reliance on semiconductors from Nvidia Corp. and other outside companies.

The chip, announced Wednesday, is the latest version of the Meta Training and Inference Accelerator, or MTIA, which helps rank and recommend content across Facebook and Instagram. Meta released the first MTIA product last year. 

Meta’s pivot to AI services has brought increased demand for computing power. Last year, the social media giant released its own version of an AI model to compete with OpenAI’s ChatGPT. It also added new generative AI features to its social apps, including customized stickers and chatbot characters modeled on celebrities.

In October, the company said it would spend as much as $35 billion on infrastructure to support AI, including data centers and hardware. “AI will be our biggest investment area in 2024,” Chief Executive Officer Mark Zuckerberg told investors that month.

A significant amount of that spending will likely still flow to Nvidia, which builds the popular H100 graphics cards that power AI models. Earlier this year, Zuckerberg said the company would acquire 350,000 of those chips, which cost tens of thousands of dollars each.

But there’s a growing movement among tech giants to develop chips in-house. Meta is joining rivals Amazon.com Inc.’s AWS, Microsoft Corp. and Alphabet Inc.’s Google in trying to wean themselves off a very expensive dependency. It won’t be a quick fix, though. So far, the efforts haven’t made a dent in the industry’s insatiable need for Nvidia’s AI accelerators.

The AI boom has helped turn Nvidia into the world’s third-most-valuable tech company, behind only Microsoft and Apple Inc. Its sales to data center operators totaled $47.5 billion in fiscal 2024, up from just $15 billion the year before. Analysts predict that the sum will more than double again in fiscal 2025.

©2024 Bloomberg L.P.