Meta challenges Nvidia’s dominance with new AI chips

2024-09-22

Social media giant Meta has unveiled its second-generation artificial intelligence (AI) chips, which it plans to deploy in-house later this year. The Meta Training and Inference Accelerator (MTIA) is part of the company’s larger plans to build AI infrastructure and use it in its services, such as Facebook, Instagram, and WhatsApp.

The company first announced MTIA version 1 (v1) in May 2023, but production plans were pushed back to 2025. In the meantime, Meta turned to industry favorite Nvidia for H100 processors to power its AI operations. According to a Reuters report, the company plans to acquire 350,000 H100 chips for AI applications and will eventually have 600,000 AI chips powering its services.


Mark Zuckerberg’s company, however, is also looking to reduce its dependence on Nvidia by switching to its in-house-designed chips later this year. The MTIA chips are aimed at data centers, where Meta is keen to break into an AI chip market that Nvidia currently dominates.

How powerful are MTIA v2 chips?

The next-generation MTIA chips are built on a 5nm process and consist of an 8×8 grid of processing elements (PEs), a 3.5x improvement over v1. Meta has tripled the size of the local PE storage and doubled the on-chip SRAM to 256 MB while increasing its bandwidth by 3.5x, according to a press release.

Meta is also working on supporting infrastructure for the chips and has developed a hardware rack that can hold 72 accelerators: three chassis, each holding 12 boards that house two accelerators apiece. The chips clock in at 1.35 GHz, up from the 800 MHz of their predecessor, and run at 90 watts.
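The rack and clock figures above can be sanity-checked with a few lines of arithmetic. The per-rack power total below is an illustrative lower bound derived only from the quoted 90 W per-chip figure; it is not a number Meta has published, since it ignores board, networking, and cooling overhead:

```python
# Rack capacity: 3 chassis x 12 boards x 2 accelerators per board.
chassis_per_rack = 3
boards_per_chassis = 12
accelerators_per_board = 2
accelerators_per_rack = chassis_per_rack * boards_per_chassis * accelerators_per_board
print(accelerators_per_rack)  # 72, matching the figure in the press release

# Clock-speed jump from v1 (800 MHz) to v2 (1.35 GHz).
v1_clock_mhz = 800
v2_clock_mhz = 1350
speedup = v2_clock_mhz / v1_clock_mhz
print(f"{speedup:.2f}x")  # 1.69x

# Accelerator power per fully populated rack at 90 W per chip
# (chips only -- an illustrative lower bound, not a Meta figure).
rack_power_kw = accelerators_per_rack * 90 / 1000
print(f"{rack_power_kw:.2f} kW")  # 6.48 kW
```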

The design is intended to give the chips more computing power, bandwidth, and memory capacity. Initially, Meta aims to use them for inference tasks such as ranking content and generating responses to user prompts; it later plans to use them for more intensive operations, such as training AI models on large datasets.

The MTIA v2 chip. Image credit: Meta

A shift to its own chips could save Meta millions in energy costs every year, alongside the billions in capital expenditure needed to buy chips from Nvidia.

Breaking Nvidia’s monopoly

Meta isn’t the only tech company looking to design and build its own AI chips. Legacy chipmaker Intel, which has lagged in catering to industry requirements for AI chips, also announced its new Gaudi chips at an event on Tuesday.

Intel claims its dedicated AI chip can train AI models three times faster than Nvidia’s H100 processors and generate responses faster than its Nvidia counterpart. The company’s latest chip, Gaudi 3, is also built on a 5nm process and consists of two main processors fused together to deliver twice the performance of its predecessor.

Google’s Tensor Processing Units (TPUs) are the only significant competition to Nvidia’s processors. However, Google does not sell its processors; instead, it gives developers access to them through its cloud platform.

The search engine giant recently announced that a new Arm-based central processing unit (CPU), dubbed Axion, will also be available on its cloud platform. Google claims the CPU delivers 50 percent better performance than comparable x86 chips and 30 percent better than general-purpose Arm chips.
