Meta challenges Nvidia’s dominance with new AI chips

Social media giant Meta has unveiled its second-generation artificial intelligence (AI) chips, which it plans to deploy in-house later this year. The Meta Training and Inference Accelerator (MTIA) is part of the company’s larger plans to build AI infrastructure and use it in its services, such as Facebook, Instagram, and WhatsApp.

The company first announced MTIA version 1 (v1) in May 2023, but production plans were pushed back to 2025. In the meantime, Meta turned to industry favorite Nvidia for H100 processors to power its AI operations. According to a Reuters report, the company plans to acquire 350,000 H100 chips for AI applications and will eventually have 600,000 AI chips powering its services.


Mark Zuckerberg’s company, however, is also looking to reduce its dependence on Nvidia by switching to its in-house-designed chips later this year. The MTIA chips are aimed at data centers, and Meta is keen to break into the AI chip market, which Nvidia currently holds as a near monopoly.

How powerful are MTIA v2 chips?

The next-gen MTIA chips are built on a 5nm process and consist of an 8×8 grid of processing elements (PEs) that deliver a 3.5x performance improvement over v1. According to a press release, Meta has also tripled the size of the local PE storage and doubled the on-chip SRAM to 256 MB while increasing its bandwidth by 3.5x.

Meta is also working on supporting infrastructure for the chips and has developed a hardware rack that can hold 72 accelerators: three chassis, each with 12 boards that house two accelerators apiece. The chips clock at 1.35 GHz, up from the 800 MHz of their predecessor, and run at 90 watts.
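
To make the rack arithmetic concrete, the short Python sketch below multiplies out the chassis, board, and accelerator counts quoted above and compares the two generations' clock speeds. Every number in it comes from this article; the script is purely illustrative.

# Rough check of the rack and clock figures quoted in the article.
chassis_per_rack = 3
boards_per_chassis = 12
accelerators_per_board = 2

accelerators_per_rack = chassis_per_rack * boards_per_chassis * accelerators_per_board
print(f"Accelerators per rack: {accelerators_per_rack}")  # 3 x 12 x 2 = 72

v1_clock_ghz = 0.8   # 800 MHz (MTIA v1)
v2_clock_ghz = 1.35  # 1.35 GHz (MTIA v2)
print(f"Clock speed-up, v2 over v1: {v2_clock_ghz / v1_clock_ghz:.2f}x")  # roughly 1.7x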

The design is intended to give the chips more computing power, bandwidth, and memory capacity. Initially, the chips will handle inference tasks such as ranking content and generating responses to user prompts; Meta eventually plans to use them for more intensive operations, such as training AI models on large datasets.

The MTIA v2 chip. Image credit: Meta

A shift to its own chips could save Meta millions in energy costs every year, on top of the billions in capital expenditure needed to buy chips from Nvidia.
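
For a sense of scale, here is a deliberately rough back-of-envelope sketch of what those energy savings could look like. Only the 90-watt figure for MTIA v2 comes from this article; the H100 power draw, fleet size, utilization, and electricity price are illustrative assumptions rather than figures disclosed by Meta or Nvidia.

# Back-of-envelope estimate of annual energy savings from running in-house chips.
# Only mtia_v2_watts is taken from the article; every other value is an assumption.
mtia_v2_watts = 90          # per-chip power draw cited in the article
h100_watts = 700            # assumed power draw for an Nvidia H100 module
chips_replaced = 100_000    # hypothetical number of accelerators swapped out
hours_per_year = 24 * 365
utilization = 0.6           # assumed average utilization
usd_per_kwh = 0.08          # assumed electricity price

delta_kw = (h100_watts - mtia_v2_watts) * chips_replaced / 1000
savings_usd = delta_kw * hours_per_year * utilization * usd_per_kwh
print(f"Illustrative annual energy savings: ${savings_usd:,.0f}")

Even under these rough assumptions, a gap of several hundred watts per chip compounds into tens of millions of dollars a year across a fleet of that size, which is the intuition behind the savings claim.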

Breaking Nvidia’s monopoly

Meta isn’t the only tech company looking to design and build its own AI chips. Legacy chipmaker Intel, which has lagged in catering to industry requirements for AI chips, also announced its new Gaudi chips at an event on Tuesday.

Intel claims its dedicated AI chip can train AI models three times faster than Nvidia’s H100 processors and generate responses faster than its Nvidia counterpart. The company’s latest chip, Gaudi 3, is also built on a 5nm process and consists of two main processors fused together, delivering twice the performance of its predecessor.

Google’s Tensor Processing Units (TPUs) are the only significant competition to Nvidia’s processors. However, Google does not sell the chips directly; instead, it offers developers access to them through its cloud platform.

The search engine giant recently announced that a new Arm-based central processing unit (CPU) dubbed Axion will also be available on its cloud platform. Google claims the CPU delivers 50 percent better performance than comparable x86 chips and 30 percent better performance than general-purpose Arm chips.
