Broadcom Upgrades Jericho Data Centre Chip For AI Age


Broadcom on Monday released its next-generation Jericho networking chip, which takes aim at artificial intelligence workloads by speeding up traffic over long distances, allowing sites up to 60 miles (100 km) apart to be linked together.

The Jericho4 also improves security by encrypting data, a critical concern when traffic passes outside a facility’s physical walls.

The new chip, which uses TSMC’s 3 nanometre manufacturing process, incorporates high-bandwidth memory (HBM), which has also become the preferred memory type for AI accelerator chips from the likes of Nvidia and AMD.

Image credit: Microsoft

Multiple sites

The increased memory is another factor enabling the network chips to transfer data over long distances, said Ram Velaga, senior vice president and general manager of Broadcom’s Core Switching Group.

Essentially the Jericho4 allows multiple, smaller data centres to be linked up into a single, more powerful system, Broadcom said.

This creates more flexibility in the age of AI, whose workloads require massive amounts of computing and electrical power.

Cloud infrastructure companies are looking to link together hundreds of thousands of power-hungry AI chips, but a network of 100,000 or 200,000 GPUs requires more power than is typically available in one physical building, Velaga said.

To make such a system possible, companies can use Jericho4-based networks to link clusters of server racks across multiple buildings, he said.

Products based on the chip can also help cloud companies move compute workloads closer to customers by creating data centre sites in congested urban areas, where it may be more practical to link together multiple, smaller sites.

The chip complements Broadcom’s Tomahawk line of chips, which connect racks within a data centre, typically at distances under one kilometre.

Broadcom said the Jericho4 can connect more than 1 million processors across multiple data centres and can handle about four times more data than the previous version.

The company said it began shipping the chip this week to early customers such as cloud providers and networking gear manufacturers, with products using it expected to appear in about nine months.

In-house AI chips

Broadcom also makes custom AI accelerator chips for the likes of Facebook parent Meta Platforms, which is working with Broadcom to build new Santa Barbara AI data centre servers.

Broadcom is supplying the custom processors for the servers, which are being manufactured by Taiwanese firm Quanta Computer, Economic Daily News reported.

Meta has ordered up to 6,000 racks of the Santa Barbara servers, which are to replace its existing Minerva servers, the news outlet reported.

OpenAI is reportedly working with Broadcom and TSMC on its first in-house AI accelerator chip, as part of an effort to diversify its supply of specialised processing power.


