Featured

Nvidia plans to sell tech to speed AI chip communication

Nvidia said on Monday that it plans to sell a technology that ties chips together, speeding up the chip-to-chip communication needed to build and deploy artificial intelligence tools.

On Monday, Nvidia launched NVLink Fusion, a new version of its NVLink technology that it will sell to other chip designers to help them build powerful custom AI systems with multiple linked chips.

Marvell Technology and MediaTek plan to adopt NVLink Fusion for their custom chip efforts, the company said. Other partners include Alchip, Fujitsu and Qualcomm.

Nvidia’s NVLink is used to exchange massive amounts of data between various chips, such as in the company’s GB200, which combines two Blackwell graphics processing units with a Grace processor.

Nvidia CEO Jensen Huang made the announcement about NVLink Fusion at the Taipei Music Center, site of the Computex AI exhibition that runs from May 20 to 23.

In addition to the announcement on new tech production, Huang disclosed the company’s plan to build a Taiwan headquarters in the northern suburbs of Taipei.

His keynote speech discussed Nvidia’s history of building AI chips, systems and software to support them.

He said his presentations in the past had focused on the company’s graphics chips. Now Nvidia has grown beyond its roots as a video game graphics chip maker into the dominant producer of the chips that have powered the AI frenzy since ChatGPT’s launch in 2022.

Nvidia has been designing central processing units that would run Microsoft’s Windows operating system and use technology from Arm Holdings, Reuters has previously reported.

At Computex last year, Huang sparked “Jensanity” in Taiwan, as the public and media breathlessly followed the CEO, who was mobbed by attendees at the trade show.

During the company’s annual developer conference in March, Huang outlined how Nvidia would position itself to address the shift in computing needs from building large AI models to running applications based on them.

He made public several new generations of AI chips, including the Blackwell Ultra, which will be available later this year.

The company’s Rubin chips will be followed by Feynman processors, which are set to arrive in 2028.

Nvidia also launched a desktop version of its AI chips, called DGX Spark, targeting AI researchers. On Monday, Huang said the computer was in full production and would be ready in a “few weeks”.

Computex, expected to have 1,400 exhibitors, will be the first major gathering of computer and chip executives in Asia since U.S. President Donald Trump threatened to impose tariffs to push companies to increase production in the United States.

Source: Reuters

GLOBAL BUSINESS AND FINANCE MAGAZINE
