June 14, 2023

AMD Unveils New Chipset for Generative AI to Rival Nvidia

In Brief

AMD unveiled its CPU and AI accelerator solutions at the Data Center and AI Technology Premiere on Tuesday.

The semiconductor company announced the introduction of the AMD Instinct MI300X accelerator, which it says is “the world’s most advanced accelerator for generative AI.”

Hugging Face announced it will optimize thousands of its models on AMD platforms.

At the “Data Center and AI Technology Premiere” on Tuesday, American semiconductor company Advanced Micro Devices (AMD) unveiled its CPU and AI accelerator solutions as it seeks to compete with Nvidia in the chip-making market.


This comes at a time when AI companies are facing limited access to GPUs amidst the generative AI frenzy. During the event, AMD made a series of announcements showcasing its AI Platform strategy. 

The company revealed new details of the AMD Instinct™ MI300 Series accelerator family, including the AMD Instinct MI300X accelerator, which it claims is “the world’s most advanced accelerator for generative AI.” The MI300X could rival Nvidia’s H100 chipset and even the GH200 Grace Hopper Superchip, which is currently in production.

AMD Instinct MI300X vs Nvidia H100

Nvidia currently dominates the GPU market with more than 80% market share, and the H100 is its fastest GPU for AI, HPC, and data analytics workloads:

  • The H100 is the first GPU to feature Nvidia’s fourth-generation Tensor Cores, which are designed to accelerate AI training and inference. The H100’s Tensor Cores are up to 7x faster than the previous generation for GPT-3 models.
  • The H100 also ships with HBM3 memory, a high-bandwidth, low-latency memory system designed to accelerate data-intensive workloads, delivering roughly twice the bandwidth of the previous generation’s memory and making it the fastest GPU memory system available at launch.
  • The H100 is the first GPU to support Nvidia’s new Grace CPU architecture, which is designed to accelerate exascale computing. The H100 and Grace CPU can work together to deliver up to 10x the performance of previous-generation systems.
  • In its H100 NVL configuration, a dual-GPU product aimed at large language model inference, the H100 offers a combined 188GB of memory, among the highest of any GPU product currently on the market.

In terms of pricing, Nvidia’s chipset typically carries a price tag of approximately $10,000, though units have been listed on eBay for as much as $40,000. AMD, on the other hand, has yet to put a price tag on its new chips.

With AMD’s Instinct MI300X planned for release later this year, the new accelerator could challenge the H100’s dominance in the market. According to a Reuters report, Amazon’s cloud unit, AWS, is considering using AMD’s new chips.

The MI300X boasts promising specs, such as:

  • 192 GB of HBM3 memory, providing the compute and memory efficiency needed for large language model training and inference in generative AI workloads.
  • Enough memory to host large language models such as Falcon-40B, a 40-billion-parameter model, on a single MI300X accelerator (see the back-of-envelope estimate after this list).
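
To see why a 40-billion-parameter model plausibly fits on one card, here is a rough, illustrative estimate of weights-only memory in 16-bit precision. The helper function and figures below are our own sketch, not AMD’s numbers, and they ignore activation and KV-cache overhead.

```python
# Back-of-envelope estimate (illustrative, not an official AMD figure):
# memory needed just to hold model weights in 16-bit precision.

def weights_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weight memory in GiB for a model with n_params parameters."""
    return n_params * bytes_per_param / 1024**3

falcon_40b_params = 40e9   # ~40 billion parameters
mi300x_memory_gb = 192     # HBM3 capacity of a single MI300X, per AMD's announcement

print(f"Falcon-40B weights in fp16: ~{weights_gib(falcon_40b_params):.0f} GiB")
# ~75 GiB of weights, leaving room within 192 GB for activations and the KV cache,
# which is why a single accelerator can plausibly serve the model.
```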

“AI is the defining technology shaping the next generation of computing and the largest strategic growth opportunity for AMD. We are laser focused on accelerating the deployment of AMD AI platforms at scale in the data center, led by the launch of our Instinct MI300 accelerators planned for later this year and the growing ecosystem of enterprise-ready AI software optimized for our hardware,” AMD Chair and CEO Dr. Lisa Su said in a statement.

AMD’s AI software platform 

Alongside the unveiling of the new chipset, AMD also presented the ROCm software ecosystem – a collection of software tools and resources – designed for data center accelerators. The company highlighted collaborations with industry leaders who joined AMD on stage for discussion. 

AMD has worked with the PyTorch Foundation to integrate the ROCm software stack into PyTorch, a popular AI framework. This integration means PyTorch 2.0 is supported out of the box on all AMD Instinct accelerators, allowing developers to run a wide range of PyTorch-powered AI models on AMD hardware.
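
Because ROCm builds of PyTorch expose AMD GPUs through the familiar torch.cuda interface, existing GPU code typically runs without modification. The snippet below is a minimal sketch assuming a ROCm build of PyTorch 2.0 and an Instinct accelerator are installed; it is not AMD or PyTorch sample code.

```python
import torch

# On ROCm builds, torch.version.hip is a version string (it is None on CUDA builds),
# and AMD Instinct GPUs are addressed through the standard torch.cuda API.
print("PyTorch:", torch.__version__, "| ROCm/HIP:", torch.version.hip)

device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(1024, 1024, device=device)
y = x @ x.T  # executes on the Instinct accelerator when a ROCm device is present
print(y.shape, y.device)
```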

Hugging Face, an open platform for AI builders, also announced plans to optimize thousands of its models specifically for AMD platforms.
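
As a hedged illustration of what that could look like for developers, a standard Hugging Face transformers pipeline can already target an AMD GPU through the same ROCm-backed PyTorch device. The model name below is just an example from the Hugging Face Hub, not one of the specific models AMD or Hugging Face said they would optimize.

```python
import torch
from transformers import pipeline

# device=0 selects the first GPU; on a ROCm build of PyTorch that is the AMD accelerator.
generator = pipeline(
    "text-generation",
    model="tiiuae/falcon-7b-instruct",  # example model; any causal LM from the Hub works
    torch_dtype=torch.float16,
    trust_remote_code=True,             # needed for Falcon checkpoints on older transformers versions
    device=0 if torch.cuda.is_available() else -1,
)

print(generator("Generative AI accelerators are", max_new_tokens=30)[0]["generated_text"])
```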

In May, AMD reported revenue of $5.4 billion for the first quarter of 2023, down 9% year-over-year. Following the announcement of its AI strategy, the stock is up more than 2% since the market opened, trading at around $127 at the time of writing. Barclays, Jefferies, and Wells Fargo have raised their price targets on AMD to the $140-$150 range.

