Sapeon Debuts AI Chip X330 for Data Centers Challenging Nvidia’s Dominance
Sapeon, a semiconductor startup supported by the South Korean conglomerate SK Group, has introduced its newest artificial intelligence chip Sapeon X330.
The chip delivers four times the computational performance and more than twice the power efficiency of its predecessor, the X220. With this launch, SK Group aims to establish itself as a competitor to Nvidia's chipsets.
In 2020, SK Telecom developed the Sapeon X220, the first Korean non-memory semiconductor designed for data centers. The chip excels at the large-scale calculations needed to deliver AI services at high speed and low power consumption.
Two years later, the telecommunications company spun Sapeon off as a standalone entity, a move aimed at accelerating the commercialization of AI chips. The firm's shareholders include SK Telecom Co., SK Hynix Inc., and the investment firm SK Square Co.
Before mass production of the Sapeon X330 begins in the first half of 2024, the company plans to run tests with major customers.
The AI Chip Race is Getting Intense
According to a recent Bloomberg report, with the advent of consumer generative AI tools such as Google's Bard and OpenAI's ChatGPT, the generative AI market is poised to grow to $1.3 trillion over the next 10 years. Rising demand for generative AI products could also contribute approximately $280 billion in new software revenue.
Recently, Nvidia presented its upgraded flagship chip, the H200, with increased memory capacity. The H200, now equipped with 141 gigabytes of high-bandwidth memory, surpasses its predecessor, the H100, which had 80 gigabytes.
Nvidia currently dominates the market for AI chips, playing a crucial role in powering services like OpenAI's ChatGPT, which produces human-like writing in response to queries. The company also announced that Amazon Web Services, Google Cloud, Microsoft Azure, and Oracle Cloud Infrastructure will be among the first cloud service providers to offer access to the new chip.