April 04, 2023

PyTorch 2.0 Release: A Major Update to the Machine Learning Framework

In Brief

The PyTorch team has released PyTorch 2.0, a major update to the open-source machine learning framework, with new features and enhancements that make it more powerful and adaptable.

The update includes a high-performance Transformer API and support for training and inference using scaled dot product attention (SDPA).

The PyTorch team has announced the release of PyTorch 2.0, a highly anticipated update to the open-source machine learning framework. The release delivers several new features and enhancements that make the platform more powerful and flexible.

The framework, which sits under the Linux Foundation umbrella, is used for computer vision and natural language processing applications. It provides tensor computing with GPU acceleration and deep neural networks built on automatic differentiation. Deep learning software such as Tesla Autopilot, Pyro, Transformers, PyTorch Lightning, and Catalyst is built on top of PyTorch.
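As a minimal illustration of those two core capabilities, the sketch below creates a tensor on a GPU when one is available and lets autograd compute a gradient; the shapes and values are arbitrary.

```python
import torch

# Tensor computing with optional GPU acceleration.
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(3, 3, device=device, requires_grad=True)

# Automatic differentiation: y is a scalar function of x,
# and backward() fills x.grad with dy/dx (here, 2 * x).
y = (x ** 2).sum()
y.backward()

print(x.grad)
```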

PyTorch 2.0 introduces a new high-performance Transformer API, which aims to make training and deployment of state-of-the-art Transformer models more affordable. The release also includes high-performance support for training and inference using a custom kernel architecture for scaled dot product attention (SDPA).
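A hedged sketch of how the SDPA kernel is typically called through torch.nn.functional.scaled_dot_product_attention; the tensor shapes below are illustrative, not prescribed by the release.

```python
import torch
import torch.nn.functional as F

# Illustrative attention shapes: (batch, heads, sequence length, head dim).
batch, heads, seq_len, head_dim = 2, 8, 128, 64
query = torch.randn(batch, heads, seq_len, head_dim)
key = torch.randn(batch, heads, seq_len, head_dim)
value = torch.randn(batch, heads, seq_len, head_dim)

# PyTorch dispatches to the fastest available fused SDPA kernel for
# the inputs (e.g. FlashAttention or a memory-efficient variant).
out = F.scaled_dot_product_attention(query, key, value, is_causal=True)
print(out.shape)  # torch.Size([2, 8, 128, 64])
```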

Around the same time, OpenXLA and PyTorch/XLA 2.0 were released. The combination of PyTorch and XLA provides a development stack that supports both model training and inference, pairing PyTorch's popularity in AI with XLA's strong compiler capabilities. Investments in three main areas are planned to improve this development stack.

For large-model training, PyTorch/XLA is investing in features such as mixed precision training, runtime performance, efficient model sharding, and faster data loading. Some of these features are already available, while others will be released later this year, leveraging the underlying OpenXLA compiler stack.
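As a rough, hedged sketch of what a single PyTorch/XLA training step looks like (this assumes the torch_xla package and an available XLA device such as a TPU core; the model, shapes, and optimizer are placeholders):

```python
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                       # acquire the XLA device (e.g. a TPU core)
model = torch.nn.Linear(128, 10).to(device)    # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()

inputs = torch.randn(32, 128, device=device)   # placeholder batch
targets = torch.randint(0, 10, (32,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
xm.optimizer_step(optimizer)                   # step the optimizer and sync across replicas
xm.mark_step()                                 # flush the lazily built XLA graph
```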

For model inference, PyTorch/XLA focuses on delivering competitive performance with Dynamo in the PyTorch 2.0 release. Additional inference-oriented features include model serving support, Dynamo for sharded large models, and quantization via Torch.Export and StableHLO.
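For context, Dynamo-based compilation is exposed through torch.compile in PyTorch 2.0; a minimal inference sketch is shown below. The toy model is a placeholder, and wiring the compiled model to PyTorch/XLA would additionally require an XLA compiler backend, which is not shown here.

```python
import torch

# Placeholder model; any torch.nn.Module works the same way.
model = torch.nn.Sequential(
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
).eval()

# torch.compile uses Dynamo to capture the model and hand the graph
# to a compiler backend for optimization.
compiled_model = torch.compile(model)

with torch.no_grad():
    out = compiled_model(torch.randn(1, 64))
print(out.shape)  # torch.Size([1, 10])
```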

In terms of ecosystem integration, PyTorch/XLA is expanding its integration with Hugging Face and PyTorch Lightning so users can take advantage of upcoming features and downstream OpenXLA features through familiar APIs. This includes support for FSDP in Hugging Face and quantization in OpenXLA.

PyTorch/XLA is an open-source project, which means you can contribute to its development by reporting issues, submitting pull requests, and sending requests for comments (RFCs) on GitHub.

About The Author

Agne is a journalist who covers the latest trends and developments in the metaverse, AI, and Web3 industries for the Metaverse Post. Her passion for storytelling has led her to conduct numerous interviews with experts in these fields, always seeking to uncover exciting and engaging stories. Agne holds a Bachelor’s degree in literature and has an extensive background in writing about a wide range of topics, including travel, art, and culture. She has also volunteered as an editor for an animal rights organization, where she helped raise awareness about animal welfare issues. Contact her at [email protected].
