Meta Unveils AudioCraft: AI Tool That Generates Audio and Music from Text Prompts
In a bid to catch up with its Big Tech peers in the AI arms race, Meta has been unveiling a slew of AI tools, the latest of which is AudioCraft, a framework that can generate audio and music from text prompts.
AudioCraft consists of three models: MusicGen, AudioGen and EnCodec. MusicGen, which generates music from text prompts, was trained on roughly 400,000 recordings along with their text descriptions and metadata, amounting to 20,000 hours of music owned by Meta or licensed specifically for this purpose. AudioGen, which was trained on public sound effects, generates ambient audio from text prompts.
Today, Meta released an improved version of the EnCodec decoder, which allows higher-quality music generation. Simultaneously, the company is launching its pre-trained AudioGen models, enabling users to create an array of ambient sounds and auditory effects such as a dog barking, a car horn, or footsteps on a wooden surface. Additionally, Meta is making the complete set of AudioCraft model weights and code accessible to the public.
These models will be open-sourced, allowing researchers and practitioners to train their own models with their own datasets. According to Meta, the AudioCraft family of models is capable of delivering high-quality audio while remaining user-friendly.
“We see the AudioCraft family of models as tools for musicians’ and sound designers’ professional toolboxes in that they can provide inspiration, help people quickly brainstorm, and iterate on their compositions in new ways,” Meta wrote in a blog post.
AudioCraft serves as a unified platform encompassing music, sound, compression, and generation, all within a single framework. Individuals aiming to build better sound generators, compression algorithms, or music generators can do so within the same code base, building upon the foundation laid by others in the field.
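As a sketch of what working in that code base looks like, the snippet below generates a short clip with MusicGen using the open-sourced `audiocraft` Python package. It assumes the package is installed (`pip install audiocraft`) and that the `facebook/musicgen-small` checkpoint name matches the public release; exact model names and defaults may differ across versions, so treat this as an illustrative example rather than official usage.

```python
# Illustrative sketch: text-to-music with AudioCraft's MusicGen.
# Assumes `audiocraft` is installed and downloads model weights on first run.
from audiocraft.models import MusicGen
from audiocraft.data.audio import audio_write

# Load a pre-trained checkpoint (name assumed from the public release).
model = MusicGen.get_pretrained("facebook/musicgen-small")
model.set_generation_params(duration=8)  # generate 8 seconds of audio

# One text prompt per clip to generate.
descriptions = ["lo-fi hip hop beat with mellow piano"]
wav = model.generate(descriptions)  # tensor of shape [batch, channels, samples]

for idx, one_wav in enumerate(wav):
    # Write each clip as a loudness-normalized WAV file.
    audio_write(f"sample_{idx}", one_wav.cpu(), model.sample_rate, strategy="loudness")
```

Because MusicGen, AudioGen, and EnCodec share this one framework, swapping in a different generator or compression model is largely a matter of loading a different pre-trained checkpoint in the same code.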
About The Author
Cindy is a journalist at Metaverse Post, covering topics related to web3, NFT, metaverse and AI, with a focus on interviews with Web3 industry players. She has spoken to over 30 C-level execs and counting, bringing their valuable insights to readers. Originally from Singapore, Cindy is now based in Tbilisi, Georgia. She holds a Bachelor's degree in Communications & Media Studies from the University of South Australia and has a decade of experience in journalism and writing. Get in touch with her via [email protected] with press pitches, announcements and interview opportunities.