MongoDB Integrates Atlas Vector Search with AWS’ Amazon Bedrock to Boost Generative AI Models
MongoDB’s Atlas Vector Search integration with AWS’ Amazon Bedrock aims to accelerate development of applications powered by generative AI.
At AWS re:Invent 2023, database company MongoDB announced plans to integrate MongoDB Atlas Vector Search with Amazon Bedrock to support the development of AI applications on Amazon Web Services (AWS) and its cloud infrastructure.
The collaboration aims to streamline the incorporation of generative AI and semantic search capabilities, enhancing user experiences and engagement.
MongoDB Atlas Vector Search utilizes operational data to integrate generative AI into applications, delivering customized end-user experiences. The integration with Amazon Bedrock is poised to empower developers, simplifying the creation of AWS applications utilizing generative AI for diverse use cases.
It will enable applications to provide up-to-date responses based on proprietary data processed by MongoDB Atlas Vector Search.
Unlike add-on solutions that solely store vector data, MongoDB Atlas Vector Search functions as a performant and scalable vector database. It integrates with a globally distributed operational database capable of storing and processing an organization’s entire dataset.
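In practice, a vector search against Atlas runs as an ordinary aggregation pipeline using the `$vectorSearch` stage, so it sits alongside the rest of an application's operational queries. A minimal sketch of building such a pipeline (the index name, field path, and query vector here are hypothetical placeholders):

```python
def build_vector_search_pipeline(index_name, query_vector, path="embedding",
                                 num_candidates=100, limit=5):
    """Build a MongoDB aggregation pipeline using the $vectorSearch stage."""
    return [
        {
            "$vectorSearch": {
                "index": index_name,          # Atlas Vector Search index name
                "path": path,                 # document field holding the vector
                "queryVector": query_vector,  # embedding of the user's query
                "numCandidates": num_candidates,
                "limit": limit,
            }
        },
        # Keep the matched text plus the similarity score for each result.
        {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

# Against a live Atlas cluster this would run via pymongo, e.g.:
#   results = db.docs.aggregate(build_vector_search_pipeline("my_index", qvec))
pipeline = build_vector_search_pipeline("my_index", [0.12, -0.07, 0.33])
```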
Through the integration with Amazon Bedrock, customers gain the ability to customize foundation models (FMs) privately, collaborating with AI21 Labs, Amazon, Anthropic, Cohere, Meta and Stability AI. The process involves incorporating proprietary data, converting it into vector embeddings, and leveraging MongoDB Atlas Vector Search to process these embeddings.
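The "convert proprietary data into vector embeddings" step might look like the sketch below, assuming Amazon's Titan text-embedding model on Bedrock is used; the model ID, sample text, and collection name are illustrative assumptions, and the actual Bedrock call requires AWS credentials, so it is shown in comments:

```python
import json

def titan_embedding_request(text):
    """Build the JSON request body for an Amazon Titan text-embedding call."""
    return json.dumps({"inputText": text})

# With AWS credentials configured, the call would look roughly like:
#   import boto3
#   bedrock = boto3.client("bedrock-runtime")
#   resp = bedrock.invoke_model(
#       modelId="amazon.titan-embed-text-v1",  # assumed embedding model
#       body=titan_embedding_request("Return policy for in-stock apparel"),
#   )
#   embedding = json.loads(resp["body"].read())["embedding"]
# The resulting vector would then be stored next to the source document:
#   db.docs.insert_one({"text": "...", "embedding": embedding})
body = titan_embedding_request("Return policy for in-stock apparel")
```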
“While MongoDB Atlas Vector Search can work with many types of foundation models (FMs) from providers like OpenAI, Hugging Face, Microsoft Azure, Google Cloud, Anthropic and others — Amazon Bedrock provides a choice of high-performing, managed FMs that developers can use to convert proprietary data (images, text, video, etc.) into vectors so FMs like large language models can process them and provide responses to end-user requests,” Andrew Davidson, SVP of Product at MongoDB told Metaverse Post.
Turbocharging Generative AI Apps with Vector Search
MongoDB said the resulting applications, using retrieval augmented generation (RAG) via Agents for Amazon Bedrock, will be able to respond to users' queries with relevant, contextualized information without manual coding.
“Retrieval augmented generation (RAG) is now a common architecture pattern where organizations can provide proprietary data to foundation models (FMs) to customize responses to end-user requests so they are more personalized, accurate, and relevant,” MongoDB’s Davidson told Metaverse Post. “This reduces so-called hallucinations that FMs can be prone to and provides end users more trustworthy responses.”
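The core of the RAG pattern Davidson describes is simple: documents retrieved by vector search are folded into the prompt sent to the foundation model, so the FM answers from the organization's own data rather than only its training set. A minimal sketch of that prompt-assembly step (the prompt template is an illustrative assumption, not MongoDB's or AWS's implementation):

```python
def build_rag_prompt(question, retrieved_docs):
    """Compose an FM prompt that grounds the answer in retrieved context.

    retrieved_docs would come from a vector search over the organization's
    proprietary data; here they are plain strings for simplicity.
    """
    context = "\n".join(f"- {doc}" for doc in retrieved_docs)
    return (
        "Answer the question using only the context below. "
        "If the context does not contain the answer, say so.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\n"
        "Answer:"
    )

prompt = build_rag_prompt(
    "Can I exchange a jacket bought last week?",
    ["Exchanges are accepted within 30 days with a receipt.",
     "In-stock items ship within 2 business days."],
)
```

Grounding the FM in retrieved context this way is what reduces the hallucinations Davidson refers to: the model is instructed to answer from supplied facts rather than invent them.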
For example, a retail apparel organization could develop a generative AI application to automate tasks such as processing real-time inventory requests or personalizing customer returns and exchanges by suggesting similar in-stock merchandise.
“By providing foundation models (FMs) with context from an organization’s proprietary data processed by MongoDB Atlas Vector Search, end users can receive more personalized and accurate responses to their requests,” MongoDB’s Davidson told Metaverse Post. “Because vectors are stored alongside metadata, operational data, time series data, geospatial data, and other types of data, Atlas Vector Search can perform more complex queries than bolt-on vector databases via a single API and query language.”
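Because vectors live in the same documents as operational metadata, a single `$vectorSearch` stage can combine semantic similarity with a metadata pre-filter, e.g. restricting the retail example above to in-stock items in one category. A sketch, assuming the filter fields are declared in the vector index definition (field names and values are hypothetical):

```python
def filtered_vector_search(index_name, query_vector, category, in_stock=True):
    """$vectorSearch combined with a metadata pre-filter in one pipeline."""
    return [
        {
            "$vectorSearch": {
                "index": index_name,
                "path": "embedding",
                "queryVector": query_vector,
                "numCandidates": 200,
                "limit": 10,
                # Filter on operational fields stored in the same document;
                # these fields must be indexed as filter fields in Atlas.
                "filter": {
                    "category": {"$eq": category},
                    "in_stock": {"$eq": in_stock},
                },
            }
        }
    ]

pipeline = filtered_vector_search("products_index", [0.2, 0.8], "jackets")
```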
Organizations would also be able to deploy MongoDB Atlas across major cloud providers simultaneously for maximum availability and reliability, along with security and data privacy controls — critical for customers, especially those in regulated industries.
“Developers can easily evolve the data model rather than redesigning an entire data schema, which can take months of work and hold up deployment of new application features, including those that use generative AI, without having to worry about the expense of continually moving to larger and larger database clusters,” MongoDB’s Davidson added.
The integration of MongoDB Atlas Vector Search with Amazon Bedrock is expected to be available on AWS in the coming months.