AWS Announces Powerful New Offerings to Accelerate Generative AI Innovation
While the seeds of the machine learning paradigm shift were planted decades ago, only recent advances in ML technologies have enabled organizations to transform their businesses. These technologies power generative AI applications, which have captured the public imagination; organizations are now looking for innovative ways to use them to reimagine how they solve problems and to create new user experiences.
Amazon Web Services (AWS), one of the world's leading cloud platforms, announced generative AI innovations designed to help organizations build new AI applications, enhance employee productivity, and transform their businesses, all while simplifying development and maintaining security.
A key component of the announcement is the general availability of Amazon Bedrock, which was introduced in April 2023. Amazon Bedrock is a fully managed service that makes pre-trained foundation models from leading AI companies accessible through a single API. It offers serverless integration with AWS capabilities, allowing organizations to find the right tools and customize them according to their needs.
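To make the "single API" idea concrete, the sketch below builds a request body for a Titan text-generation call. The field names (`inputText`, `textGenerationConfig`) follow Titan's documented schema, but treat the exact parameters and the `amazon.titan-text-express-v1` model ID as assumptions that should be checked against the current Bedrock documentation; the boto3 call is shown only in comments because it requires AWS credentials.

```python
import json

def build_titan_text_request(prompt, max_tokens=256, temperature=0.5):
    """Build the JSON request body for a Titan text-generation call.

    Field names follow Titan's request schema at the time of writing;
    other Bedrock models (Llama 2, Claude, etc.) use different schemas
    even though they share the same invoke_model API.
    """
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": temperature,
        },
    })

# With AWS credentials configured, the same body would be sent through
# the boto3 "bedrock-runtime" client (sketch, not executed here):
#
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(
#       modelId="amazon.titan-text-express-v1",  # assumed model ID
#       body=build_titan_text_request("Summarize our Q3 results."),
#   )
#   print(json.loads(response["body"].read()))

if __name__ == "__main__":
    body = json.loads(build_titan_text_request("Hello"))
    print(body["inputText"])
```

Because the service is serverless, there is no endpoint or cluster to provision; swapping providers is mostly a matter of changing the model ID and request schema.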
Swami Sivasubramanian, vice president of Data and AI at AWS, shared his views on the announcements, “With enterprise-grade security and privacy, a choice of leading FMs, a data-first approach, and our high-performance, cost-effective infrastructure, organizations trust AWS to power their businesses with generative AI solutions at every layer of the stack. Today’s announcement is a major milestone that puts generative AI at the fingertips of every business, from startups to enterprises, and every employee, from developers to data analysts. With powerful, new innovations, AWS is bringing greater security, choice, and performance to customers, while also helping them to tightly align their data strategy across their organization, so they can make the most of the transformative potential of generative AI.”
AWS also announced that Amazon Titan embedding models are now generally available and that Meta's Llama 2 will also be available as a new model. This makes Amazon Bedrock the first platform to offer a fully managed service for accessing Llama 2 through an API. The Titan foundation models include a large natural language model for text generation tasks and an embedding model to help with personalized recommendations. Combined with the other leading foundation models on the platform, these further broaden its capabilities.
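The recommendation use case mentioned above typically works by comparing embedding vectors with a similarity metric. The toy example below (pure Python, no AWS calls) ranks catalog items against a query by cosine similarity; the three-dimensional vectors are made up for illustration, standing in for the much longer vectors a model such as Titan Embeddings would return.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors stand in for real embeddings, which have hundreds of
# dimensions; items and values here are illustrative only.
query = [0.9, 0.1, 0.0]
catalog = {
    "hiking boots": [0.8, 0.2, 0.1],
    "coffee maker": [0.0, 0.1, 0.9],
}

# Recommend the catalog item whose embedding is closest to the query.
best = max(catalog, key=lambda item: cosine_similarity(query, catalog[item]))
print(best)  # -> hiking boots
```

In practice, an application would embed its catalog once, store the vectors, and embed each user query at request time before running this kind of nearest-neighbor comparison.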
The goal of having multiple foundation models available is to allow customers to discover what works best for them. Amazon Bedrock will include foundation models from Cohere, Meta, AI21 Labs, Stability AI, and Anthropic, all via a single API.
In the announcement, AWS also introduced a new capability for Amazon CodeWhisperer, AWS's AI-powered coding companion. Developers can now customize CodeWhisperer's suggestions based on their existing codebase, helping to boost developer productivity. A professional tier of CodeWhisperer is available for business users who need advanced features such as access management integration and single sign-on (SSO) with AWS identity services.
AWS is also releasing a preview of Generative Business Intelligence (BI) capabilities to help boost business analytics productivity. This new feature brings natural-language authoring to Amazon QuickSight, a cloud-powered business analytics service, allowing users to create visuals and perform calculations simply by describing what they need.
With the announcement of several powerful new offerings to accelerate AI innovation, AWS further reinforces its position as one of the leading platforms for hosting generative AI applications. Amazon has already invested heavily in generative AI solutions; earlier this year, it announced a $100 million program, the AWS Generative AI Innovation Center, to help customers build and deploy AI solutions. With the generative AI market expected to continue growing, there are endless possibilities for what this technology can do.