AWS Bolsters Its Generative AI Portfolio

The AWS offerings can enhance many enterprise and IT functions, from developing code to simplifying visual reporting.

Recent advancements in generative artificial intelligence (AI) have sparked widespread interest, but many businesses, from startups to enterprises, find it challenging to leverage the technology effectively. They want quick access to leading foundation models (FMs) to address a variety of business problems, but there is no one-size-fits-all model. Some models interpret and produce text, while others interpret text and generate images. Even within these categories, a model's efficacy varies depending on the use case.

Amazon Web Services (AWS) recently launched new services and enhancements to existing tools that make generative AI more accessible and help organizations build applications with it. A summary of these announcements follows, along with what they mean for businesses looking to adopt generative AI.

Amazon Bedrock: Building and Scaling AI Applications

Bedrock is a fully managed AWS service that is now generally available. It provides a selection of FMs from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon, using a single application programming interface (API). Think of it as a service that lets companies pick and choose different FMs to build AI apps without dealing with complicated setups or security risks. Amazon is offering a free digital course to help organizations learn how Bedrock works.
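To make the single-API idea concrete, here is a minimal sketch of invoking a Bedrock-hosted model with the AWS SDK for Python (boto3). The model identifier and request body follow Anthropic's Claude format and are illustrative assumptions; each provider's model expects its own body shape, so check the Bedrock documentation before relying on them.

```python
# Minimal sketch: calling a Bedrock-hosted model through the single
# invoke_model API with boto3. The model ID and body fields are
# illustrative assumptions; other providers' models expect different
# request shapes.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # assumed provider/model identifier
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": "\n\nHuman: Summarize our Q3 support tickets.\n\nAssistant:",
        "max_tokens_to_sample": 300,
    }),
)

print(json.loads(response["body"].read()))
```

Swapping in another provider's model is largely a matter of changing the model ID and the request body, which is the flexibility the single API is meant to provide.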

Finding the right AI models and putting them to work in applications can be a big hassle for organizations. Bedrock aims to make this process more straightforward, according to Vasi Philomin, vice president and general manager of generative AI at Amazon. It allows companies to experiment with different models and customize them privately with their own data, without writing any code. And because it is a managed service, organizations don't have to oversee any technical infrastructure; they can focus on integrating AI capabilities into their apps.

Bedrock is designed with privacy and security in mind, safeguarding sensitive data. Organizations can use AWS PrivateLink to create a secure connection between Bedrock and their virtual private cloud (VPC), ensuring no traffic is exposed to the public internet. Further, a company's fine-tuning data is not used to train the underlying base models, so other Bedrock customers, including competitors, can't benefit from that work. Additionally, Bedrock suits companies in highly regulated industries: it is HIPAA eligible and can be used in compliance with regulations such as GDPR.
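For teams that want to script that network isolation, a PrivateLink interface endpoint can be created with boto3 roughly as follows. The service name and the resource IDs are placeholders and assumptions; substitute your own region, VPC, subnet, and security-group values.

```python
# Minimal sketch: creating a PrivateLink interface endpoint so Bedrock
# traffic stays inside the VPC. All IDs are placeholders, and the
# service name is an assumption; verify it for your region.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

endpoint = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",                          # placeholder
    ServiceName="com.amazonaws.us-east-1.bedrock-runtime",  # assumed name
    SubnetIds=["subnet-0123456789abcdef0"],                 # placeholder
    SecurityGroupIds=["sg-0123456789abcdef0"],              # placeholder
    PrivateDnsEnabled=True,
)
print(endpoint["VpcEndpoint"]["VpcEndpointId"])
```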

Amazon Titan Embeddings and Llama 2: Expanding Model Selection

AWS is growing its selection of available Bedrock models, giving users more flexibility to find the AI models that suit their needs. One addition, Titan Embeddings, converts text into numerical representations, or embeddings. It is helpful for companies that want to power search and personalization tasks but may not have the resources to build such a model themselves. It's a versatile tool that supports more than 25 languages and can handle text of various lengths.

“The embeddings model takes a piece of text and then converts it into a high-dimensional space. When you have an incoming query, it’s able to do the search in that space, which leads to much better answers. And so the model can answer confidently about things that are changing because it knows how to use the embeddings when responding to questions that are outside its knowledge,” said Philomin.
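To illustrate the pattern Philomin describes, here is a minimal sketch of embeddings-based search, assuming the Titan Embeddings model ID and request fields commonly shown in AWS examples: embed a handful of documents and a query into the same vector space, then rank the documents by cosine similarity.

```python
# Minimal sketch of embeddings-based search: embed documents and a query
# into the same high-dimensional space, then rank by cosine similarity.
# The model ID and request/response fields are assumptions; verify them
# against the Bedrock documentation.
import json
import boto3
import numpy as np

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> np.ndarray:
    """Return the embedding vector for a piece of text."""
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",  # assumed model identifier
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"inputText": text}),
    )
    return np.array(json.loads(resp["body"].read())["embedding"])

docs = [
    "Return policy for enterprise customers",
    "Troubleshooting VPN connection drops",
    "Quarterly security audit checklist",
]
doc_vectors = [embed(d) for d in docs]

query = embed("How do I fix a flaky VPN?")
scores = [
    float(query @ v / (np.linalg.norm(query) * np.linalg.norm(v)))
    for v in doc_vectors
]
print(docs[int(np.argmax(scores))])  # best-matching document
```

In a retrieval-augmented setup, the top-ranked passages would then be passed to a text-generation model as context, which is how a system can answer confidently about information that falls outside the model's training data.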

Moreover, in the coming weeks, Bedrock will offer Llama 2, Meta’s improved large language model, which has been trained using 40 percent more data and can work with larger documents. This addition is ideal for companies building dialogue-based applications. Llama 2 provides quick responses and eliminates the need for organizations to manage any setup or infrastructure.
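Once Llama 2 is available in Bedrock, invoking it should follow the same invoke_model pattern. The sketch below assumes a hypothetical Llama 2 chat model identifier and the prompt and generation fields Meta's models typically use; treat both as placeholders until the Bedrock documentation confirms them.

```python
# Minimal sketch: a dialogue-style call to a Llama 2 chat model on
# Bedrock. The model ID and body/response fields are assumptions made
# for illustration only.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

resp = bedrock.invoke_model(
    modelId="meta.llama2-13b-chat-v1",  # hypothetical/assumed identifier
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": "[INST] Draft a friendly reply to a customer asking "
                  "about our refund policy. [/INST]",
        "max_gen_len": 256,
        "temperature": 0.5,
    }),
)
print(json.loads(resp["body"].read()).get("generation"))
```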

Amazon encourages customers to “explore and experiment with different models and to identify the ones that align with business needs,” said Philomin. Several customers have reported the ease of switching models in Bedrock and adjusting them to their unique use cases.

Amazon CodeWhisperer: Customizing Coding Suggestions

CodeWhisperer has a new feature that allows developers to personalize its suggestions using their organization's private codebase. The AI-powered coding companion, trained on Amazon's own code and publicly available code, improves developer productivity by offering relevant suggestions for coding tasks. The customization capability can be especially helpful when working with internal code that lacks extensive documentation or public resources.

To use this feature, an administrator connects CodeWhisperer to the organization's private code repository and schedules a job to create the customization. Using a mix of model- and context-customization techniques, CodeWhisperer learns from the repository and sharpens its real-time code suggestions. As a result, developers can spend less time searching for answers to common problems and more time creating new experiences. Administrators manage all customizations centrally from the AWS Console, where they can view evaluation metrics, estimate how each customization will perform, and deploy it selectively to specific developers across the company.

Additionally, this feature is built with enterprise-grade security and privacy in mind. The customizations are kept private, and the underlying FM powering CodeWhisperer does not use the customizations for training, thereby protecting customers’ intellectual property. The new feature will be available soon to customers in preview as a part of the new CodeWhisperer Enterprise Tier.

“This will unlock new levels of developer productivity in their organization because the generic CodeWhisperer can never see the internal libraries, the APIs, and the best practices of an organization in a safe and secure way,” said Philomin.

Amazon QuickSight: Simplifying Visual Reporting with AI

Amazon is introducing new generative business intelligence (BI) authoring capabilities in QuickSight, which are set to simplify the creation and customization of visuals for business analysts using natural language commands. Analysts can quickly create customizable visuals from fragments of questions and clarify the intent of a query by asking follow-up questions. The tool is particularly adept at refining visualizations and executing complex calculations.

Previously, to create a single chart, an analyst had to meticulously look for a correct data source, identify the data fields, set up filters, and make customizations. Now, analysts can describe the desired outcome, and QuickSight generates visually compelling and precise charts that can be added to a dashboard or report with a single click. Once the initial visualization is generated, analysts can add complex calculations, change chart types, and refine visuals using natural language prompts.

Customer Example: AI-based Knowledge Mining and Improved Services

Pharmaceutical companies such as Merck produce vast amounts of internal knowledge and documentation, and mining it has traditionally been a manual, time-intensive process. The biopharmaceutical company is an example of an AWS customer that relies on Bedrock for knowledge mining and market research, which helps Merck better understand patients' needs, improve their health, and reach more people. The company also ensures that AI is used responsibly, especially when accessing patient records and other sensitive information.

“Every company has internal knowledge and wants to enable its employees to find that knowledge very quickly. Then, there are contact centers with agents who need to know how to respond to customer issues. There's a pattern where getting the knowledge leads to being able to assist someone. That’s the case across multiple kinds of verticals,” said Philomin.

Another example is the PGA Tour, which is using Bedrock to reinvent how golf fans interact with the sport. The professional golf organization is also developing a platform that taps into Bedrock to show how players are performing and to suggest strategy adjustments based on each player's abilities and the day's course conditions.

The Takeaway

These announcements from AWS mark a pivotal step towards democratizing generative AI, allowing companies, regardless of their scale, to implement advanced models and develop applications that align with their needs.

Zeus Kerravala is the founder and principal analyst with ZK Research.

About the Author

Zeus Kerravala, Founder and Principal Analyst with ZK Research

Zeus Kerravala is the founder and principal analyst with ZK Research. He spent 10 years at Yankee Group and prior to that held a number of corporate IT positions. Kerravala is considered one of the top 10 IT analysts in the world by Apollo Research, which evaluated 3,960 technology analysts and their individual press coverage metrics.
