AWS Advances ‘Broadest And Most Complete’ AI Strategy

At the re:Invent conference, AWS CEO Andy Jassy unveiled a plethora of services across the stack to make it easier for customers of all stripes to build AI into their business applications, looking to differentiate its cloud’s ML offerings with breadth and variety.

Amazon Web Services is looking to differentiate its artificial intelligence offerings in a highly competitive market through sheer breadth and diversity, as the cloud giant doesn't see a single product or platform satisfying the needs of most clients.

To that end, the cloud leader's CEO, Andy Jassy, dropped several new AI development platform features and machine learning-infused tools on Tuesday to round out the portfolio in a way that gives far more customers access to the powerful technology at all levels of the stack.

"Machine learning is not a single service," Jassy said in his keynote at the AWS re:Invent conference. That's why AWS wants to deliver the "broadest and most complete set of machine learning capabilities."

[Related: AWS CEO Andy Jassy’s ‘Very Ambitious Plan’ To Keep AWS On Top]

Amazon's focus on variety is the reason twice as many companies use AWS for machine learning as any other cloud provider, he said.

"Developers and data scientists and companies are so passionate and so excited to get value out of their data that they're willing to work with clunky tools," Jassy said.

The Amazon strategy is to make it much, much easier to leverage that technology—meeting customers where their capabilities stand by extending the already comprehensive portfolio.

At the bottom layer of the stack, where data scientists and experts do their work, AWS is introducing powerful instances in its P3 family. Those innovations have led to performance that bests previous benchmarks set by a "company in Mountain View," he said, not directly naming rival Google.

But unlike most other cloud providers, AWS doesn't want to funnel everybody through a single ML framework.

"We have a team that does nothing but work on optimizing TensorFlow performance on AWS," he said.

But talks with developers and data scientists have led AWS to conclude 90 percent of them use multiple frameworks beyond TensorFlow, most notably PyTorch and MXNet. AWS doesn't want to force those customers to port all their work to TensorFlow.

Moving up the stack, Amazon offers SageMaker, a product Jassy said has brought a sea change in how developers build, train and deploy ML models.

That automated platform has been rapidly evolving, with more than 50 capabilities added in the last year, such as SageMaker Ground Truth and the AWS ML Marketplace.

But Jassy conceded that "all the work between SageMaker steps is still a lot harder than we wish." There has never been an end-to-end integrated development environment for machine learning.

Amazon SageMaker Studio, which he introduced Tuesday, changes that with a fully integrated environment for developing machine learning.

"It's a giant leap forward," Jassy said of the new development environment.

SageMaker Studio is a web-based IDE that can store code, notebooks and datasets and make them all accessible from a single pane of glass, making it easier to manage all the pieces of building an ML model.

Jassy also introduced Amazon SageMaker Notebooks, which can spin up a Jupyter notebook with a click, as well as Amazon SageMaker Experiments, which can automatically organize and search every step involved in building, training and tuning models.
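
As a rough sketch of how that organization works, the boto3 SageMaker client exposes Experiments through calls such as create_experiment and create_trial; the experiment and trial names below are purely illustrative, not details from the keynote.

```python
# A minimal sketch of grouping training runs with SageMaker Experiments via boto3.
import boto3

sm = boto3.client("sagemaker")

# Create an experiment to collect related training runs.
sm.create_experiment(
    ExperimentName="churn-model-experiments",      # hypothetical name
    Description="Track training runs for the churn model",
)

# Each training or tuning run becomes a trial under the experiment.
sm.create_trial(
    TrialName="churn-xgboost-run-1",               # hypothetical name
    ExperimentName="churn-model-experiments",
)

# Later, list the trials to review and compare past runs.
for trial in sm.list_trials(ExperimentName="churn-model-experiments")["TrialSummaries"]:
    print(trial["TrialName"], trial["CreationTime"])
```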

One customer frustration in training machine learning models is their black-box nature when built on an automated platform.

Amazon SageMaker Debugger is a new service that lends visibility into that process, emitting metrics about a model during training that developers can monitor across all the popular frameworks.
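
As an illustration of the kind of hook that implies, the SageMaker Python SDK lets built-in Debugger rules be attached to a training job; the training script, IAM role and instance settings below are placeholders, not details from the keynote.

```python
# A hedged sketch of attaching SageMaker Debugger rules to a training job.
from sagemaker.debugger import Rule, rule_configs
from sagemaker.tensorflow import TensorFlow

estimator = TensorFlow(
    entry_point="train.py",                                   # hypothetical training script
    role="arn:aws:iam::123456789012:role/SageMakerRole",      # placeholder IAM role
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    framework_version="2.1",
    py_version="py3",
    # Built-in rules watch tensors emitted during training and flag common problems.
    rules=[
        Rule.sagemaker(rule_configs.loss_not_decreasing()),
        Rule.sagemaker(rule_configs.vanishing_gradient()),
    ],
)

estimator.fit("s3://my-bucket/training-data/")                # hypothetical S3 path
```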

Another problem with AI is concept drift: over time, real-world data diverges from the data a model was trained on, dragging down the performance of AI systems.

To remedy that, Jassy unveiled Amazon SageMaker Model Monitor, a service that creates baseline statistics when a model is trained, then analyzes every prediction going forward against the data used in training, allowing tweaks that restore the performance of systems that have lost efficacy.
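
A minimal sketch of that workflow with the SageMaker Python SDK, assuming a deployed endpoint with data capture enabled, might look like the following; every name and S3 path is a placeholder.

```python
# Baseline the training data, then schedule drift checks against a live endpoint.
from sagemaker.model_monitor import DefaultModelMonitor, CronExpressionGenerator
from sagemaker.model_monitor.dataset_format import DatasetFormat

monitor = DefaultModelMonitor(
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

# Step 1: compute baseline statistics and constraints from the training data.
monitor.suggest_baseline(
    baseline_dataset="s3://my-bucket/train/train.csv",     # hypothetical training data
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://my-bucket/monitor/baseline/",
)

# Step 2: compare captured endpoint traffic against the baseline on a schedule.
monitor.create_monitoring_schedule(
    monitor_schedule_name="churn-model-drift-check",       # hypothetical schedule name
    endpoint_input="churn-model-endpoint",                  # hypothetical endpoint
    output_s3_uri="s3://my-bucket/monitor/reports/",
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    schedule_cron_expression=CronExpressionGenerator.hourly(),
)
```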

SageMaker Autopilot is another upgrade to the automated ML service that further automates training with no loss of visibility or control. It describes model performance within SageMaker Studio, and passes to customers a notebook "with the recipe" for that model.
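
For a sense of how little code that automation implies, here is a hedged sketch using the SageMaker Python SDK's AutoML class; the dataset path, target column and role are invented for illustration.

```python
# A hedged sketch of launching a SageMaker Autopilot job.
from sagemaker.automl.automl import AutoML

automl = AutoML(
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder IAM role
    target_attribute_name="churned",                       # hypothetical label column
    max_candidates=10,                                      # cap the candidate models explored
)

# Autopilot profiles the data and tries candidate pipelines, keeping the
# generated notebooks so each candidate's "recipe" stays inspectable.
automl.fit(inputs="s3://my-bucket/train/train.csv")        # blocks until the job completes

# Review the best candidate by the job's objective metric.
print(automl.best_candidate()["CandidateName"])
```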

For partners and customers who just want to use pre-trained tools at the top of the ML stack, AWS offers a number of services, such as Polly, Rekognition, Transcribe and Translate.

Jassy expanded that lineup by introducing a fraud detection system powered by machine learning.

Amazon Fraud Detector learns from customer data, both fraudulent and legitimate. It then takes advantage of algorithms Amazon built internally over the last 20 years to guard its consumer business, and builds a model specific to the customer, exposed through APIs. The customer sends activity data back to Amazon, which runs the model and returns a fraud score.
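
That request-and-score loop maps roughly onto the GetEventPrediction API; the sketch below uses boto3 with a hypothetical detector, event type and event variables, and assumes a detector has already been trained and published in the account.

```python
# A hedged sketch of requesting a fraud score from Amazon Fraud Detector.
from datetime import datetime, timezone

import boto3

fd = boto3.client("frauddetector")

response = fd.get_event_prediction(
    detectorId="sample_signup_detector",             # hypothetical detector
    eventId="802454d3-f7d8-482d-97e8-c4b6db9a0428",  # unique ID for this event
    eventTypeName="sample_registration",             # hypothetical event type
    eventTimestamp=datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
    entities=[{"entityType": "customer", "entityId": "12345"}],
    eventVariables={                                  # activity data sent to Amazon
        "email_address": "jdoe@example.com",
        "ip_address": "192.0.2.10",
    },
)

# The service runs the customer-specific model and returns fraud scores
# plus the outcomes of any rules attached to the detector.
print(response["modelScores"])
print(response["ruleResults"])
```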

Another AI-powered technology built internally at Amazon that's being exposed to customers is Amazon CodeGuru, a machine learning service to automate code reviews and identify the "most expensive" lines of code.

The first CodeGuru component performs code review based on millions of models Amazon has built for itself. Where there's a problem, it returns an assessment as "human readable comments" highlighting problematic lines of code.

The second component uses machine learning to pinpoint the "expensive" snippets of code that drive up latency and CPU utilization, Jassy said.
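
To give a flavor of how those findings come back to developers, the sketch below uses boto3 to pull recommendations from an existing CodeGuru Reviewer code review; the code review ARN is a placeholder.

```python
# A minimal sketch, assuming a completed CodeGuru Reviewer code review,
# of retrieving its human-readable recommendations.
import boto3

reviewer = boto3.client("codeguru-reviewer")

recommendations = reviewer.list_recommendations(
    CodeReviewArn="arn:aws:codeguru-reviewer:us-east-1:123456789012:"
                  "association:example/code-review:example-review",  # placeholder ARN
)

# Each recommendation points at the file and lines it flags, with a
# plain-language explanation of the suspected problem.
for rec in recommendations["RecommendationSummaries"]:
    print(f'{rec["FilePath"]}:{rec["StartLine"]}-{rec["EndLine"]}')
    print(f'  {rec["Description"]}')
```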

Amazon Connect, a call center service also built on internal Amazon technology, has been one of Amazon's fastest-growing services, Jassy said.

That product is also being boosted with AI through the launch of Contact Lens for Amazon Connect, delivering machine learning-powered analytics for the contact center.

And finally, Jassy introduced Amazon Kendra, "a new service that reinvents enterprise search with machine learning and natural language processing."

Available in preview, Kendra indexes data sources across the enterprise environment and applies natural language processing to deliver more efficient and more useful queries.
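
In practice, querying an index reduces to a single call; the sketch below assumes a Kendra index already connected to the enterprise's data sources, with a placeholder index ID and an invented question.

```python
# A hedged sketch of issuing a natural-language query against a Kendra index.
import boto3

kendra = boto3.client("kendra")

response = kendra.query(
    IndexId="12345678-1234-1234-1234-123456789012",   # placeholder index ID
    QueryText="How do I submit a travel expense report?",
)

# Kendra returns ranked results, including suggested answers extracted
# from documents rather than just keyword matches.
for item in response["ResultItems"]:
    title = item.get("DocumentTitle", {}).get("Text", "")
    excerpt = item.get("DocumentExcerpt", {}).get("Text", "")
    print(item["Type"], "-", title)
    print(" ", excerpt.strip())
```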

After Jassy's keynote, AWS partners specializing in AI were busy digesting the information to identify what new innovations would directly benefit their practices.

Richard Potter, CEO of Peak, a software and services provider based in Manchester, England, that focuses exclusively on AI, told CRN: "They see the market as we see the market."

"Amazon's strategy is the right one, helping people with the building blocks, because those building blocks are going to be used differently for most enterprises. Those building blocks are great, but to do something with them, they need to be brought together in a full stack, and that's what we're doing with our product," Potter said of his company's SaaS applications and services, built on the AWS platform, that allow customers from retail to financial services to adopt AI without hiring teams of data scientists.

Previously, the automated ML capabilities of SageMaker had been too limited for Peak to routinely take advantage of, Potter said. But Peak engineers are eager to test out new SageMaker features, especially the code debugger, to see if they can drive further development in its unique enterprise AI system.

"A lot of these technologies Amazon is building can slot in to our products and automate more tasks," he said. "That’s one of the compelling advantages of the Amazon cloud. It’s the breadth of services that cover everything you can imagine

Jonathan Bauer, principal for Deloitte Consulting and leader of the SI giant's AWS alliance, said the "whole ML story" is to "enhance the end-user experience; getting at information we couldn't previously get at."

SageMaker in particular is "trying to make this concept much easier for the users to take advantage of," Bauer told CRN, "and I think that's a significant move."

While some new ML features are for a limited subset of users, they are extremely valuable to those specialists.

For example, consider Transcribe Medical, a service revealed in the lead-up to re:Invent that tailors speech-to-text capabilities to the medical community.
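
A hedged sketch of starting such a transcription job with boto3 follows; the bucket, audio file and job name are placeholders for illustration only.

```python
# A rough sketch of kicking off an Amazon Transcribe Medical job.
import boto3

transcribe = boto3.client("transcribe")

transcribe.start_medical_transcription_job(
    MedicalTranscriptionJobName="clinic-visit-demo",           # hypothetical job name
    LanguageCode="en-US",
    MediaFormat="wav",
    Media={"MediaFileUri": "s3://my-bucket/audio/visit.wav"},  # hypothetical audio file
    OutputBucketName="my-bucket",                              # where the transcript lands
    Specialty="PRIMARYCARE",
    Type="CONVERSATION",                                       # clinician-patient dialogue
)
```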

"To our ConvergeHEALTH team, they are all over it," Bauer said, "even though most of the room doesn't care."