December 1, 2016

AWS Reaches Out to the Edge in Vegas

Amazon Web Services scaled up its edge computing initiative with a series of new services unveiled at the AWS re:Invent conference in Las Vegas this week, including a new AWS Greengrass framework for off-network processing of IoT data, and an extension of the AWS Lambda service for edge computing.

You may know Amazon Web Services as the data center arm of retail behemoth Amazon (NASDAQ: AMZN) that rewrote the economic rules of cloud computing by building dozens of data centers around the world and installing between 2.8 million and 5.6 million servers in them, according to former EnterpriseTech editor Tim Prickett Morgan’s still-cogent 2014 analysis. (The number of servers AWS runs today is likely in the 6 million to 12 million range, if its server count is growing as quickly as revenues.)

To be sure, AWS makes billions by centralizing its customers’ disparate server farms onto its x86 computing infrastructure, called Elastic Compute Cloud (EC2), and doing the same for data with its Simple Storage Service (S3) object store. On top of this core, it makes many millions more by offering a variety of value-add services, such as relational databases (Amazon RDS), Hadoop processing (Elastic MapReduce), NoSQL storage and processing (DynamoDB), stream data processing (Kinesis), data warehousing (Redshift), and enterprise search (Elasticsearch Service)—not to mention machine learning and deep learning via the new Amazon AI initiative unveiled yesterday.

The common theme among these services is that they all run within the friendly confines of your local AWS data center (or non-local data centers if high availability is your thing). But now, the company is looking to extend this huge computing mesh out into the physical world.


AWS CEO Andy Jassy introduced AWS Greengrass on November 30

To that end, yesterday it unveiled AWS Greengrass, a new software framework that will be embedded into mobile devices to enable other AWS applications developed under the AWS Lambda rubric to take advantage of those devices’ processing capabilities. Intel (NASDAQ: INTC) and Qualcomm (NASDAQ: QCOM) have already signed on to embed Greengrass into their next-gen mobile chips, and others are likely to follow suit.

“This is effectively a software capability that we’ll embed in devices, and it allows you to have Lambda inside those devices, so you have more compute and flexibility in the compute, with triggers and events you can drive off it,” said AWS CEO Andy Jassy. “It has all kinds of encryption [and] local data caching so you can continue to store the state of the device for other applications to interact with. Then there’s a local messaging component.”

As Jassy explained, it’s all about giving developers the power to extend AWS functions out into the real world. “What they really want is the ability of these devices to have the same flexibility and programming model to do compute and analytics as they have in AWS,” he said during yesterday’s keynote address at AWS re:Invent. “It’s the same programming model. It’s very straightforward and simple.”

Potential use cases include equipping automated tractors with the local smarts to determine where to plant seeds, or giving smart home hubs the capability to function when there is no Internet connectivity. Smart home manufacturers “want to make sure that if they don’t have the right connection at any point to AWS, that people can still turn off lights, or turn heat on,” he said.
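To make the smart-home scenario concrete, here is a minimal sketch of what a Greengrass-style function might look like in JavaScript. The event shape, device names, and command set are illustrative assumptions, not a published AWS API; the point is that the same handler-style Lambda programming model runs locally on the hub, so commands keep working with no connection back to AWS.

```javascript
// Hypothetical Greengrass-style Lambda function for a smart-home hub.
// The event fields below ({ device, command }) are made up for
// illustration -- a real Greengrass function would use the AWS IoT
// message format and publish to a local messaging topic.
function handler(event, context, callback) {
  var allowed = ['on', 'off'];
  if (allowed.indexOf(event.command) === -1) {
    // Reject commands the hub does not understand.
    return callback(new Error('unsupported command: ' + event.command));
  }
  // Because this executes on the device itself, the light switches
  // even if the uplink to the AWS region is down.
  callback(null, { device: event.device, applied: event.command });
}
// In Lambda proper this would be exported as exports.handler.
```

Because the handler is triggered by local events and messaging rather than by a round trip to a region, the device stays responsive offline and syncs state back to AWS when connectivity returns.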

AWS Greengrass also plays a role in the company’s newly launched Snowball Edge, the suitcase-sized device that customers can use to upload up to 100TB of data into the AWS cloud. We briefly covered the Snowball Edge in our story about the Snowmobile, the new 45-foot trailer that AWS will park at customers’ data centers to slurp up 100PB of data at a time over fast fiber links.


Amazon CTO Werner Vogels introduced AWS Lambda@Edge on December 1

Both of these new edge storage and processing devices also rely heavily on AWS Lambda, the event-driven, serverless computing framework that AWS debuted in 2014 to help developers stitch together data- and event-driven applications that execute in the cloud. AWS expects developers to be writing a ton of JavaScript code to drive processing out of the data center and into the fields and homes of customers around the world.

To help coordinate all those local and remote services in a distributed manner, the company is giving coders some additional tools, including a new edge-based version of the AWS Lambda service.

Called AWS Lambda@Edge, the framework will handle much of the low-level plumbing required by today’s increasingly complex edge solutions, freeing IoT developers to focus on writing software without worrying about properly chaining multiple functions or connecting microservices without breaking things.

“Coordinating these very large scale Lambda applications is a pain,” Amazon CTO Werner Vogels said in today’s AWS re:Invent keynote. “How do you scale that? How do you do error handling? How do you keep the state?…How do you step through such an execution environment to make sure that everything executes reliably in those systems?”

Those problems are all addressed by AWS Lambda@Edge, which is available as a preview today. “You can do processing at the edge location without having to go back to your original source,” Vogels said.
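For a sense of what edge-side processing looks like in practice, here is a hedged sketch of a Lambda@Edge-style request handler. The event shape shown (`Records[0].cf.request`, with headers as arrays of key/value pairs) follows the CloudFront trigger format, but since the service is only in preview, treat the details as an assumption rather than the definitive API.

```javascript
// Sketch of a Lambda@Edge-style function that rewrites requests at a
// CloudFront edge location. The event structure is assumed from the
// CloudFront trigger format; preview-era details may differ.
function handler(event, context, callback) {
  var request = event.Records[0].cf.request;
  var uaHeader = request.headers['user-agent'];
  var ua = uaHeader ? uaHeader[0].value : '';
  // Serve a lighter page to mobile clients, decided entirely at the
  // edge location -- no round trip back to the origin region.
  if (/Mobile/.test(ua)) {
    request.uri = '/mobile' + request.uri;
  }
  callback(null, request);
}
```

Running this logic at the edge is what Vogels is describing: the request is inspected and rewritten where it arrives, rather than being forwarded to the original source first.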

Related Items:

Exabytes Hit the Road with AWS Snowmobile

Amazon Adds AI, SQL to Analytics Arsenal
