Summarising all of the interesting Serverless offerings from AWS would take a book rather than a blog post, but here goes! fourTheorem spent a fruitful day at the AWS Loft in NYC hearing about all the cool new features being made available to customers.
Chris Munns kicked the day off with an overview of AWS Lambda and Serverless concepts. Chris is a New Yorker and a self-professed internet infrastructure geek who began working at AWS as a DevOps Business Development Manager, moving on to Senior Developer Advocate on the Serverless team.
Serverless architecture is a product of cloud computing. It is an event-driven application design and deployment paradigm in which computing resources are provided as scalable cloud services. With Serverless, you don’t have to deploy any virtual or physical servers, any containers, or any operating system. The cloud platform provider handles all this under the covers. Not having to fret about infrastructure frees us up to focus on coding our service(s) as a series of standalone functions triggered and run only when required.
AWS Lambda is a compute service that lets you run code without provisioning or managing servers. AWS Lambda executes your code only when needed and scales automatically, from a few requests per day to thousands per second. Lambda takes an event source, such as an API endpoint request or a change in a resource's state, and triggers one or more functions. The Lambda function, in turn, outputs to other services, which can be both internal and external to AWS. Lambda is a serverless Function-as-a-Service (FaaS) offering; FaaS is a sub-category of Serverless computing, which in turn is a sub-category of event-driven computing.
The languages currently supported by Lambda are Node.js, Java, Python, C#, and Go. Lambda offers flexibility: it supports both synchronous and asynchronous invocation, integrates seamlessly with other AWS services, and provides fine-grained authorization control. Functions are stateless and persist data using external storage.
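To make the model concrete, here is a minimal Python Lambda handler (the function and event field names are our own, purely for illustration). A handler is just a function that receives the triggering event and a runtime context:

```python
import json

def handler(event, context):
    # Lambda passes the triggering event (e.g. an API Gateway request body)
    # and a context object carrying runtime metadata.
    name = event.get("name", "world")
    return {"message": f"Hello, {name}!"}

# Invoked locally for illustration; in AWS, the Lambda runtime calls handler().
print(json.dumps(handler({"name": "NYC"}, None)))
```

Because the handler is a plain function, it can be invoked and unit-tested locally before ever being deployed.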
Eric Jonas, a postdoctoral researcher at the University of California, Berkeley's AMPLab, has created a software package called PyWren to help researchers with their investigations. PyWren provides the ability to parcel out Python-based scientific workloads across many different Lambda instances, in effect creating a giant, if extremely temporary, computing cluster.
The PyWren project has benchmarked 60-80 GB/s across 2800 Lambda functions, as well as 25TFLOPS (25 trillion floating-point operations per second) from a fleet of AWS Lambdas.
Waiting for a response from another service, for example, can create orchestration problems. To keep orchestration out of your Lambda code, AWS has developed the Step Functions service. This allows you to decompose your business workflow into separate steps, automatically triggering and tracking each step, retrying on error, and allowing an application to execute in a defined order. It also logs the state of each step so you can see when things go wrong. It is Serverless workflow management with zero admin.
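Step Functions workflows are described in the Amazon States Language (ASL), a JSON format. As a rough sketch, the two-step workflow below is built as a Python dict (the step names and function ARNs are placeholders, not from the talk), showing how a task declares its retry policy and the next step in the chain:

```python
import json

# A hypothetical two-step workflow in Amazon States Language (ASL),
# built as a Python dict for illustration; the ARNs are placeholders.
state_machine = {
    "Comment": "Decompose a business workflow into separate steps",
    "StartAt": "ProcessOrder",
    "States": {
        "ProcessOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:ProcessOrder",
            # Retry on error with exponential backoff, as described above.
            "Retry": [{
                "ErrorEquals": ["States.TaskFailed"],
                "IntervalSeconds": 2,
                "MaxAttempts": 3,
                "BackoffRate": 2.0,
            }],
            "Next": "NotifyCustomer",
        },
        "NotifyCustomer": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:NotifyCustomer",
            "End": True,
        },
    },
}

print(json.dumps(state_machine, indent=2))
```

The JSON produced here is what you would supply as the state machine definition when creating the workflow.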
“Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. With a few clicks in the AWS Management Console, you can create an API that acts as a “front door” for applications to access data, business logic, or functionality from your back-end services. Amazon API Gateway handles all the tasks involved in accepting and processing up to hundreds of thousands of concurrent API calls, including traffic management, authorization and access control, monitoring, and API version management. Amazon API Gateway has no minimum fees or startup costs. You pay only for the API calls you receive and the amount of data transferred out.”
AWS offers a myriad of managed services that wrap around Lambda. Unless building complex authentication systems is your thing and is where your business derives value (aka revenue), why build such a system yourself? Why set up, maintain, and monitor database servers when there are plenty of cloud managed services out there? Why build out an API from scratch when you can create one with Amazon API Gateway?
Amazon's API Gateway creates a unified API front end for multiple microservices. There are lots of third-party API gateway options to choose from, but it doesn't make much sense to build and maintain these yourself within the AWS environment. Why not save yourself the hassle and build on top of Amazon's hard work?
All this is great but you’re probably asking yourself “Where do I start?” There are many serverless frameworks you could use to get yourself going, including the following recommended examples:
The AWS Serverless Application Repository enables you to quickly deploy code samples, components, and complete applications for common use cases such as web and mobile back-ends, event and data processing, logging, monitoring, IoT, and more. Each application is packaged with an AWS Serverless Application Model (SAM) template that defines the AWS resources required. SAM is a new serverless resource type that sits on top of CloudFormation and supports anything that CloudFormation supports. The SAM specification is open source under the Apache 2.0 licence.
Next up was Ronald Widha (Partner Solutions Manager) who led a session digging into the development process for serverless applications. He gave some useful hints and tips on the practicalities around developing for this platform.
Lambda functions accept environment variables, which are useful when deploying to different environments (e.g. dev, testing, production).
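As a small sketch of this, a function can read its per-environment configuration from environment variables set in the function's configuration (the `APP_STAGE` and `TABLE_NAME` variable names below are hypothetical):

```python
import os

# APP_STAGE and TABLE_NAME are hypothetical variables set per deployment
# (dev, testing, production) in the Lambda function's configuration.
APP_STAGE = os.environ.get("APP_STAGE", "dev")
TABLE_NAME = os.environ.get("TABLE_NAME", f"orders-{APP_STAGE}")

def handler(event, context):
    # The same code runs unchanged in every environment; only the
    # configuration injected via environment variables differs.
    return {"stage": APP_STAGE, "table": TABLE_NAME}
```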
The downside is that a Lambda alias maps back to a single function version, and published versions are immutable.
By default, an alias points to a single Lambda function version. When the alias is updated to point to a different function version, incoming request traffic instantly shifts to the updated version. This exposes that alias to any potential instabilities introduced by the new version. To minimise this impact, you can use the routing-config parameter of the Lambda alias, which allows you to point to two different versions of the Lambda function and dictate what percentage of incoming traffic is sent to each version. Alias traffic shifting can be used to implement canary deployments of AWS Lambda functions, with the split configured via the AdditionalVersionWeights property.
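A rough sketch of what this looks like with boto3 (the function name, alias name, and version numbers below are placeholders; the `update_alias` call itself requires AWS credentials, so it is shown commented out):

```python
# Build the routing configuration for a canary deployment: send a fraction
# of traffic to a new version while the alias's main FunctionVersion stays
# on the current one.
def canary_routing_config(new_version, weight):
    # AdditionalVersionWeights maps a version number to its traffic share
    # (e.g. 0.1 sends 10% of requests to that version).
    return {"AdditionalVersionWeights": {new_version: weight}}

# Requires AWS credentials; names and versions are placeholders.
# import boto3
# client = boto3.client("lambda")
# client.update_alias(
#     FunctionName="my-function",
#     Name="live",
#     FunctionVersion="1",
#     RoutingConfig=canary_routing_config("2", 0.1),
# )
```

Shifting the weight upward over time (and removing the extra version once it reaches 100%) gives you a gradual, reversible rollout.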
Using SAM you have the Globals section and Safe Deployments. SAM can gradually shift more and more traffic over to a new version, and if the canary version triggers any predefined alarms, it can roll back automatically. AWS CodeDeploy supports SAM and Lambda traffic shifting, enabling canary and blue/green styles of deployment.
Not an exhaustive list by any means, but some useful Lambda testing tools mentioned include:
For code/test coverage
API / UI testing
If you enjoyed this post you may also like - http://fourtheorem.com/blog/posts/2018/may/aws_summit_state_of_ai.html