Cloud computing has already had a profound effect on the way data centers are operated, but recent trends in the cloud sector could deliver another shake-up in the form of serverless computing, which promises to move customers a step closer towards utility-based computing.
Serverless computing is actually something of a misnomer, as it most definitely does not do away with servers. Rather, it removes the need for the cloud customer to deal directly with servers, whether provisioning or managing them, letting them focus instead on developing and deploying the business logic that powers their application or service.
This sounds a lot like platform-as-a-service (PaaS), a long-established and well-understood cloud service model, but the serverless approach breaks applications and services down into smaller, more discrete functions. Some serverless proponents have even coined the term functions-as-a-service (FaaS) to describe it.
The Amazon approach
The current craze for serverless computing can be traced back to Amazon’s introduction of the Lambda service into its Amazon Web Services (AWS) cloud portfolio in 2014. Lambda is an event-driven compute service that enables developers to write code that runs in response to some event or trigger.
However, a key aspect of Lambda is that it automatically manages the compute resources required to run and scale the code with high availability, and customers pay only for the CPU compute time consumed when the code is actually running.
As an example, Lambda could be used to drive a web server, and would consume little or no resources until kicked into life by an incoming request. AWS has published a reference architecture for just such an implementation on GitHub, along with a handful of others, such as an Internet of Things backend.
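The event-driven model can be illustrated with a minimal handler in the style of Lambda's Python runtime. This is a hedged sketch, not AWS's published example: the event shape below is a simplified, hypothetical API Gateway-style HTTP request, and the function and field names are illustrative.

```python
import json

def lambda_handler(event, context):
    """Entry point the platform invokes for each incoming event.

    The function holds no state between invocations and consumes
    compute resources only while it is actually running.
    """
    # Simplified, API Gateway-style query parameters (illustrative shape)
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Local invocation with a sample event (the context argument is unused here)
sample_event = {"queryStringParameters": {"name": "DCD"}}
response = lambda_handler(sample_event, None)
```

The point of the model is what is absent: there is no server process to provision, monitor or scale in this code; the platform supplies all of that around the function.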
The traditional way of developing and operating a cloud-based service would be to provision enough servers and storage resources to run the code and handle the data, with the customer taking responsibility for provisioning the correct capacity, monitoring performance, and designing for fault tolerance and scalability.
In addition, the resources required to operate such a service typically have to be available continuously, ready to be used at any time, which means that the customer is paying for them regardless of usage, unless they develop their own system to provision resources on demand and release them when not required.
All of this means that building up a cloud-based infrastructure to deliver applications and services can prove to be a more complex and time-consuming task than the cloud providers care to admit, and this is part of the problem that serverless computing seeks to address.
“Serverless computing does make sense as the next level of cloud computing,” said Clive Longbottom of analyst firm Quocirca.
“I have commented before on how the likes of AWS and Azure are still reliant on having very clever systems architects in place that can architect the basic underlying resources before any software stack built on top is turned on.
“A move to a platform where it is more based around desired outcomes means that we start to move to more of a capability to say ‘this is what I want to do – please make it happen’,” he added.
In other words, serverless computing represents another level of abstraction intended to hide the underlying infrastructure. And, while AWS is perhaps the most visible proponent of this approach, similar services are starting to be offered by rivals, such as Google Cloud Functions, Microsoft Azure Functions and IBM OpenWhisk. One vendor, Iron.io, offers an open source serverless application platform called IronFunctions, which can run on public or private clouds and is compatible with AWS Lambda.
While serverless computing may have some advantages from the customer viewpoint, it could also deliver some benefits to service providers that implement such capabilities. If adoption of serverless computing functions such as AWS Lambda grows, then it could lead to fewer resources being tied up at any given moment in order to operate a customer’s cloud-based application, which could allow the service provider to cut back on the amount of spare capacity they need to keep available at all times.
However, to pull off this trick may require more sophisticated data center monitoring and orchestration tools, and predicting demand could become more complex if customers make greater use of functions such as Lambda that are able to automatically scale to meet peaks in demand.
Meanwhile, serverless computing does not address one of the major pain points of cloud services for customers, according to Longbottom, which is being able to accurately forecast how much it is going to cost them to operate their applications and services in order to meet required levels of demand with an acceptable quality of service.
“It is really down to the customer to ensure that they understand how pricing could vary with usage, and this is one of the darkest areas with AWS,” he said, although the same applies to many other cloud providers.
“Although it publishes its charges very publicly, it is like saying that the cost to drive a car is easy to work out – it is based on miles per gallon, plus the wear of the tyres, which is dependent on the types of road being driven on, and so on. Serverless should hide some of that darkness – but only if the customer can get AWS to be very public in how it will charge, and on a very simple basis,” Longbottom added.
This is why enterprise customers prefer to negotiate contracts detailing in advance what capacity they require, and how much they are going to pay for it.
Then there is the old bugbear of vendor lock-in. With serverless computing based on proprietary functions, it may well prove difficult to migrate a service from one cloud provider to another, if a customer needs to do this.
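One common mitigation for this lock-in is to keep business logic provider-neutral and confine provider-specific code to thin entry-point adapters. The sketch below assumes this pattern rather than any vendor's recommended layout: the `greet` function and event shapes are illustrative, while the adapter signatures follow the Python conventions of AWS Lambda (an `(event, context)` pair) and Google Cloud Functions (a single HTTP request object).

```python
# Provider-neutral business logic: no cloud SDK imports here,
# so it can be moved between platforms (or run locally) unchanged.
def greet(name: str) -> str:
    return f"Hello, {name}"

# AWS Lambda-style adapter: receives an (event, context) pair.
def aws_handler(event, context):
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {"statusCode": 200, "body": greet(name)}

# Google Cloud Functions-style adapter: receives one HTTP request
# object exposing query parameters via request.args.
def gcf_handler(request):
    name = request.args.get("name", "world")
    return greet(name)
```

Only the adapters would need rewriting in a migration; the business logic, and any tests against it, carry over.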
“For those still providing their own code, then orchestration systems may well be able to deal with the vagaries of individual systems at the hardware level. But if a codeless (or code-light) approach is used, the customer is putting more faith in the service provider,” commented Longbottom.
“This is probably fine for smaller organizations, where systems architects and great coders are often as rare as hen’s teeth. Here, the risks of creating a duff system heavily outweigh the risks of being over a barrel with a supplier.”
A version of this article appeared in the February/March issue of DCD Magazine