Pros and Cons of Serverless Architectures
A serverless architecture is a way to build and run applications and services without having to manage the underlying infrastructure.
Technology waits for no one.
Just as you become experienced with certain tools and architectures, something new is introduced and you’re left asking yourself if you now have to start all over again. It's a tough predicament that can throw you off, as it may lead to large-scale restructuring and code refactoring.
What if you finish a migration only to do it all over again in a few years with a new product? Have you found yourself in that scenario? It's no easy feat, and it requires research and careful consideration.
At the end of the day, if you are able to deliver a great service to your customers, it probably won’t make much sense to consider such a significant change. On the other hand, refusing to embrace innovation could keep you stuck in the past, and there are times when technology leaps forward to a better and brighter future.
And this is a bit of what goes through our heads when we start seeing the term "serverless architecture" everywhere we look.
If the thought of change gives you a headache, I suggest you consider this statement: when it comes to web architecture, composability is paramount.
Do you agree? If so, then serverless is the architecture you’ve been waiting for. If not, that’s okay, because either way we’re going to find out.
Personally, I believe this to be true. When I think about how all my frontend applications are structured, and about the success many organizations have enjoyed by moving away from a traditional monolithic architecture to a decoupled microservices architecture, I see a clear roadmap leading straight to serverless architecture. I hope that by the end of this post you will see it too. Let’s go through what serverless architecture is and analyze the good, the bad, and the ugly.
What is a serverless architecture?
To begin, let’s get a clear understanding of what serverless architecture is not.
Serverless architecture is not a replacement for microservices, nor does it mean there is no server. Cloud service providers take care of the server infrastructure for you, and microservices and serverless computing can work together, or be used separately, even within the same application.
It's very easy to blur the lines between the various cloud services, which makes it difficult to determine exactly what constitutes serverless architecture.
Many believe serverless to just be Function-as-a-Service, or FaaS, and in its simplest form this is certainly correct.
FaaS is a subset of serverless, otherwise known as serverless functions. These functions are triggered by an event, such as a user clicking a button. Cloud service providers take care of the infrastructure these functions run on, so all you have to do is write some code and deploy it. Communication between your frontend and a serverless function is as simple as an API call.
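To make that concrete, here is a minimal sketch of a serverless function using Netlify Functions' handler signature (the file name and path are just examples; other providers use slightly different signatures):

```typescript
// netlify/functions/hello.ts (hypothetical file name and path)
import type { Handler } from "@netlify/functions";

// Runs only when an HTTP request hits /.netlify/functions/hello
export const handler: Handler = async (event) => {
  const name = event.queryStringParameters?.name ?? "world";
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
};
```

From the frontend, invoking it is just an API call, for example `fetch('/.netlify/functions/hello?name=Ada').then((res) => res.json())`.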
Serverless computing cloud services were first introduced by Amazon Web Services in 2014 with AWS Lambda, Amazon’s FaaS offering. Other popular cloud vendor FaaS offerings include:
- Netlify Functions
- Vercel Serverless Functions
- Cloudflare Workers
- Google Cloud Functions
- Microsoft Azure Functions
- IBM Cloud Functions
Cloud vendors provide many other services that tend to be conflated with serverless:
- Infrastructure-as-a-Service (IaaS),
- Platform-as-a-Service (PaaS),
- Software-as-a-Service (SaaS),
- and Backend-as-a-Service (BaaS).
Now, I’m not going to dive into detail about these technologies because each is a topic of its own, and this is more of a high-level overview of serverless. Instead, I will simply say that the common characteristic of all of them is that cloud service providers take care of the infrastructure behind these services so that you don’t have to.
That means you save time, resources, complexity, and money while focusing solely on your application and customer experience.
Relying on third-party services like these is, in essence, what the "A" (for APIs) in Jam(stack) originally stood for, and it is a fundamental advantage of the MACH (Microservices, API-first, Cloud-native, Headless) ecosystem, of which serverless architecture is often a key component.
If you would like to learn more about this ecosystem, take a look at the MACH Alliance.
Why should you use a serverless architecture?
Serverless and associated cloud services are still relatively new, and we are seeing incredible advancements with new technologies released into the market each year.
However, despite offering many advantages over traditional architectures, serverless computing is not a silver bullet.
As with anything, there are reasons why serverless architecture may not meet your requirements. Let’s take a look.
Pros of Serverless Architecture
No server management. Serverless computing still runs on servers, but those servers are managed by cloud service providers. Zero server management doesn’t limit you in any way either; quite the opposite, in fact, thanks to automatic scaling, high availability, and the elimination of idle capacity, to name a few benefits.
Cost. There are a number of ways in which serverless helps to save money. With traditional server architecture, you would have to predict and purchase server capacity, usually more than you need, to ensure your application does not face performance bottlenecks or downtime. With serverless, cloud service providers only charge you for what you use, since your code only runs when it is triggered by an event that needs a backend service. I don’t know about you, but that sounds exactly right to me. Additionally, since the overhead and maintenance of servers are taken care of by your provider, you won’t be paying developers and other IT professionals to spend time on any of it. Just as cloud computing saves a fortune in hardware costs, serverless saves a fortune in human resource costs as well.
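To make the pay-per-use model concrete with a rough, illustrative calculation (exact rates vary by provider and change over time): a function allocated 128 MB that runs for 200 ms consumes 0.025 GB-seconds of compute, so a million such invocations add up to 25,000 GB-seconds. At typical published FaaS rates, that compute plus the per-request charge usually comes to well under a dollar, and most providers also include a generous free tier.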
Scalability. Applications built on serverless architecture scale automatically and, for most practical purposes, without limit. There is no worrying about a spike in traffic bringing down your site or causing poor performance, as you might experience with fixed server capacity. Costs will go up, of course, as your user base or usage increases, but as a previous employer of mine likes to say, that’s a nice problem to have.
Security. You may find that many articles on serverless architecture list security as a disadvantage. This post will mention some of the most commonly shared security concerns in the next section, but it’s important to understand that the top cloud vendors are dedicated to providing the most secure, performant, and available service possible. That is a key component of their business model, so it stands to reason that they employ some of the best people in the industry to build and maintain these services, and that they follow security best practices. There is some application-level security that developers still need to consider, but the bulk of it is handled for you by experts, and in my opinion this is a major advantage.
Quicker time to market. Development environments are easier to set up, and not having to manage servers leads to accelerated delivery and rapid deployment. This is especially critical for a Minimum Viable Product, with the added bonus that everything is decoupled, meaning you can add or remove services at will without the sweeping code changes a monolithic application would require.
Reduced latency. Thanks to content delivery networks (CDNs) and edge networks, serverless functions can run on servers that are closer to end users all over the globe (a minimal example follows the list below). Popular edge computing providers for the Jamstack include:
- Cloudflare Workers,
- AWS Lambda@Edge,
- Netlify Edge,
- and most recently Vercel Edge Functions.
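As a rough illustration of what an edge function looks like, here is a minimal Cloudflare Worker (modules syntax) that simply reports which edge location handled the request; deploying it with Wrangler runs it across Cloudflare's network rather than in a single region:

```typescript
// worker.ts: a minimal Cloudflare Worker (modules syntax).
// With @cloudflare/workers-types installed, request.cf is fully typed;
// cf.colo is the code of the edge data center that served this request.
export default {
  async fetch(request: Request): Promise<Response> {
    const colo = (request as { cf?: { colo?: string } }).cf?.colo ?? "unknown";
    return new Response(`Hello from the edge (${colo})`, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```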
Cons of Serverless Architecture
Vendor lock-in. While it is certainly possible to pick and choose services from different vendors, the easiest path is to use a single cloud services provider, such as AWS, since each vendor has its own way of doing things. That can make migrating to a different provider challenging, and you are reliant on the vendor to provide an optimal service at all times. If there is an infrastructure problem of any kind, you have to wait for them to fix it.
Security. Cloud providers often run code from several customers on the same server at any given time, using a technique known as multitenancy. To put it plainly, customers are tenants that only have access to their share of the server. This presents a possible scenario where one tenant's data is exposed due to a misconfiguration of that server.
Performance impact. Serverless functions are not constantly running. When a function is called for the first time, it requires a “cold start”: a container needs to spin up before the function can run. This can degrade performance, although a container will continue to run for a period of time after the call completes in case it is needed again soon, at which point you get a “warm start” without the added latency. Thanks to edge computing, cold starts are becoming less and less of an issue, and this will only improve over time.
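One common way to soften the cold start penalty is to create expensive resources (database connections, SDK clients) in module scope rather than inside the handler, so they are set up once per container and reused while it stays warm. A minimal sketch, assuming an AWS Lambda-style handler and a hypothetical createDbClient helper:

```typescript
// handler.ts: an AWS Lambda-style function (sketch only).
// Module scope runs once per cold start, so anything created here is
// reused for as long as the container stays warm.
import { createDbClient } from "./db"; // hypothetical helper

const db = createDbClient(); // paid for during the cold start only

export const handler = async (event: { userId: string }) => {
  // Warm invocations skip the setup above and go straight to the work.
  const user = await db.getUser(event.userId); // hypothetical query
  return { statusCode: 200, body: JSON.stringify(user) };
};
```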
Debugging and testing. Debugging is complex due to reduced visibility into backend processes that are managed by the cloud provider. Additionally, a serverless environment can be difficult to replicate locally in order to perform integration tests. It’s not all bad news, though. As the serverless ecosystem continues to grow, new platforms and services are released into the market to solve issues like these. One possible solution is Datadog’s End-to-end Serverless Monitoring.
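It is also worth noting that individual functions are still just functions, so even if the full cloud environment is hard to reproduce, they can at least be unit-tested locally by invoking them directly with a mock event. A small sketch in a Jest-style test (test and expect as globals), reusing the hypothetical hello function from earlier:

```typescript
// hello.test.ts: invoke the handler directly with a minimal mock event.
import { handler } from "./hello"; // the hypothetical function from earlier

test("returns a personalized greeting", async () => {
  const response = await handler(
    { queryStringParameters: { name: "Ada" } } as any, // mock event
    {} as any, // mock context
  );

  expect(response?.statusCode).toBe(200);
  expect(JSON.parse(response?.body ?? "{}").message).toBe("Hello, Ada!");
});
```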
Final thoughts
There are many use cases for serverless architecture. Most revolve around low-compute operations with unpredictable traffic, or serverless functions used in conjunction with microservices via REST or GraphQL APIs.
There is no question that migrating from a legacy infrastructure to serverless can be very challenging, especially if it requires a complete rethink of how your application is structured, but the beauty of switching to serverless is that you can do it one piece at a time.
Maybe this relatively new architecture doesn’t fulfill all your needs right now, but investing your time in serverless will pay dividends in the end as we have only just scratched the surface of its capabilities.
Lastly, since Jamstack sites and applications are heavily focused on the frontend, serverless is the perfect method of integrating backend functionality.
Lack of experience with both of these architectures can certainly give a company pause, but we’re here to answer any questions you may have and ensure a smooth transition.
CLICK HERE to schedule a 1-on-1 talk and learn more about what we can do for you and your business.