The benefits of running serverless functions in the cloud can be outweighed by network-related performance issues and latency related to cold starts. But this does not need to be the case. There’s another option that allows you to reap the benefits of serverless without compromise: running your serverless functions at the network edge, similar to how content delivery networks (CDNs) deliver content at the network edge.
There are many use cases where a serverless edge product like StackPath’s makes sense, especially for latency-sensitive applications. Below we’ll review some of the ways our customers and other developers are putting it to work.
Today, the digital advertising space is jam-packed with competitors, and companies building advertising technology like real-time bidding (RTB) platforms know that a faster platform means beating the competition. One way ad tech engineers improve the speed of RTB platforms is by optimizing a process referred to as the cookie sync: an exchange of anonymous user identifiers between two domains that enables better-targeted ads.
The code that makes this exchange possible is lightweight and event-driven, making it a good use case for serverless. And because it’s highly sensitive to latency, it’s an especially good use case for edge serverless. After all, the engineers are trying to make the “real time” in real-time bidding as literal as possible.
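To make the mechanism concrete, here is a minimal sketch of the redirect step at the heart of a cookie sync. The function name, the `partner_uid` parameter, and the partner endpoint are all illustrative assumptions, not StackPath’s or any specific platform’s API; an edge function would typically return this URL as a 302 redirect so the partner domain can store the ID mapping in its own cookie space.

```typescript
// Hypothetical cookie-sync redirect builder for an edge function.
// Given our platform's anonymous user ID and a partner's sync endpoint,
// it produces the redirect URL that hands the ID to the partner domain.
function buildSyncRedirect(partnerSyncUrl: string, userId: string): string {
  const url = new URL(partnerSyncUrl);
  url.searchParams.set("partner_uid", userId); // parameter name is illustrative
  return url.toString();
}

// An edge handler (exact shape varies by platform) would then respond with:
//   new Response(null, { status: 302,
//     headers: { Location: buildSyncRedirect(partnerUrl, uid) } })
```

Because the function is pure and tiny, it can run at every edge location, keeping the extra round trip the sync adds as short as possible.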
One company using edge serverless for cookie syncing is Future PLC. After choosing StackPath’s CDN to accelerate the delivery of ad creative, Reda Guermas, the company’s VP of Technology of Ad Tech, decided to test out StackPath’s edge serverless product as well. After seeing positive results, Guermas and his team did a full implementation. According to Guermas, “the faster the cookie syncing process is, the better it is for us from a bidding perspective because our matched users are better valued than unmatched users.”
RFC 8484 defines DNS over HTTPS (DoH), a protocol for performing DNS resolution over HTTPS. Querying over HTTPS has many benefits, one of which is that you can leverage existing HTTP technologies. One of those technologies is edge serverless, which runs serverless scripts at the CDN edge, putting logic execution much closer to users than the traditional cloud. Combining the two lets you act on and respond to DNS queries at the edge, around the world.
One of the main advantages of implementing DoH is that it secures DNS requests, further improving security for web applications. According to this ZDNet article, “while the classic DNS protocol makes this request in plaintext for everyone to see, DoH packages its DNS queries as encrypted HTTPS traffic.”
Serverless is a good way to respond to and process these requests with a few dozen lines of code. But the last thing developers want is to sacrifice the speed of their application. A DNS query should take only a few milliseconds, yet with a serverless cold start it can take a second or more. There are ways to hack around cold starts, such as calling your serverless function periodically to keep it warm, but they’re not ideal. A better way is to reduce latency at the network level by putting your function closer to where DNS requests originate. Edge serverless makes this easy.
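As a sketch of what those few dozen lines look like: in the RFC 8484 GET form, the DNS query arrives base64url-encoded in a `dns` query parameter as a standard DNS wire-format message. The helper below (the function name is ours) decodes that parameter and extracts the question name, which an edge function could then answer directly or forward to an upstream resolver.

```typescript
// Decode an RFC 8484 "dns" GET parameter and pull out the first
// question name from the DNS wire format.
function questionName(dnsParam: string): string {
  const msg = Buffer.from(dnsParam, "base64url");
  const labels: string[] = [];
  let i = 12; // skip the fixed 12-byte DNS header
  while (msg[i] !== 0) {
    const len = msg[i]; // each label is length-prefixed
    labels.push(msg.subarray(i + 1, i + 1 + len).toString("ascii"));
    i += len + 1;
  }
  return labels.join(".");
}
```

A full handler would also need to parse the query type, build a wire-format response, and set the `application/dns-message` content type, but the parsing above is the core of routing a DoH request at the edge.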
When responding to an HLS request, the streaming server determines which video quality (i.e., which .ts segment file) the client will first attempt to play before switching to a lower or higher quality. This choice depends on available bandwidth and device type.
Starting with the wrong quality degrades the user experience considerably. Sending a high-quality file to a low-end device may cause the video, or the device itself, to freeze even on a good connection. And sending a low-quality file to a high-end device on a good connection leaves the viewer with a needlessly poor picture for a prolonged period.
It may seem that always sending a medium-quality file first is a reasonable compromise, but it’s a lazy one. Instead, you can deliver the best option in every case with serverless scripts, which let you tailor responses on a per-device, per-request basis without touching your origin server’s configuration or code. And with edge serverless, you can cut wait time and buffering by making the decision at the edge instead of in a far-off datacenter.
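The per-request decision can be as simple as the sketch below. The rendition names, bandwidth thresholds, and the idea of classifying devices from the User-Agent header are all assumptions for illustration; a real edge function would read the request headers and rewrite the playlist or segment URL accordingly.

```typescript
// Illustrative initial-rendition picker for an HLS edge function.
type Rendition = "1080p" | "720p" | "480p";

function pickInitialRendition(userAgent: string, bandwidthKbps: number): Rendition {
  // Crude device classification from the User-Agent (assumption for the sketch).
  const lowEndDevice = /Mobile|Android/i.test(userAgent);
  if (lowEndDevice || bandwidthKbps < 1500) return "480p";
  if (bandwidthKbps < 4000) return "720p";
  return "1080p";
}
```

Because the logic lives at the edge, the first segment the player fetches is already matched to the device and connection, so the adaptive-bitrate ladder has less correcting to do.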
With expectations growing for on-demand, high-quality video streaming, pushing as much logic as possible to the edge in an agile manner will help video delivery companies grow their audiences while reducing churn.