Edge computing is a distributed architecture that reduces latency by housing applications, data, and compute resources at locations geographically closer to end users.
Edge computing is often used in conjunction with the Internet of Things (IoT), but it is also beneficial for corporate workloads running on virtual machines or containers. In the IoT context, the edge is known as the “device edge.” At StackPath, however, we deal with the “infrastructure edge” or “cloud edge,” which is what this article discusses.
Edge environments that support primary infrastructure are created through a network of data centers scattered across a nation or the globe. Each data center processes and stores data locally and is usually configured with the ability to replicate its data to other locations. The individual locations are called points of presence (PoPs) and generally include servers, routers, network switches, and other interfacing equipment.
Google Cloud, Microsoft Azure, and Amazon Web Services run on a cloud computing model that is designed to consolidate data in large, centralized locations. These cloud data centers are built where land and power are cheapest, which, by their very nature, can put them thousands of miles away from their end users. That physical distance between the cloud infrastructure and the end user creates latency.
Edge locations, on the other hand, are strategically placed in city hubs to reduce this distance and, ultimately, the latency that end users experience. For example, data can travel to StackPath edge locations up to 2.6x faster than to centralized cloud locations.
In StackPath’s edge computing environment, all the necessary networking, security, computing, and storage equipment for developing applications is available at 45 different edge locations around the world. Each site is connected by a private network backbone, allowing data to travel over long distances to other StackPath locations 21% faster than if it had to travel across the public Internet.
Deploying edge computing workloads is easy, especially if you’re familiar with setting up a content delivery network (CDN). The concept is the same. You’re distributing assets across the globe to reduce latency. The main difference is that, with edge computing, you’re distributing software and code instead of static assets, as you would with a CDN.
When you have your software and code, you can deploy as many VMs or container instances as you want to the cloud edge. You can also run code at the edge with serverless functions, a new offering from cloud and edge providers that doesn’t require developers to manage and update any underlying operating systems or software.
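To make the serverless idea concrete, here is a minimal sketch of the kind of request handler an edge serverless platform might run. The function and header names (`handle_request`, `x-edge-pop`) are illustrative assumptions, not a specific StackPath API; the point is that the entire response is produced at the edge location, with no round trip to a central origin.

```python
# Hypothetical edge serverless handler. The runtime, not the developer,
# owns the underlying OS and software; the developer supplies only this
# function. All names here are illustrative assumptions.

def handle_request(request: dict) -> dict:
    """Build a response entirely at the edge PoP that received the request."""
    # An edge runtime might tag each request with the PoP that handled it.
    city = request.get("headers", {}).get("x-edge-pop", "unknown")
    return {
        "status": 200,
        "body": f"Hello from the {city} edge location",
    }

# Example invocation, as the platform's runtime might perform it:
response = handle_request({"headers": {"x-edge-pop": "Chicago"}})
# response["body"] == "Hello from the Chicago edge location"
```

Because the handler is a pure function of the request, the provider can run identical copies of it at every PoP, which is exactly the CDN-style distribution described above, applied to code.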
To see how edge computing works on StackPath, check out this support article.
Consumers have changed the way they receive entertainment. Currently, about 60 percent of all downstream Internet traffic is video, and consumers expect fast, smooth streaming. Putting the payload next door not only improves their experience but also dramatically reduces bandwidth costs.
A key component of user satisfaction is serving the right type of content every time. Phones, computers, tablets, and TVs all have different quality and format requirements, and the edge is the natural place to make the decisions that meet them. Containers running at the edge are well positioned to select the optimal delivery format for each device and media type.
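As a sketch of what such a per-device decision might look like, the function below maps a client's User-Agent string to a streaming profile. The device categories, format choices, and resolution caps are invented for illustration; real services negotiate far more detail, but running even this much logic at the edge saves a round trip to origin.

```python
# Illustrative per-device format selection, as might run in a container
# at an edge location. All categories and values are assumptions.

def pick_stream_profile(user_agent: str) -> dict:
    """Choose a streaming format and resolution cap from the User-Agent."""
    ua = user_agent.lower()
    if "iphone" in ua or "android" in ua:
        # Mobile devices: smaller screens, often on cellular networks.
        return {"format": "HLS", "max_resolution": "1080p"}
    if "smarttv" in ua or "tizen" in ua:
        # Living-room TVs can take the highest-quality stream.
        return {"format": "DASH", "max_resolution": "4K"}
    # Default for desktops and unrecognized clients.
    return {"format": "DASH", "max_resolution": "1440p"}
```

For example, `pick_stream_profile("Mozilla/5.0 (iPhone; ...)")` would select the mobile HLS profile.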
For many years, consumers have been moving from cable to streaming. Net Insight, a global leader in streaming solutions, was able to create a true live streaming solution using containerized software and StackPath’s edge infrastructure.
Processing data with edge VMs is a way of distributing the computation workload outside the traditional data center. And with the exponential growth of IoT devices worldwide, edge computing will have plenty of work to do.
The Internet is on pace to include 41.6 billion connected IoT devices by 2025, according to a forecast by International Data Corporation (IDC), and it will take a lot of computing power to serve all of them.
The fastest growth of IoT devices is taking place in the automotive and industrial categories, but IoT will continue to spread to consumer electronics as well. Extending compute to all these network resources will improve reliability as well as speed. Analytics running in edge VMs can quickly provide critical information to IoT devices so they can make snap decisions. Waiting for processing and instructions from a distant central server can introduce costly, even dangerous, delays.
Today, the digital advertising space is jam-packed with competitors, and advertising companies developing technology like real-time bidding (RTB) platforms know that making their platform faster means beating the competition. One way that ad tech engineers improve the speed of RTB platforms is by optimizing a process referred to as the cookie sync: an exchange of anonymous user identifiers between two domains that allows for better-quality ads.
The code that makes this exchange possible is lightweight and event-driven, making it a good use case for serverless. And because it’s highly sensitive to latency, it’s an especially good use case for edge serverless. After all, the engineers are trying to make the “real time” in real-time bidding as literal as possible.
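One common cookie-sync mechanism, sketched below under assumed names and URLs, is a redirect: domain A’s sync endpoint sends the browser on to domain B’s endpoint with A’s anonymous user ID attached, so B can store the mapping between the two identifiers. This is exactly the kind of tiny, event-driven step that an edge serverless function can answer from the nearest PoP.

```python
# Illustrative cookie-sync redirect builder. The partner URL and the
# "partner_uid" parameter name are assumptions, not a real ad platform's API.

from urllib.parse import urlencode

def build_sync_redirect(partner_sync_url: str, our_user_id: str) -> str:
    """Return the redirect URL an edge serverless function would respond
    with, handing our anonymous user ID to the partner domain."""
    query = urlencode({"partner_uid": our_user_id})
    return f"{partner_sync_url}?{query}"

# e.g. build_sync_redirect("https://ads.example-partner.com/sync", "u-123")
```

Because the function does nothing but string work, every millisecond of its latency budget is network distance, which is why moving it to the edge pays off.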
One company using edge serverless for cookie syncing is Future PLC. After choosing StackPath’s CDN to accelerate the delivery of ad creative, Reda Guermas, the company’s VP of Technology of Ad Tech, decided to test out StackPath’s edge serverless product as well. After seeing positive results, Guermas and his team did a full implementation. According to Guermas, “the faster the cookie syncing process is, the better it is for us from a bidding perspective because our matched users are better valued than unmatched users.”