The container revolution has changed the way developers package and run applications. Containerized applications can run anywhere, at any scale. The only limiting factors are the speed of light and the capacity of the cloud. Edge computing offsets these limitations by providing dozens of strategically placed points of presence (PoPs) that are always close to the end-user. In short, edge containers are the product of combining container technology with edge computing, and they have a variety of use cases.
For certain applications, speed is everything. When decisions have to be made quickly, edge computing is the answer. Edge containers are continually running and ready to accept requests. There are no cold-start delays, and containers are always only a few network hops away, making communication fast and reliable.
Switching from containers managed by a public cloud to an edge compute provider can reduce latency by almost half a second. At computing timescales, this is an eternity. It is like the difference between having a conversation over a satellite link—with its awkward silences and half-starts—and speaking to someone in the next room.
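To see how round-trip time compounds into a near half-second difference, here is a back-of-the-envelope model. The RTT figures, round-trip count, and server time below are illustrative assumptions, not measured StackPath numbers:

```python
def request_time_ms(rtt_ms: float, round_trips: int, server_ms: float = 20.0) -> float:
    """Estimate wall-clock time for one request: a few network round trips
    (TCP handshake, TLS flights, the request itself) plus server work."""
    return rtt_ms * round_trips + server_ms

# A distant cloud region at ~120 ms RTT vs. a nearby edge PoP at ~15 ms,
# assuming 4 round trips per request (hypothetical but typical values):
cloud = request_time_ms(120, 4)  # 500.0 ms
edge = request_time_ms(15, 4)    # 80.0 ms
print(f"cloud: {cloud} ms, edge: {edge} ms, saved: {cloud - edge} ms")
```

Because every round trip pays the full RTT, moving the endpoint closer saves the difference several times over—here roughly 420 ms on a single request.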
Low-latency applications include Virtual Reality (VR), Augmented Reality (AR), mobile applications, autonomous cars, healthcare, real-time communication, multimedia streaming, online trading, banking, and e-commerce. When immediate cloud processing is required, edge containers are the best option.
Online games are another salient use case for edge containers. Latency can get players killed in-game. And lag, as annoying as it is, is only the tip of the iceberg. There is also the issue of scale. Multiplayer games struggle at peak hours, when tens of thousands of players are online at once. Edge containers, however, are easy to scale as demand grows and wanes in each region while keeping latency in check.
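The per-region scaling logic described above can be sketched in a few lines. The capacity figure and region names are hypothetical, purely for illustration:

```python
import math

PLAYERS_PER_CONTAINER = 100  # assumed capacity of one game-server container

def desired_containers(players: int, minimum: int = 1) -> int:
    """Containers needed to host `players` in a region,
    never dropping below a warm minimum to avoid cold starts."""
    return max(minimum, math.ceil(players / PLAYERS_PER_CONTAINER))

# Hypothetical peak-hour demand per region:
demand = {"us-east": 4230, "eu-west": 870, "ap-south": 55}
plan = {region: desired_containers(n) for region, n in demand.items()}
print(plan)  # {'us-east': 43, 'eu-west': 9, 'ap-south': 1}
```

Each region scales independently, so a surge in one region never forces capacity (or cost) onto another.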
Edgegap, a disruptive gaming and edge compute startup, successfully positioned itself as a market leader by helping game studios cut lag by 58% with edge containers. Shortly after integrating StackPath’s edge containers, Edgegap presented a proof of concept to a major AAA video game studio, and the results were thoroughly convincing.
At least 3.6 billion IoT devices connect to the Internet every day. That number is only going to rise in the next few years, and all those smart devices depend on constant connectivity and generate an ocean of data. Networks are already congested and public clouds are struggling to keep up. Decentralized edge networks make it possible to distribute data evenly. In seconds, organizations can spin up as many containers as required so all that data is stored and processed as close as possible to the end-user.
Another lever pushing for more decentralization is mobile. It is easy to forget that a smartphone is just a small computer that is constantly on the move and depends on the network to work. The general availability of 5G networks since 2018 has only put more strain on the cloud. Edge containers bring services close to where phones are, and services running at the edge have a competitive advantage over those running entirely in public clouds.
Consumers have changed the way they consume entertainment. Currently, about 60 percent of all downstream traffic is video, and consumers expect fast, smooth streaming. Putting the payload next door not only improves their experience but also dramatically reduces bandwidth costs.
A key component of user satisfaction is serving the right type of content every time. Phones, computers, tablets, and TVs have different quality and format requirements. The edge is the natural place to make the decisions that meet those requirements, and containers on the edge are better positioned to serve the optimal format for each device and media type.
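As a rough sketch of the per-device decision an edge container might make, the snippet below picks a rendition from the request's User-Agent string. The rendition table and matching rules are illustrative assumptions, not a real CDN API:

```python
# Hypothetical rendition catalog keyed by device class:
RENDITIONS = {
    "tv":      {"codec": "hevc", "resolution": "2160p"},
    "desktop": {"codec": "h264", "resolution": "1080p"},
    "mobile":  {"codec": "h264", "resolution": "720p"},
}

def pick_rendition(user_agent: str) -> dict:
    """Choose a video rendition based on a simplified User-Agent match."""
    ua = user_agent.lower()
    if "smarttv" in ua or "appletv" in ua:
        return RENDITIONS["tv"]
    if "iphone" in ua or "android" in ua or "mobile" in ua:
        return RENDITIONS["mobile"]
    return RENDITIONS["desktop"]

print(pick_rendition("Mozilla/5.0 (iPhone; CPU iPhone OS 15_0) Mobile"))
# {'codec': 'h264', 'resolution': '720p'}
```

Running this decision at the PoP rather than in a distant origin means the first video segment a device receives is already the right one.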
For many years, consumers have been moving from cable to streaming. Net Insight, a global leader in streaming solutions, was able to create a true live streaming solution using containerized software and StackPath’s edge infrastructure.
Edge containers should be considered whenever large pools of data need to be processed instantly. The general availability of containers for distributed computing and streaming platforms makes it easy to create real-time data processing pipelines. Live analytics uncover timelier insights that lead to better decisions, and errors and sudden trends can be detected quickly as well.