Edge Academy

An edge server is a server deployed at the network edge, close to end users and devices, to reduce latency.

Edge servers play an important role in edge computing by providing the compute resources needed to reduce latency. There are two distinct types of edge servers: content delivery network (CDN) edge servers and edge compute servers.

CDN edge servers

A CDN edge server is a strategically placed server that provides users with cached versions of static content from origin servers. This means CDN edge servers provide content such as images, JavaScript, HTML, and downloadable content. As a result, CDN edge servers help reduce the workload on origin servers and reduce latency for users. These servers are deployed at points of presence (PoPs) and edge locations across a content delivery network.

Edge compute servers

Rather than delivering content, edge compute servers provide compute resources at the network edge. Like CDN edge servers, they are strategically deployed to reduce latency. Instead of serving static web content, however, they provide functionality such as data processing for IoT (Internet of Things) applications and 5G networks.

How CDN Edge Servers Work

Content caching plays a big part in how CDN edge servers work. CDN edge servers store cached versions of origin server content and serve them to clients whenever a request is made.

At a high level, the process works as follows:

  1. A client makes a request for a web resource with static content (e.g. a video) by browsing to a given URI.
  2. The client request is sent to the website’s CDN.
  3. The CDN uses Anycast DNS to determine which CDN edge server is geographically closest and routes the request there.
  4. If the CDN edge server already has a copy of the content, it responds to the request directly. The result is significantly lower latency than if the request had been routed to the origin server.
  5. If the CDN edge server does not have a copy of the content, it requests it from the origin server, proxies it to the client, and caches it for future requests.
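The hit/miss logic at the heart of steps 4 and 5 can be sketched in a few lines. This is a minimal illustration, not a real CDN implementation; the `EdgeServer` class and `origin_fetch` callback are hypothetical names chosen for this example.

```python
# Minimal sketch of the CDN edge cache hit/miss flow described above.
# All names here (EdgeServer, origin_fetch) are illustrative, not a real API.

class EdgeServer:
    def __init__(self, origin_fetch):
        self.cache = {}                   # URI -> cached content
        self.origin_fetch = origin_fetch  # callback that reaches the origin

    def handle_request(self, uri):
        if uri in self.cache:
            # Cache hit: serve directly from the edge, with low latency.
            return self.cache[uri]
        # Cache miss: proxy the content from the origin server...
        content = self.origin_fetch(uri)
        # ...and store it so future requests are served from the edge.
        self.cache[uri] = content
        return content

# Usage: the first request is a miss (fetched from origin),
# the second is a hit (served from the edge cache).
edge = EdgeServer(lambda uri: f"content for {uri}")
first = edge.handle_request("/video.mp4")   # miss -> origin fetch
second = edge.handle_request("/video.mp4")  # hit  -> edge cache
```

A production cache would also honor expiry and eviction (e.g. TTLs and LRU), but the hit/miss branching above is the core of how repeated requests avoid the trip to the origin.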

In the above scenario, there are three key benefits:

  • Reduced latency for clients: Once the content is cached, clients experience better performance and faster load times because they are requesting assets from a geographically closer server. Of course, the underlying network backbone can also impact latency. For example, requests sent entirely over the public Internet are often slower than those sent over StackPath’s private backbone.
  • Reduced workload on the origin server: Traffic spikes can wreak havoc on origin server performance. With most requests offloaded to CDN edge servers, origin servers are less likely to experience performance degradation when spikes occur.
  • Increased security for the origin server: As the CDN masks the origin server and proxies requests from clients, it can reduce the origin server’s exposure to DDoS attacks and other threats.

How Edge Compute Servers Work

Edge compute servers also work by reducing geographic distance and, in turn, latency between producers and consumers of data. However, in the case of edge compute servers, the data producers are often IoT devices and the servers are processing the data they create. For example, an edge compute server may sit between IoT devices on a factory floor and the cloud or an enterprise data center.

In that case, data flows like this:

  1. IoT devices (cameras, sensors) on the factory floor send data to an edge compute server (i.e., an edge gateway).
  2. The edge compute server provides local processing, storage, and analysis of the data from the IoT devices.
  3. The edge compute server sends relevant processed data to the cloud or a corporate data center for further processing.
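Step 2 above, local processing at the gateway, can be sketched as a simple filter-and-summarize function. This is an illustrative example only; the function name, reading format, and threshold are hypothetical, not part of any real edge platform's API.

```python
# Illustrative sketch of an edge gateway (step 2) aggregating raw IoT
# readings locally before forwarding a compact summary to the cloud.
# The names and data shapes here are hypothetical.

def process_at_edge(readings, threshold=50.0):
    """Filter and summarize raw sensor readings at the edge."""
    # Keep only anomalous readings to forward upstream in full.
    anomalies = [r for r in readings if r["value"] > threshold]
    # Summarize the rest instead of shipping every raw data point.
    summary = {
        "count": len(readings),
        "avg": sum(r["value"] for r in readings) / len(readings),
        "anomalies": anomalies,
    }
    return summary  # far smaller than the raw stream

# Four raw readings from a factory-floor temperature sensor;
# only the summary and the single anomaly leave the edge.
raw = [{"sensor": "temp-1", "value": v} for v in (21.0, 22.5, 73.2, 20.8)]
summary = process_at_edge(raw)
```

Forwarding the summary rather than the raw stream is what produces the bandwidth and latency savings described in step 3.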

In this case, the benefits of the edge server include greatly reduced bandwidth to and from the cloud or corporate data center, as well as reduced latency. Without the edge compute server, significantly more unprocessed data would flow directly to the cloud or data center, slowing processing and introducing enough latency to hamstring real-time data processing workflows.

Examples of Edge Servers

TeamSpeak’s use of StackPath’s CDN to improve download speeds by 68% provides a prime example of how CDN edge servers work in the real world.

Initially, TeamSpeak, a company that provides chat clients for eSports, had users download installers and patches from mirrors in Germany. Given the global nature of TeamSpeak’s audience, this approach often led to performance issues that impacted user experience.

TeamSpeak realized that using a CDN would greatly increase download speeds and create a better experience for the gamers using their platform. Rather than having a global user base download directly from servers in Germany, the StackPath CDN provided edge servers around the globe that stored and served downloads closer to users.

As for edge compute servers, this tutorial shows edge compute servers running container workloads in Atlanta, Dallas, Amsterdam, and Frankfurt for IoT data processing purposes as opposed to a single cloud server.

Key Takeaways

  • An edge server sits between two distinct networks.
  • Reduced latency and bandwidth consumption are the primary benefits of edge servers.
  • CDN edge servers provide low-latency delivery of static web content.
  • Edge compute servers provide low-latency data processing for use cases like IoT and 5G.