Jul 23, 2019

This post was originally published on the MaxCDN Blog and has been updated with the latest HTTP request tools.
HTTP headers can tell you a lot about the state of your CDN, web server and API. Not only do they describe the resources being fetched, but they give insight into how those resources were delivered. Knowing what to look for in a response can help you optimize content delivery and troubleshoot CDN-related issues. But to generate a response, first you need to send a request. This is where HTTP request tools come in handy.
While there are hundreds of tools available for sending HTTP requests, in this post we’ll cover the most popular ones. Depending on your preferences and feature requirements, one tool will work better for you than another. But before reviewing the tools, let’s go over the headers that are specific to CDNs and caching. Familiarizing yourself with these headers will make test results more valuable to you as a CDN user.
HTTP headers carry critical information about the delivery of HTTP objects. In the context of CDNs, the headers sent in response to a user’s request tell you about the performance and availability of a cacheable asset. This makes it easier to identify and troubleshoot problems that wouldn’t otherwise be apparent.
Let’s go over two of the more relevant headers for CDNs: X-Cache and Cache-Control.
X-Cache
X-Cache describes whether or not a resource was served from a CDN. If it was, the header reads HIT. If the resource was fetched from the origin server, the header reads MISS. The proportion of HITs across your traffic is your cache hit ratio, which is shaped by your caching policy.
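As a quick illustration of the sort of check this enables: if you log the X-Cache value of many responses (one per line), a short script can compute your hit ratio. This is a minimal shell sketch under that assumed input format; the hit_ratio function name is our own.

```shell
#!/bin/sh
# Minimal sketch: compute a cache hit ratio from X-Cache values piped in
# one per line (e.g. collected from repeated curl -I runs).
# The input format and function name are assumptions, not from this article.
hit_ratio() {
  awk '/HIT/ { hit++ } { n++ } END { if (n) printf "%.0f\n", 100 * hit / n }'
}

# Example: printf 'HIT\nMISS\nHIT\nHIT\n' | hit_ratio   prints 75
```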
Cache-Control
The Cache-Control header determines many of the actual caching mechanisms. It consists of several values that define how a resource should be cached. These values are usually set by the origin server and interpreted by any caching mechanism that receives the resource, including the user’s browser.
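A response’s Cache-Control value is simply a comma-separated list of directives, so it’s easy to pull apart for inspection. A minimal shell sketch (the header value below is made up for illustration):

```shell
#!/bin/sh
# Minimal sketch: print one Cache-Control directive per line.
# The header value is a made-up example, not from a real response.
cc_directives() {
  tr ',' '\n' | sed 's/^ *//'
}

echo "public, max-age=3600, must-revalidate" | cc_directives
```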
Some of the more common Cache-Control values are listed below.
max-age= (time in seconds) — This tells the cache how long to store a resource before it expires. When the max age is reached, the cache requests a new copy from the origin. (This directive takes precedence over the older Expires header.)

s-maxage — Identical to max-age, except it applies specifically to shared caches such as proxies and CDNs. Nginx doesn’t honor this directive, which means it doesn’t apply to some CDNs. An alternative in this case is to use X-Accel-Expires.

must-revalidate — This forces a cache to validate the “freshness” of a resource before delivering it. In some cases, a cache may deliver stale content based on the client’s configuration. If must-revalidate is set, the cache has to honor any lifespan limitations set on a resource regardless of the client’s configuration.

proxy-revalidate — Similar to must-revalidate, except it applies only to shared caches such as proxies.

public / private — These dictate whether or not a response is intended for a specific user. By default, responses are public. If set to private, the response won’t be cached by a proxy or CDN; only the user’s browser may cache it.

no-cache / no-store — no-cache allows a response to be stored but forces the cache to revalidate it with the origin before each reuse. no-store prevents a cache from storing the response at all. A common use case for no-store is keeping sensitive data, such as login information or banking details, out of caches.

There are hundreds of tools available for generating requests and inspecting responses, including tools built into browsers like Chrome and Firefox. Below we’ll cover these browser-specific tools as well as other tools that make header inspection easy to perform and even to automate.
After you find an online request tool you like, it will quickly become one of your most frequented URLs. Online tools are easy to use and produce clean output. Plus, they’re free and full of functionality. While they don’t perform bulk requests like Request-as-a-Service (RaaS) tools, they excel at displaying the details of a single response in a streamlined way.
The ReqBin web app is separated into two columns (see below).
The first column is where you select a request type and a URL. In the example below, we send a GET request for an image published in an article by Techspot, a StackPath customer. Here we did not customize the HTTP request, but you can do so to probe for security problems and further test server performance.
The second column is where you can analyze the cache headers. For our example, we want to know whether the image is being delivered by the StackPath CDN. The HIT value for the X-Cache header indicates that it is.
Update: The original version of this article included other online tools but, upon further inspection, SoapUI has the best feature set and UX.
Need a break from the browser? Good news: request tools that run from your desktop are designed with developers in mind and offer more flexibility than their online counterparts. While slightly more difficult to use, they impose fewer limitations.
SoapUI is an open source desktop application. Despite its name, SoapUI supports both SOAP- and REST-based services, along with a variety of other web services and protocols.
With SoapUI, you can also build test cases. For example, imagine you want to test whether or not an image is reachable. You can do this in SoapUI by creating a TestSuite that contains one or more TestCases. Each TestCase contains multiple TestSteps that perform individual actions sequentially.
To validate the image’s availability, create an assertion. Click the green plus icon next to the Endpoint and select Invalid HTTP Status Codes under Compliance, Status, and Standards. For the codes to assert, enter “404” and click OK. When you run the test, the icon next to the assertion turns green if the image can be reached, indicating that the assertion passed.
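The same availability check can also be scripted with cURL. A hedged sketch, assuming curl is installed; the reachable function name is our own:

```shell
#!/bin/sh
# Minimal sketch: succeed when a URL does not return 404.
# curl's -w '%{http_code}' prints only the HTTP status code.
# The function name is our own, not part of any tool covered here.
reachable() {
  status=$(curl -s -o /dev/null -w '%{http_code}' "$1")
  [ "$status" != "404" ]
}

# Usage:
# reachable https://static.techspot.com/articles-info/1877/images/2019-07-16-image-2.jpg && echo reachable
```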
cURL is a command line tool to transfer data to or from a URL. A simple cURL command consists of the cURL command followed by a URL:
curl https://static.techspot.com/articles-info/1877/images/2019-07-16-image-2.jpg
By default, cURL returns the full body of the HTTP response. To display just the headers, include the -I parameter:
curl -I https://static.techspot.com/articles-info/1877/images/2019-07-16-image-2.jpg
With the -I parameter, you’ll see output similar to this:
HTTP/2 200
server: nginx
date: Wed, 17 Jul 2019 22:13:03 GMT
content-type: image/jpeg
content-length: 234952
last-modified: Wed, 17 Jul 2019 03:50:24 GMT
etag: "5d2e9b00-395c8"
cache-control: max-age=7776000
strict-transport-security: max-age=31536000
x-cache: HIT
accept-ranges: bytes
cURL supports additional features including password authentication, proxying, and file uploads. Because it’s a command-line tool, it can be easily inserted into a script for automated testing.
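As one sketch of that kind of scripting, the function below reads the headers of a curl -I response on stdin and prints the x-cache value. It strips carriage returns because HTTP header lines end in \r\n; the x_cache function name is our own.

```shell
#!/bin/sh
# Minimal sketch: extract the x-cache value (HIT or MISS) from response
# headers on stdin. tr removes the \r that ends each HTTP header line.
# The function name is our own.
x_cache() {
  tr -d '\r' | awk 'tolower($1) == "x-cache:" { print $2 }'
}

# Typical use:
# curl -sI https://static.techspot.com/articles-info/1877/images/2019-07-16-image-2.jpg | x_cache
```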
Knowing how to view response details in your preferred browser comes in handy when you come across an issue while browsing your website. Instead of launching an online or desktop tool, you can get the information you need directly from the browser. Also, these requests mimic a real user (you), rather than a synthetic system, thereby giving you more useful data.
Chromium, the open source project behind Chrome, provides a suite of utilities for capturing, tracking, and exporting HTTP requests. These tools are available to any Chromium-based browser including Chrome, Vivaldi, and newer versions of Opera.
To open DevTools in Chrome and Opera, click the menu icon in the top-right corner of the browser, then select More Tools > Developer Tools.
When DevTools opens, a new tab appears showing the page source. Click on the “Network” tab to open the Network Inspector that shows a waterfall view of requests. The Network Inspector starts capturing traffic once you open it so asynchronous or delayed requests might start to appear. You can stop the capture at any time by pressing the Record button in the top-left corner, and you can clear the current request log by clicking the adjacent Clear button.
Clicking on a request reveals additional details including headers, message bodies, and the time taken to process the request.
You can export requests as an HTTP Archive (HAR) file by right-clicking the request and selecting “Save as HAR with Content.” A HAR file is a JSON-formatted log of a web browser’s complete interaction with a web page. This file can be imported into other debugging tools such as The Coach.
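Since a HAR file is just JSON, you can take a quick look at one from the command line. This rough, grep-based sketch lists the request URLs; a real JSON parser would be more robust, and the capture.har filename is hypothetical.

```shell
#!/bin/sh
# Rough sketch: list the "url" values found in a HAR (JSON) file read from
# stdin. grep-based, so it is a quick look, not a proper JSON parse.
har_urls() {
  grep -o '"url": *"[^"]*"' | sed 's/.*: *"//; s/"$//'
}

# Typical use: har_urls < capture.har
```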
Firefox’s Network Monitor provides a comprehensive analysis and waterfall view of the most recent page load. To open the Network Monitor, click the menu icon in the top-right corner of the browser, then select Web Developer > Network.
Once the Network Monitor is open, open a web page or press the Reload button to begin the test. You can also press the stopwatch button in the lower-right corner to monitor ongoing requests such as asynchronous calls.
The Network Monitor lists every request made along with its type, size, retrieval time, retrieval method, and return status. Clicking on a request opens a new tab that lets you investigate the request’s headers, body, SSL/TLS settings, and any cookies associated with the request.
The Network Monitor loads the site twice: once with an empty browser cache and again with a primed cache. You can then compare the performance of both tests side-by-side by clicking on the stopwatch icon. If you want to return to the Monitor, press the Back button on the left side of the screen.
As with Chrome, you can export any and all performance data as a HAR file by right-clicking a request and selecting Save All As HAR.
Microsoft Edge and Internet Explorer come with a development suite known as F12 tools. You can open F12 tools by pressing the F12 key or by selecting “F12 Developer Tools” from the menu. As with other browsers, you can also open the suite by right-clicking on a web page and selecting Inspect Element.
F12 opens in a new window with the DOM Explorer selected by default. Click on the Network tab at the top of the screen (or press Ctrl-4). Requests appear in the list as they arrive. The controls at the top of the window let you start or stop recording, export captured traffic as a HAR file, enable or disable local caching, clear existing entries, and filter the results.
Clicking on a request opens a side pane with detailed information about the request. You can view headers, body content, GET parameters supplied to the page, cookies and timing.
Safari’s built-in development tool is known as the Web Inspector. To enable the Web Inspector, first enable the Develop menu by opening Safari’s preferences and selecting “Show Develop menu in menu bar” in the Advanced pane. You can then open the Web Inspector by selecting it through the Develop menu in the toolbar, or by pressing Command-Option-I.
By default, the Web Inspector appears as a docked window at the bottom of the screen.
The default screen shows the currently loaded resources. To review network requests, click the Timelines icon at the top left of the window. The timeline shows a waterfall view of network requests, as well as layout and JavaScript rendering times (represented by the orange and purple bars respectively).
You can begin recording a timeline by refreshing the page, or by pressing the record button in the top-left corner of the window. Clicking on a request opens a side pane with the request’s contents and associated metadata.