Web caching provides significant benefits when the requested assets are served from a nearby server. The cache hit ratio depends on the effectiveness of the caching system and is influenced by factors such as the cache policy, the number of cacheable objects, the size of the cache memory, and the expiry time of the object.
An efficient cache policy maximizes cache hits while minimizing cache misses, leading to a higher cache hit ratio, lower latency, and better resource utilization. A cache hit ratio of 90% or higher means that most requests are satisfied by the cache, while a value below 80% on static files usually points to inefficient caching caused by poor configuration.
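To make the idea of a cache policy concrete, here is a minimal sketch in Python of one common eviction policy, least recently used (LRU). This is an illustration only, not the policy any particular cache server uses; the class and method names are hypothetical.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used (LRU) cache: when the cache is full,
    the entry that has gone unused the longest is evicted first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()  # preserves access order

    def get(self, key):
        if key not in self.entries:
            return None  # cache miss
        self.entries.move_to_end(key)  # mark as most recently used
        return self.entries[key]       # cache hit

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            # Evict the least recently used entry
            self.entries.popitem(last=False)
```

A policy like this keeps frequently requested objects in the cache, which is exactly what drives the hit ratio up.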
Most cache servers include built-in tools that monitor metrics such as cache hits, cache misses, and the cache hit ratio. For instance, in the StackPath customer portal you can easily see how your CDN cache is performing in terms of cache hits and misses. We also provide the cache hit ratio, calculated as follows:
Cache hit ratio = [cache hits / (cache hits + cache misses)] × 100%
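The formula above is straightforward to apply in code. Here is a small Python sketch (not tied to any vendor's tooling) that computes the percentage from raw hit and miss counts:

```python
def cache_hit_ratio(hits, misses):
    """Cache hit ratio = hits / (hits + misses) * 100, as a percentage."""
    total = hits + misses
    if total == 0:
        return 0.0  # no requests yet, so no meaningful ratio
    return hits / total * 100

# Example: 900 hits and 100 misses give a 90% hit ratio
print(cache_hit_ratio(900, 100))  # 90.0
```

For instance, a server that answered 900 of 1,000 requests from cache has a hit ratio of 90%, right at the threshold described above.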
Facebook's users upload and view over 250 million photos per day. These photos are stored on Haystack machines optimized for photo storage, and Facebook delivers them through several photo-serving stacks with many layers of caches. In a study of the effectiveness of these caches, Facebook found that a high cache hit ratio greatly improves the browsing experience while reducing costs in terms of energy, bandwidth, and computation power. It is therefore important to monitor the effectiveness of the caching system and make adjustments to achieve a higher cache hit ratio.