Why Caching is a Vital Part of Website Performance

This is a guest post written by Roxana Elliott.

When it comes to website performance and speed, we have a saying: Cache is King. We believe that an effective cache setup is the number-one thing websites can do to serve content to visitors as quickly as possible, improve both front-end and back-end load times, and reduce stress on the website’s origin server. Below, we’ll go through where cached content can be stored and what types of content can be cached.

Browser Caching vs. Server-side Caching

There are two main locations where caches can store copies of images, HTML documents, and other elements that make up a web page:

1. On the visitor’s computer through use of a browser cache.

2. On a cache server that sits in front of the website’s origin server. This cache server can either be installed locally or provided through a content delivery solution, which will create caches in each of its locations around the globe.

With browser caching, when a visitor first goes to a web page, their browser will store items such as logos, CSS files, and images for a set amount of time. The next time that same visitor goes to that web page, they will already have many of the items needed to make up the page; this means that there won’t be a need to make as many requests back to the website origin server, resulting in a faster page load time. Browser caching is useful for repeat visitors, but still requires each new visitor to fetch all assets from the website origin server initially.
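How long a browser keeps an item is controlled by the `Cache-Control` response header. A response for a long-lived static asset might look like the following sketch (the values are illustrative, not a recommendation for every site):

```http
HTTP/1.1 200 OK
Content-Type: image/png
Cache-Control: public, max-age=31536000, immutable
```

Here `max-age` is expressed in seconds (one year in this example), and `immutable` tells the browser it never needs to revalidate the asset while it is still fresh.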

By contrast, a server-side cache setup serves multiple visitors from the same cache without requiring them all to make requests from the origin server, significantly reducing load on the origin server so even the first view of a web page is sped up. Server-side caches are a type of reverse proxy, because they act on behalf of the website server and intercept and serve visitors before they reach the website origin.
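With Varnish Cache, one of the reverse-proxy caches mentioned below, this setup is expressed in a short VCL file. The sketch below assumes the origin runs locally on port 8080 and applies a blanket one-hour TTL; a real configuration would vary TTLs by content type:

```vcl
vcl 4.1;

# The origin server the cache sits in front of (assumed address).
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

# Cache everything the origin returns for one hour (illustrative policy).
sub vcl_backend_response {
    set beresp.ttl = 1h;
}
```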

When server-side caching is implemented, the first visitor to a web page after the cache is expired will request content from the origin server, which is then served to the cache and to the visitor. Subsequent visitors will be served the cached content directly; the more content on a web page that is cached, the faster the page load time will be.
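The flow above can be sketched as a minimal in-memory cache with a time-to-live, a toy model rather than how Varnish or Nginx are implemented; the names `fetch_from_origin` and `get`, and the 60-second TTL, are invented for illustration:

```python
import time

CACHE_TTL = 60  # seconds a cached response stays fresh (hypothetical value)
_cache = {}     # url -> (stored_at, body)

def fetch_from_origin(url):
    # Stand-in for a real (slow) request to the origin server.
    return f"<html>content for {url}</html>"

def get(url, now=None):
    """Serve from cache while fresh; otherwise refill from the origin."""
    now = time.time() if now is None else now
    entry = _cache.get(url)
    if entry is not None and now - entry[0] < CACHE_TTL:
        return entry[1], "HIT"       # later visitors are served directly
    body = fetch_from_origin(url)    # first visitor after expiry pays the cost
    _cache[url] = (now, body)
    return body, "MISS"
```

The first request after expiry is a MISS that goes to the origin; every request within the TTL after that is a HIT served without touching the origin at all.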

What Types of Content Can Be Cached

The type of content that can safely be cached through a server-side cache solution such as Varnish Cache, Nginx, or Squid has evolved over the years. Although many websites and content delivery solutions use caches primarily to store static objects such as images, modern solutions allow for the caching of dynamic content, partially dynamic pages, and even HTML documents, which are the first pieces of information a browser must receive to begin building a web page. Below are three categories of content in relation to caching:

  1. Items that are frequently cached:
    • Static images
    • Logos and brand assets
    • Stylesheets
    • JavaScript that doesn’t change often (for example, Google tracking scripts)
  2. Items that can be cached but are often not:
    • HTML documents
    • JavaScript or other code that changes more frequently
    • API calls
  3. Files that should not be cached:
    • User-specific data such as order history and account information
    • Any sensitive data
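The three categories above map roughly onto `Cache-Control` directives. These values are illustrative; the right TTLs depend on the site:

```http
# 1. Static assets: cache for a long time, in browsers and shared caches
Cache-Control: public, max-age=31536000, immutable

# 2. HTML documents and API responses: short TTL on shared caches
Cache-Control: public, s-maxage=60

# 3. User-specific or sensitive data: never store
Cache-Control: no-store
```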

Of the above, it is the second category that can provide the most dramatic improvement in terms of website performance. Caching images and other static objects will certainly speed up page load time, but caching items such as full HTML documents is what can supercharge a website. HTML documents are especially important, as the download of the HTML document blocks the loading of the rest of the page. When HTML documents are cached, back-end load time and Time to First Byte are reduced, because fetching the HTML document from the origin server is often what delays the rest of the page from loading.

By caching as much content as possible, websites see speed improvements that are even greater than those achieved by serving content from servers closer to end-users. This is why we believe that Cache is King, and why we enable websites to cache all sorts of content through a Content Delivery Grid that gives developers control over cache configuration and the ability to test changes in a local environment before going live.