The ismypagecached testing tool can detect multiple layers of page caching, and that’s potentially a big time saver. But what are page caching layers, and why would developers or site owners look into that?
Modern web hosting technology and software give webmasters plenty of options for page caching, but this flexibility can also lead to misconfigurations that cause problems.
A typical issue many people encounter: they update their site content, but users don’t see any changes. Chances are that old pages were unintentionally cached and are still being served despite the recent changes.
Unfortunately, each caching mechanism tends to be unaware of other caching systems, and that’s why such situations are so common.
That’s when site owners and developers try to figure out what’s going on: which caches are active, and which might need purging. That’s exactly why our tool was built.
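As an illustration, this kind of detection can be sketched as looking for well-known fingerprints in the HTTP response headers. In the minimal sketch below, `cf-cache-status` (Cloudflare) and `x-litespeed-cache` (LiteSpeed) are real headers; `x-cache` and `x-proxy-cache` are common conventions, and the `x-cache-enabled` plugin header is a hypothetical placeholder. Real plugins, proxies, and CDNs vary, so treat the table as an assumption, not an exhaustive list.

```python
# Illustrative header fingerprints per caching layer (not exhaustive).
FINGERPRINTS = {
    "L0 (CMS plugin)": ["x-cache-enabled"],                    # hypothetical plugin header
    "L1 (webserver)":  ["x-litespeed-cache", "x-proxy-cache"], # LiteSpeed / NGINX convention
    "L2 (CDN)":        ["cf-cache-status", "x-cache"],         # Cloudflare / generic CDN
}

def detect_layers(headers: dict) -> list:
    """Return the caching layers hinted at by a response's headers."""
    present = {name.lower() for name in headers}
    return [layer for layer, names in FINGERPRINTS.items()
            if any(n in present for n in names)]
```

For example, a response carrying only `CF-Cache-Status` suggests that just the CDN layer is involved, while a response with both `X-LiteSpeed-Cache` and `CF-Cache-Status` suggests two layers are caching the same page.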
More than one page cache per site
It is common for server technology to offer more than a single page caching system, even though you should really avoid using more than one because it can cause page caching conflicts. There’s widespread misunderstanding of how caching mechanisms work, but we can hopefully help clarify all of this.
Note that when I use “server” I mean the hardware, and by “webserver” I mean Apache/NGINX/OLS, etc.
The best hosting services would let you install a page caching plugin, which runs within WordPress. The web server software (Apache/NGINX/OLS) might also have its own caching mechanism. Finally, your site might also cache pages on a content delivery network (CDN) such as Cloudflare, for example, using their APO service.
We call each of these page caching mechanisms “caching layers,” as they could be described as being on top of one another.
- Layer 0 (L0): CMS plugin caching (generally on disk)
- Layer 1 (L1): Web server (Apache/NGINX/OLS)
- Layer 2 (L2): CDN
It is possible that your pages are cached in all three locations simultaneously, making content purging a nightmare! Site editors will often not understand why their changes are not publicly visible, and they will call you, the developer.
Normally, content should be purged bottom-up: the CMS plugin cache first (Layer 0), then the webserver (Layer 1), then the CDN (Layer 2). However, people sometimes hit “purge Cloudflare” first, and before they can purge the webserver cache, a visitor requests the page; the CDN then fetches, serves, and re-caches the stale version still held by the webserver cache.
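The bottom-up purge order can be sketched as a small helper that walks the layers from lowest to highest, so no layer can re-fill itself from a stale copy held below it. The `purgers` mapping is a placeholder for your real plugin, webserver, and CDN purge calls, which this sketch deliberately does not assume anything about.

```python
# Purge the caching layers bottom-up (L0 -> L1 -> L2), so a higher
# layer can never re-cache a stale copy from a not-yet-purged lower one.
PURGE_ORDER = ("L0 plugin", "L1 webserver", "L2 CDN")

def purge_all(url: str, purgers: dict) -> list:
    """purgers: mapping of layer name -> purge function for that layer."""
    purged = []
    for layer in PURGE_ORDER:
        purgers[layer](url)   # call the layer-specific purge API
        purged.append(layer)
    return purged
```

Using it with three stub purge functions confirms that the plugin cache is always cleared before the webserver cache, and the CDN last.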
Layer 0: the CMS delivers the page
|Works on all hosting
|The slowest cache layer
|Easy to set up
There are many page caching plugins, and many of them write the cached pages to disk. It is a sensible approach, and it works on even the cheapest web hosting.
However, by default, the plugin’s own code delivers the page, which means the CMS engine must be executed for every request. That is still better than no caching, but it is not the most efficient approach.
The best approach for these plugins is to interface with Layer 1 and let the webserver (Apache/NGINX/OLS) deliver the cached content without invoking the CMS at all.
Layer 1: the webserver delivers the page
|The fastest on-server page caching
|Slightly more complex setup
Webservers are optimized to quickly deliver static content, and that’s exactly what cached pages are. Therefore, they are excellent at serving cached pages generated by the CMS.
That’s why it’s possible to configure the webserver to find and serve cached pages without even invoking the CMS code.
It works very simply: the webserver looks for a cached item based on the request URL. If the item exists in the cache storage, it serves it on the spot (a cache hit). If not, it invokes the CMS to generate the content (a cache miss).
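The hit/miss logic described above can be sketched in a few lines. This is a simplified model, not a real webserver: the URL-to-file mapping and the `render_with_cms` callable are illustrative assumptions (real setups use rewrite rules or directives such as NGINX's `try_files` against the plugin's cache directory).

```python
from pathlib import Path

def serve(url: str, cache_dir: Path, render_with_cms) -> tuple:
    """Serve a page from the cache store, falling back to the CMS."""
    # Map the request URL to a file in the cache store (illustrative scheme).
    cached = cache_dir / (url.strip("/") or "index") / "index.html"
    if cached.exists():
        return "HIT", cached.read_text()      # cache hit: static delivery
    html = render_with_cms(url)               # cache miss: run the CMS
    cached.parent.mkdir(parents=True, exist_ok=True)
    cached.write_text(html)                   # store the result for next time
    return "MISS", html
```

The first request for a URL is a miss and triggers the (expensive) CMS render; every subsequent request is a hit served straight from disk, with the CMS never invoked.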
Layer 1 is the fastest cache layer on a web server, but there’s an even better alternative… Layer 2!
Layer 2: there’s a world outside your server
|The absolute fastest caching
|Requires off-server orchestration to purge content
|Plugin support can be spotty
|More difficult to set up
Page caching can also happen outside your server. Content delivery networks (CDNs) are typically used to cache images, CSS, and JS files, but they can cache HTML content as well.
But the most important aspect of a CDN is that it is a network of cache stores, and any request is fulfilled by the closest node of that network. This drastically reduces the average Time To First Byte (TTFB) for users physically far away from your site’s server.
The downside of CDNs is that they can be more complex to understand and configure, but it is worth learning about how to use them as they are key to near-infinite user concurrency scaling.
To avoid cache collisions, I recommend using only one caching layer, preferably the highest-level one you can. When you use the tool and spot multiple caching mechanisms, just make sure that caching works correctly at the higher layers.
Note that Cloudflare is often spotted as not caching HTML pages. That’s normal: by default, it is set up to cache only static assets such as images, CSS, and JS.
In any case, page caching is very important, and any page caching is better than no page caching.