In a recent interview, Google engineer Matt Cutts explained how Google plans to rank websites in the future; in particular, he hinted that rankings will somehow be affected by the time a website takes to load.
So SEO won’t be the only “trick” a website can rely on; the magic now shifts to other factors, such as the real ability of a webmaster (or web agency) to build a smart website in terms of code efficiency, and a good hosting company with enough bandwidth and server RAM (to cite just a couple of important factors).
Cutts also revealed a new plug-in for Firefox, called Page Speed, that integrates into the well-known Firebug and adds a new tab that measures page speed across a list of different factors.
Like most SEOs and webmasters out there, I ran some tests too, and I was surprised to find an item about caching in the issue list.
HTTP caching allows a page’s resources (images, scripts, stylesheets and so on) to be saved (cached) by a browser (locally, according to its settings) or by a proxy, making downloads faster. Once a resource is cached, a browser can use the first available local copy instead of downloading it again during the next session.
Using caching cuts round-trip time by eliminating numerous HTTP requests for resources the browser already has, which substantially reduces the total payload of the responses and significantly lowers bandwidth and hosting costs for your site.
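To see why this saves round trips, here is a minimal sketch in Python of how a cache decides that a stored copy is still fresh, so no request is needed at all. The function name and the simplified freshness rule (only `max-age` is checked) are my own illustration, not a full implementation of the HTTP rules:

```python
import time

def is_fresh(cached_at, cache_control, now=None):
    """Return True if a cached response is still fresh under max-age.

    cached_at: UNIX timestamp at which the response was stored.
    cache_control: value of the Cache-Control response header.
    """
    now = time.time() if now is None else now
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
            return (now - cached_at) <= max_age
    # No max-age: treat as stale (a real cache would also check Expires).
    return False

# A copy cached 60 seconds ago under "max-age=3600" needs no new request:
print(is_fresh(cached_at=1000, cache_control="max-age=3600", now=1060))  # True
# The same copy under "max-age=30" is stale and must be re-fetched:
print(is_fresh(cached_at=1000, cache_control="max-age=30", now=1060))    # False
```

Every `True` here is one HTTP request (and its full response payload) that never hits your server.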
So caching seems to be an aspect every web player should take care of. Problems arise, though, with the tons of websites that still don’t use a server-side technology (which lets you control practically everything, including setting the response headers very easily), and with people who don’t have direct access to the server configuration to set up the headers in a couple of clicks. And no, that is not the META tag in the HTML page, as a friend of mine claimed today, even though such tags are sometimes involved.
HTML Meta Tags and HTTP Headers
Since I just mentioned them, let me clarify that last statement. It’s true that HTML authors can put an Expires meta tag in the document’s head section, but for caching purposes this meta tag is useless.
That’s because it’s honored only by a few browser caches (which actually read the HTML), and not by proxies (which almost never read the HTML in the document).
And if you are considering the Pragma meta tag instead, it won’t necessarily cause the page to be kept fresh either.
True HTTP headers, on the other hand, give you a lot of control over how both browser caches and proxies handle your web page. They can’t be seen in the HTML and are usually generated automatically by the web server.
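To make the distinction concrete, here is a minimal sketch in Python of the two response headers involved; the helper name is hypothetical, but `formatdate` from the standard library does produce the GMT date format that the `Expires` header requires:

```python
import time
from email.utils import formatdate

def caching_headers(max_age=86400):
    """Build real HTTP caching headers for a response.

    These travel in the response itself, invisible in the HTML,
    so both browser caches and proxies can honor them.
    """
    return {
        "Cache-Control": "public, max-age=%d" % max_age,
        # An absolute expiry date in HTTP (RFC 1123, GMT) format:
        "Expires": formatdate(time.time() + max_age, usegmt=True),
    }

headers = caching_headers(3600)
print(headers["Cache-Control"])  # public, max-age=3600
print(headers["Expires"])        # e.g. "Thu, 18 Jun 2009 10:00:00 GMT"
```

Compare that with `<meta http-equiv="Expires" ...>`: the meta tag is buried in the HTML body that proxies never parse, while these headers are part of the response envelope itself.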
The solution is…
So, if caching really does become a factor worth pursuing, it will be beneficial for all of you to choose a hosting company, or to develop your web pages with a server-side technology, that lets you change the headers very easily.
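As a rough illustration of what “changing the headers” means on the server side, here is a minimal, self-contained sketch using Python’s built-in `http.server`; any real server-side technology (PHP, Apache’s mod_expires, and so on) does the equivalent:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class CachingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Hello</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        # The header that matters to browser caches and proxies:
        self.send_header("Cache-Control", "public, max-age=86400")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Port 0 asks the OS for any free port, so the demo runs anywhere.
server = HTTPServer(("localhost", 0), CachingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://localhost:%d/" % server.server_address[1]
with urllib.request.urlopen(url) as resp:
    cache_control = resp.headers["Cache-Control"]
print(cache_control)  # public, max-age=86400
server.shutdown()
```

Notice that the caching instruction never appears in the HTML at all; it lives in the response headers, exactly where proxies look for it.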
UPDATE: After some days of frustration, I developed a real obsession with web server speed and started figuring out how to solve the server caching issue mentioned above; you can find the solution in my Italian article.
Before concluding, let me point you to another interesting paper released by Yahoo! on how to speed up web pages.