Google raises new suggestion about rankings?

In a recent interview, Google engineer Matt Cutts discussed how Google may rank web sites in the future; in particular, he hinted that rankings could somehow be affected by the time a website takes to load.
So SEO won’t be the only “trick” a web site can rely on; the magic will now shift to other factors, such as the real ability of a webmaster (or web agency) to build a smart web site (in terms of code efficiency), as well as a good hosting company with enough bandwidth and server RAM (to cite just a couple of important factors).

Cutts also revealed a new plug-in for Firefox, called Page Speed, that integrates into the well-known Firebug, adding a new tab that measures page speed against a list of different factors.
Like most SEOs and webmasters out there, I ran some tests as well, and I was surprised to find an item about caching in the issue list.

As you probably already know, most web pages include resources that change infrequently, such as CSS files, image files, JavaScript files, PDFs, and so on. These resources take time to download over the network, which increases the time it takes to load a web page.

HTTP caching allows these resources to be saved (cached) somewhere by a browser (locally, according to its settings) or by a proxy, making the download faster. Once a resource is cached, the browser can use the first available local copy instead of downloading it again on the next visit.
Using the cache means reducing round-trip time by eliminating numerous HTTP requests for required resources, which substantially reduces the total payload size of the responses and significantly cuts the bandwidth and hosting costs for your site.
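To make that round-trip saving concrete, here is a minimal sketch (in Python, with purely illustrative names; the original post contains no code) of the revalidation dance: the first request downloads the full resource, while a follow-up request carrying `If-None-Match` gets back a tiny 304 with no payload at all.

```python
import http.server
import threading
import urllib.error
import urllib.request

BODY = b"body { color: #333; }"  # a CSS file that rarely changes
ETAG = '"v1"'

class CacheAwareHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("If-None-Match") == ETAG:
            self.send_response(304)          # not modified: no payload travels
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("ETag", ETAG)
            self.send_header("Content-Length", str(len(BODY)))
            self.end_headers()
            self.wfile.write(BODY)

    def log_message(self, *args):            # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), CacheAwareHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/style.css" % server.server_address[1]

first = urllib.request.urlopen(url)          # full download, as on a first visit
payload = first.read()
etag = first.headers["ETag"]

req = urllib.request.Request(url, headers={"If-None-Match": etag})
try:
    urllib.request.urlopen(req)
    revalidated = 200                        # would mean the cache was ignored
except urllib.error.HTTPError as e:
    revalidated = e.code                     # urllib surfaces 304 as an HTTPError

server.shutdown()
```

The second exchange transfers only headers, which is exactly the bandwidth saving the paragraph above describes.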

So caching seems to be one aspect every web player should take care of. Problems arise, though, when we consider the tons of web sites that still don’t use any server-side technology (which lets you control practically everything and set up the response headers very easily), or the people who don’t have direct access to the server configuration to set up the headers in a couple of clicks (and no, that’s not the META tag in the HTML page, as a friend of mine claimed today, even though it is sometimes involved).

HTML Meta Tags and HTTP Headers

Since I mentioned them, I’d better clarify my previous statement. It’s true that HTML authors can put an Expires meta tag in the document’s head section, but for caching purposes this meta tag is useless.

That’s because it’s only honored by a few browser caches (which actually read the HTML) and not by proxies (which almost never read the HTML at all).

And if you are considering using the Pragma meta tag instead, it won’t necessarily keep the page fresh either.

On the other hand, true HTTP headers give you a lot of control over how both browser caches and proxies handle your web page. They can’t be seen in the HTML, and they are usually generated automatically by the web server.
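As a quick illustration of the point (again a Python sketch with assumed names, not anything from the original post): the caching directives travel in the HTTP response headers, on the wire, while the HTML body itself carries no trace of them.

```python
import email.utils
import http.server
import threading
import time
import urllib.request

HTML = b"<html><head><title>demo</title></head><body>hello</body></html>"

class HeaderHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # the caching directives live here, in the response, not in the markup
        self.send_header("Cache-Control", "public, max-age=86400")
        self.send_header("Expires",
                         email.utils.formatdate(time.time() + 86400, usegmt=True))
        self.send_header("Content-Length", str(len(HTML)))
        self.end_headers()
        self.wfile.write(HTML)

    def log_message(self, *args):
        pass

srv = http.server.HTTPServer(("127.0.0.1", 0), HeaderHandler)
threading.Thread(target=srv.serve_forever, daemon=True).start()

resp = urllib.request.urlopen("http://127.0.0.1:%d/" % srv.server_address[1])
cache_control = resp.headers["Cache-Control"]
body = resp.read()
srv.shutdown()
```

A proxy sitting between browser and server sees these headers without ever parsing the document, which is why they work where the meta tag does not.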


The solution is…

I thought about many different solutions to the problem, but none of them was directly applicable to a static web page and its embedded resources like CSS and JavaScript.

The only decent thing I was able to find is a solution for downloadable files like PDFs, PPTs, or documents in general, which can be requested through a simple piece of JavaScript that uses an XMLHttpRequest while contextually modifying the response headers. Unfortunately, this solution only works for internal links, because any external link that points directly to the file will automatically get the default headers set up on the server.

So, if caching really becomes an issue worth pursuing, it will pay off for all of you to choose a hosting company, or to develop your web pages with a server-side technology, that lets you change the headers very easily.
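With a server-side layer in place, per-resource caching policies become almost trivial. Here is a hedged sketch as a tiny Python WSGI application (the paths, policy values, and function names are purely illustrative, not anything prescribed by the post):

```python
# Long-lived static assets get a far-future cache lifetime; everything else
# must be revalidated on every visit.
LONG_LIVED = (".css", ".js", ".png", ".pdf")

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path.endswith(LONG_LIVED):
        # static assets: let browsers and proxies keep them for a year
        cache = "public, max-age=31536000"
    else:
        # dynamic pages: force a check with the server each time
        cache = "no-cache"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Cache-Control", cache)])
    return [b"ok"]

# Calling the app directly with a fake environ shows the headers it would emit.
captured = {}
def fake_start_response(status, headers):
    captured["status"], captured["headers"] = status, dict(headers)

body = b"".join(app({"PATH_INFO": "/style.css"}, fake_start_response))
css_cache = captured["headers"]["Cache-Control"]
app({"PATH_INFO": "/index"}, fake_start_response)
page_cache = captured["headers"]["Cache-Control"]
```

The same few lines would take a handful of clicks (or none at all) on a host that exposes no configuration, which is exactly the gap described above.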

UPDATE: After some days of frustration, I developed a real obsession with web server speed and figured out how to solve the server caching issue mentioned above; you can find the solution in my Italian article.

Before concluding, let me point to another interesting paper released by Yahoo! on how to speed up web pages.

Technorati Tags: http, header, caching, google

6 thoughts on “Google raises new suggestion about rankings?”

  1. I personally think people should keep in mind that Google uses over 200 factors in the crawl – rank – return process – the latest Matt Cutts pronouncement always needs to be seen in this bigger perspective (as I’m sure Matt would say if asked).

    Credit should be given to Yahoo – they came out with YSlow a while back.

  2. Hi Sean, and welcome to my blog.
    I really appreciate that you found the time to participate with a comment.

    I totally agree about the 200 (or more) factors; in fact I’m not saying the opposite, or simply "hey, use this and your web site will rank". It was just a personal opinion about their new suggestion.

    You are right, Yahoo! should possibly be credited for the speed angle, but we must note that they generally aren’t so forthcoming when announcing key factors about their search engine, and the last time they were (meta keywords) they made a big mistake.

  3. @Sean, that’s right, I agree. Furthermore, I think that if 10 web pages all have the same kind of ‘on-page’ optimization and the same "number" of good "in-links"… then "loading speed" could be useful for SEO… and so after "content is the king" we could think: "server is the prince" 😉

  4. I don’t think it will be a major factor, but like a lot of guidelines set by big G there is some sense to it, especially with the increasing use of the internet via mobile/USB dongles/GSM networks, which have much slower connections than xDSL.

  5. Hi Chris,

    and welcome to my blog. I totally agree with you: it’s just one factor to be aware of rather than a big issue. In any case, it’s not something to underestimate.

Comments are closed.