The fallacy of validation

posted by Kelsey 06 September 2010

    Validating a website has become standard practice among designers over the last few years. As a site nears completion, the designer tests the site against the W3C Validator service to ensure that the code meets the standards, allowing them to proudly display a badge declaring their competency.

    I have no real problem with this, except that displaying the badge provides no value to anyone but the designer's ego.

    The real problem lies with the claim that validation is inherently valuable as an SEO tactic. Sites like Google, Amazon and a plethora of other high-ranking sites stand as proof positive that validation alone will not ensure your site ranks highly.

    So why is validation touted as an SEO technique?

    The assumption that validation is required for positive SEO is an example of mistaken cause and effect.

    Keyword density provides an easy way to explain this. Google doesn't care about keyword density; it cares that the content on the page is relevant to the user's search. Keyword density is not an efficient measure of relevance.

    If it were, a page simply stuffed full of keywords would rank more highly than a well-structured page that users spend time reading.

    However, Google uses bounce rate as one of the metrics that determine relevance: if users immediately leave the page, it's deemed irrelevant to that particular search and will be shown less frequently in future searches. Naturally occurring keywords within good content therefore matter far more than an artificial metric such as keyword density.

    The same is true of code validation.

    I once used a Joomla template that was a very close match to the initial concepts I'd produced, so I decided the very cool extra features justified the cost. Even better, it validated out of the box.

    However, using the theme as a designer was an absolute nightmare - it was as if someone had taken the concept of nested tables from the 90s and decided it was time to reproduce that horrible mess using modern code.

    Even worse, the actual content was buried within dozens of nested divs and, in the view of search engines, was placed dead last, after every other text element on the page.
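    A hypothetical sketch of that kind of markup (not the actual template's code) - every element here is valid, yet the content search engines care about arrives last in the source order, wrapped in purely presentational divs:

```html
<!-- Hypothetical reconstruction: valid markup, terrible content structure -->
<div class="wrap">
  <div class="topbar"><a href="/">Home</a> <a href="/about">About</a></div>
  <div class="sidebar"><a href="/archive">Archive</a> <a href="/links">Links</a></div>
  <div class="ads">Advertising block</div>
  <!-- The real content only appears after everything else on the page -->
  <div class="layout"><div class="inner"><div class="pad">
    <h1>The actual article heading</h1>
    <p>The actual article content, dead last in source order.</p>
  </div></div></div>
</div>
```

    A validator will pass this without complaint; it checks syntax, not where the meaningful content sits.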

    So the template validated perfectly while still violating one of the core SEO rules regarding code structure.

    Conversely, it only takes one minor issue that has zero effect on content structure for the site to fail validation. 
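    For instance (a minimal illustration, not drawn from any specific site), a single unescaped ampersand in a link is enough to fail validation, even though it has no effect whatsoever on content structure or on how crawlers see the page:

```html
<!-- Fails validation: a bare "&" must be written as "&amp;" -->
<a href="/products?cat=shoes&sort=price">Shoes by price</a>

<!-- Passes validation; identical to users and crawlers alike -->
<a href="/products?cat=shoes&amp;sort=price">Shoes by price</a>
```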

    Validation was never designed with SEO in mind. It's a measure of how well code stacks up against an arbitrary set of standards. The idea was that every browser developer would stick to those standards, so validated sites would display identically across all browsers.

    At this, validation fails miserably; it simply does not live up to its own purpose.

    So why should we expect it to be a valid metric for SEO, a decidedly secondary purpose, when it can't even guarantee the primary outcome of its very reason for being?

    By itself, it's not a valid metric. It is reasonable to say that a site that passes validation has well-formed code, and that a designer who writes well-formed code may be savvy enough to structure content placement appropriately and follow good SEO practice. That's not necessarily the case, but it's often a fair assumption.

    However, it's not reasonable to assume that because a site doesn't validate, its structure can't support good SEO practice or is inherently bad for SEO. The code may be perfectly fine for SEO or it could be a disaster area: validation can't tell you which.

    A great example of validating sites utterly failing to provide any SEO benefit is the Flash splash page. Such pages can validate perfectly yet convey no PageRank benefit whatsoever, and they actively impede users (and search engines) from reaching the actual content - something that most definitely does impact PageRank.

    Sites will absolutely not be penalised on PageRank for validation issues alone. It takes much deeper problems with your content, and with the ability of search engines to view that content, for PageRank to be affected.

    Validation won't help in the slightest if your content is irrelevant or hidden behind poor information structure.

    Focus on being relevant to your users and on ensuring that search engines can easily traverse your site; that's how you'll build PageRank.
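    A minimal sketch of the structure this article argues for - meaningful content early in the source order, with plain, crawlable links (the names and text here are illustrative only):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <title>A relevant, descriptive title</title>
</head>
<body>
  <!-- Substantive content appears first in source order -->
  <h1>The page's actual topic</h1>
  <p>Relevant content that users came here to read.</p>
  <!-- Plain anchor links that any crawler can follow -->
  <ul>
    <li><a href="/related-article">A related article</a></li>
  </ul>
</body>
</html>
```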

    About the author

    Kelsey

    Kelsey Brookes is a professional designer, online strategist and writer.

    From the late 90s Kelsey managed the multimedia and film courses at the prestigious Computer Graphics College, Sydney and eventually founded the Melbourne chapter of the college.

    At the same time, Kelsey was a feature writer for Digital Media World magazine, interviewing subjects from the Australian and overseas film and production industries.

    Since 1999 Kelsey has managed thinksync, providing design, online strategy and marketing services to clients around Australia.
