SEO Tip 10: Make sure it's valid!
Search engines use web-bots to read the content of your pages: computer programs that download web pages and try to make sense of them. Bots are often not as forgiving as browsers, so errors on a page can cause a bot to ignore that page (or parts of it) altogether. Bots also don’t understand JavaScript, so a page that is accessible to visitors might not be accessible to a search engine.
So:
- Ensure that each page is fully functional when JavaScript is disabled in the browser. If navigation links disappear or stop working without JavaScript, you can be sure that those same links will not be accessible to web-bots either. That in turn can leave parts of your site never being seen, and thus with no possibility of being returned on a search engine results page. (A simple way to check what a bot sees is sketched after this list.)
- Ensure that each page validates with no errors: test the page against http://validator.w3.org/ and address every error or warning that it reports. Addressing an error or warning does not necessarily mean fixing it, although that would be the ideal; consider each one, decide whether it is significant, and whether there is something you can do about it. Be aware that some of the errors the validator reports can also prevent web-bots from fully extracting information from the page. (A sketch of an automated validation check also follows this list.)
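To get a rough idea of what a non-JavaScript bot sees, you can fetch a page and list only the links present in the raw HTML. The sketch below uses nothing but Python's standard library; the URL is a placeholder, and a real crawler does considerably more than this, so treat it as a quick check rather than a faithful simulation.

```python
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag found in the raw HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def links_without_javascript(url):
    """Return the links a non-JavaScript client would find in the page source."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    return collector.links


if __name__ == "__main__":
    # Placeholder URL: substitute a page from your own site.
    for href in links_without_javascript("https://example.com/"):
        print(href)
```

Any navigation link that is only injected by JavaScript will be missing from this output, which is exactly what the first point above is warning about.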
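Validation can be automated too. The W3C's Nu HTML Checker (the engine behind validator.w3.org) offers a JSON interface; the sketch below assumes its documented `doc` and `out=json` parameters and simply counts and prints the messages it returns. It is a starting point, not a substitute for reading the validator's full report.

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

CHECKER = "https://validator.w3.org/nu/"


def validation_messages(page_url):
    """Ask the Nu HTML Checker to validate page_url and return its messages."""
    query = urlencode({"doc": page_url, "out": "json"})
    request = Request(CHECKER + "?" + query,
                      headers={"User-Agent": "validation-check/0.1"})
    with urlopen(request) as response:
        report = json.load(response)
    return report.get("messages", [])


if __name__ == "__main__":
    # Placeholder URL: substitute a page from your own site.
    messages = validation_messages("https://example.com/")
    errors = [m for m in messages if m.get("type") == "error"]
    print(f"{len(errors)} errors, {len(messages) - len(errors)} other messages")
    for m in messages:
        print(m.get("type"), "-", m.get("message"))
```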
This tip is part of a series of tips on Search Engine Optimisation (SEO). The previous tip in the series was Tip 9: Be careful with those outbound links.