HTML Validation
How much, do you imagine, does the HTML validation of your pages affect SE bots and indexing? Or do you think that it plays no factor at all? I've hit a couple of small link lists today where the little validation thing-a-ma-jig on my browser reported that it pretty much gave up due to so many page errors. I'd think shit like that would have some level of effect on the bots.
|
It can have a lot of effect, especially on a page as bad as the one you described. It's bad enough to screw yourself on the SEO end, but the chances of the page being viewed as intended dramatically decrease if you do get any traffic.
|
Quote:
to and things like that. Very handy tool to clean up your own pages. |
Thanks! :)
|
Yep, I validate every page and every CSS file with the FF Web Developer extension... very useful tool
|
It is my belief that extremely broken HTML will cause extreme problems.
What the browser does and how a Google bot reads a page are totally different. Google uses a lot of Python, so we'll assume they use Python for their bot. Python has an SGML parser which takes your page, dissects it into a tree structure, then goes to work on that. An automated process will get confused by badly mis-nested or unclosed tags. Depending on how they are parsing, I would suspect you might lose the effect of the affected tags entirely. Now, Google probably goes to all lengths to make sure they can spider the web to the best of their ability, but why gamble on that? |
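The point above about tree parsing can be sketched with Python's standard-library `html.parser` (the old `sgmllib` mentioned in the Python 2 era no longer exists in Python 3). This is purely an illustration of how an automated parser trips over mis-nested tags, not anything Google actually runs:

```python
from html.parser import HTMLParser

class TagChecker(HTMLParser):
    """Walk a page the way a simple bot might and flag mismatched tags."""
    VOID = {"br", "img", "hr", "meta", "link", "input"}  # tags with no close

    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []   # mismatches we stumbled over

    def handle_starttag(self, tag, attrs):
        if tag not in self.VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            # A closing tag that doesn't match what is open: the parser
            # has to guess, and the tag's intended effect may be lost.
            self.errors.append(f"unexpected </{tag}>, open tags: {self.stack}")

checker = TagChecker()
checker.feed("<html><body><b>bold <i>both</b> italic?</i></body></html>")
print(checker.errors)
```

The mis-nested `</b>` forces the parser to guess, which is exactly the kind of ambiguity you don't want a spider resolving for you.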
For you IE users, this is a great tool; I turn it off and on as I need it :)
Internet Explorer Developer Toolbar
-- Explore and modify the document object model (DOM) of a Web page.
-- Locate and select specific elements on a Web page through a variety of techniques.
-- Selectively disable Internet Explorer settings.
-- View HTML object class names, IDs, and details such as link paths, tab index values, and access keys.
-- Outline tables, table cells, images, or selected tags.
-- Validate HTML, CSS, WAI, and RSS Web feed links.
-- Display image dimensions, file sizes, path information, and alternate (ALT) text.
-- Immediately resize the browser window to a new resolution.
-- Selectively clear the browser cache and saved cookies. Choose from all objects or those associated with a given domain.
-- Choose direct links to W3C specification references, the Internet Explorer team weblog (blog), and other resources.
-- Display a fully featured design ruler to help accurately align and measure objects on your pages.
-- You can now selectively enable and disable CSS parsing.
-- The Misc menu contains a color picker.
-- Several link reports are available.
-- When you select an element in the DOM element tree list, the selected element scrolls into view if it is not already visible in the browser window. |
Outstanding! Thanks MeatPounder. :)
|
Quote:
If you're asking SHOULD I validate my HTML pages, my answer is: yeah, why chance it? |
I do know that in the old days (LOL, a few years ago) the original googlebot would definitely fall short if you had bad HTML like CD34 posted above - and validation was sometimes the difference between the ranking positions for some keywords.
Since that time, Google has two new bots that don't seem to be "as affected," but for the minimal time it takes to validate (something I'm real bad about) it would seem like a silly thing not to do - let alone making sure that the HTML will render well for a Firefox user vs. an IE user |
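One quick way to see why "why chance it?" is the right attitude: a strict parser simply refuses broken markup outright. Sketched here with Python's standard-library `xml.etree.ElementTree` as an illustration of a worst-case strict consumer, not a claim about what any bot actually does:

```python
import xml.etree.ElementTree as ET

# Mis-nested tags like the example earlier in the thread.
broken = "<html><body><b>bold <i>both</b> italic?</i></body></html>"

try:
    ET.fromstring(broken)       # strict XML parse of the page
except ET.ParseError as e:
    print("strict parser gave up:", e)
```

A lenient parser will guess its way through the same markup; a strict one bails immediately. Validating first means you never have to wonder which kind is reading your page.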
Powered by vBulletin® Version 3.8.1
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
© Greenguy Marketing Inc