views:

82

answers:

7

I'm starting to focus on SEO for my site, and a number of SEO tools have mentioned that valid CSS and markup are something search engines factor into rankings.

My site renders the way I want it to in browsers. However, since I'm using -moz-border-radius and -webkit-border-radius quite extensively, I'm getting a lot of errors when I run my CSS through the W3C validator.
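For reference, a rule like the following (the selector and radius values are illustrative, not from my actual stylesheet) is the kind that trips the validator: the `-moz-` and `-webkit-` properties are vendor extensions, so a CSS 2.1 validation profile flags them as errors even though browsers handle them fine:

```css
/* Rounded corners via vendor prefixes; the prefixed properties are
   flagged by the W3C validator because they are vendor extensions,
   not part of the CSS 2.1 specification. */
.rounded-box {
    -moz-border-radius: 8px;    /* Gecko (Firefox) */
    -webkit-border-radius: 8px; /* WebKit (Safari, Chrome) */
    border-radius: 8px;         /* standard CSS3 property, listed last so it wins */
}
```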

How much will this impact my SEO efforts?

A: 

CSS only makes things look pretty. Search engines don't render pages, so your invalid CSS won't even be seen by them.

Coronatus
+1  A: 

If it's just layout, you won't be affected.

Most spiders see your website as plain text (source code) and don't try to render anything.

Pierre 303
But they certainly can (and do, from what I understand) crawl referenced CSS files, looking for shenanigans like hidden text.
Michael Petrotta
Yes, probably, but I don't know of any that render them.
Pierre 303
A: 

You may find these articles interesting:

Sarfraz
+2  A: 

Hi,
A highly accurate front end may also be a criterion for search engines. If you're getting errors in your HTML or CSS, check the encoding, the XML version, and the CSS version specified, and do correct those errors.
Some things I want to say about SEO:

  • Never customize a site just for a search engine.
  • Develop your site for the user; take care that the web page is displayed
    correctly in all web browsers.
  • Search engine spiders work mainly in this direction.
  • For example, Google's guidelines for its spider say: "Make pages primarily for users, not for search engines. Don't deceive your users or present different content to search engines than you display to users, which is commonly referred to as 'cloaking.'

Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, 'Does this help my users? Would I do this if search engines didn't exist?'"

All search engine spiders try to find the best page, one that works well in all respects.
You cannot cheat Google; give first importance to the user and your site will automatically rank near the top.

Noddy Cha
+1. That's 100% true. Common sense. But even today, thousands of people are successfully cheating Google and the other engines. However, it's a professional job; most ordinary webmasters shouldn't even try it.
Pierre 303
I thought this would be a great video for the people who think "SEO is all about following the 10 specifications listed on some site to fake search spiders". Please stop that! "You Can't Fake Search Bots" http://www.youtube.com/watch?v=uTPM3U3Cs9Y -> watch this video to get away from the SEO madness!
Noddy Cha
A: 

HTML is the major factor in your site's SEO, but CSS plays a greater role than most people realize.

Many designers end up writing invalid or very hacky, non-semantic markup to get a design working on their site. While this may look fine to the user (who typically doesn't give a rat's arse about semantic markup), it often produces a less than spectacular response in terms of your search engine rankings.

A secondary, less widely recognized way that non-semantic or invalid markup can affect the end user is the case of someone using a screen reader. The way you organize your page's code affects how the reader interprets it, which in the worst case can make your site unusable to anyone relying on one.

Remember:

  • Keep your markup semantic and valid
  • Avoid hacky code whenever possible
  • CSS doesn't directly affect SEO
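A minimal sketch of the difference (the headings and content here are invented for illustration): semantic markup uses elements that describe the content, which both crawlers and screen readers can interpret, while presentation-driven markup tells them nothing:

```html
<!-- Non-semantic: presentation-driven, meaningless to a crawler or screen reader -->
<div class="big-bold">Latest News</div>
<div>First item</div>
<div>Second item</div>

<!-- Semantic: structure that a crawler or screen reader can interpret -->
<h2>Latest News</h2>
<ul>
  <li>First item</li>
  <li>Second item</li>
</ul>
```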
Glenn Nelson
A: 

From the Search Engine Optimisation FAQ on SitePoint.

Does validation help your ranking?

Short answer: No.

Longer answer: No. But having a webpage that validates is a good idea. A webpage that has been validated to a W3C standard contains no errors and therefore can be easily parsed and understood by the search engine crawlers. An invalid webpage runs the risk of being misinterpreted or just not read at all.

EnderMB
A: 

Moe,

I'm developing crawlers (spiders). My spiders are never interested in CSS. If your site contains data useful to many people, some of them may write spiders to download it. If you want to help crawlers download from your site (and visit many of its pages), focus on making the data easy to download; the easier it is, the more crawlers will use your site. A useful way to achieve this is to give your site an easy-to-parse structure.

Some good examples of easy-to-parse sites (I'm pasting their URLs): http://www.rightboat.com/search.php?Search=&SubmitSearch=Search

http://www.boatshed.com/dosearch.php?form_boattype=&form_currency=1&form_price=&form_length=&form_length_unit=m&form_motor=&form_catid=&form_hullcon=&form_fuel=&form_drive=&form_enginenum=&form_totalberths=&form_brand=&form_country=&form_boatref=&newboats=0

Best regards, Lajos Arpad.

Lajos Arpad