views: 322

answers: 2

There's a way of excluding complete page(s) from Google's indexing. But is there a way to specifically exclude certain part(s) of a web page from Google's crawling? For example, exclude the sidebar, which usually contains unrelated content?

+1  A: 

If you're doing this for AdSense, here's an article on how to exclude content from the scraper. If you don't want Google to follow links, you can give them a rel="nofollow" attribute. Otherwise, I'm afraid you may be out of luck here.
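For reference, and assuming the missing link points to Google's AdSense "section targeting" documentation, that exclusion works with HTML comment markers around the content the ad scraper should ignore; note these markers only affect ad targeting, not search indexing. The nofollow hint is just an attribute on the link. A minimal sketch:

&lt;!-- google_ad_section_start(weight=ignore) --&gt;
&lt;div id="sidebar"&gt;
  &lt;!-- content the AdSense scraper should ignore when picking ads --&gt;
&lt;/div&gt;
&lt;!-- google_ad_section_end --&gt;

&lt;!-- tell Google not to follow this link --&gt;
&lt;a href="http://example.com/unrelated" rel="nofollow"&gt;Unrelated link&lt;/a&gt;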

Something else you could do, but I wouldn't necessarily recommend doing, is detecting the user agent before rendering your page, and if it's a spider or bot, not showing the portions of your page you want to exclude.
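A minimal sketch of that user-agent check in classic ASP (assuming the page is an .asp file like the sidebar example below; the bot names checked are illustrative only). Serving crawlers different content than users is cloaking and can get a site penalized, so treat this as an illustration of the idea rather than a recommendation:

&lt;%
' Detect common crawlers from the User-Agent header (illustrative list only)
Dim ua, isBot
ua = LCase(Request.ServerVariables("HTTP_USER_AGENT") &amp; "")
isBot = (InStr(ua, "googlebot") &gt; 0) Or (InStr(ua, "msnbot") &gt; 0)
%&gt;
&lt;% If Not isBot Then %&gt;
  &lt;div id="sidebar"&gt;
    &lt;!-- sidebar shown only to regular visitors, hidden from bots --&gt;
  &lt;/div&gt;
&lt;% End If %&gt;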

Jason
Cloaking is a very dangerous technique.
Rinzi
Hence the "but I wouldn't necessarily recommend doing" part of my statement :)
Jason
+2  A: 

You can put the part of the page that you want to hide from Googlebot into a separate file, include it with an IFRAME tag, and then block indexing of that file in the robots.txt file.

Add an iframe to include the sidebar in your page:

&lt;iframe src="sidebar.asp" width="100%" height="300"&gt;&lt;/iframe&gt;

Here are the rules to add to the robots.txt file to block the spider:

User-agent: *
Disallow: /sidebar.asp
Rinzi
This is generally a good mechanism, but it may have downsides for normal users.
Jason
This does look good; however, my sidebar is dynamic and hard to separate out.
bryantsai
Looks like unless Google explicitly supports it, like the exclusion supported by AdSense, this is the only way ...
bryantsai