views: 46
answers: 2

Hi,

I have used robots.txt to restrict one of the folders on my site. The folder contains sites that are still under construction, and Google has indexed all of those sites while they are in the testing phase, so I added a robots.txt file. I first submitted the site, and robots.txt is enabled; the status for www.mysite.com/robots.txt is now "success". But Google is still listing those test links. Here is the code I have written for robots.txt:

User-agent: *
Disallow: /foldername/

Can anyone suggest what the problem may be? Thanks in advance.
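One quick way to confirm that this rule actually blocks Googlebot from /foldername/ is to run it through a robots.txt parser. The sketch below uses Python's standard urllib.robotparser; the hostname and the sample page path are only placeholders taken from the question:

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt (placeholder hostname from the question).
rp = RobotFileParser()
rp.set_url("http://www.mysite.com/robots.txt")
rp.read()

# can_fetch() returns False when the named user agent is disallowed from the URL.
allowed = rp.can_fetch("Googlebot", "http://www.mysite.com/foldername/test-page.html")
print("Googlebot blocked from /foldername/:", not allowed)

If this prints True, the robots.txt syntax is fine, and the remaining question is only how long Google takes to drop the pages it has already indexed.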

A: 
Mondain
Thanks, Mondain. I think robots.txt was downloaded successfully. On the Test robots.txt tab there are three columns, robots.txt file, Downloaded and Status, and they show www.mysite.com/robots.txt, 17 hours ago and 200 (Success) respectively. I think the crawl is done.
A: 

See Requesting Removal of Content from our Index on the Google Webmaster Blog. You can expedite the removal process by submitting a removal request through Google Webmaster Tools; otherwise, the pages will eventually be dropped from the index when they are recrawled (i.e. updating a robots.txt file does not take effect immediately; the change takes place on subsequent crawls).

Michael Aaron Safyan
Thank you, Michael. I used this too. My removal URL is www.mysite.com/foldername/, and its status is "removed", but it is still not working.
I would wait just a bit... there may be some lag before it takes effect.
Michael Aaron Safyan