I have a situation with two codebases that need to stay intact.
For example: the old site http://mysite.com and a new site http://www.mysite.com
The old site (no WWW) supports some legacy code and has the rule:
User-agent: *
Disallow: /
But in the new version (with WWW) there is no robots.txt.
Is Google using the old (no WWW) robots.txt file as its rule? And will adding
User-agent: *
Allow: /
to the (WWW) side override this?
Changing robots.txt in the old codebase is not an option at this time.
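For reference, crawlers treat each hostname as a separate site, so the rules on http://mysite.com/robots.txt do not apply to http://www.mysite.com. Here is a small sketch using Python's urllib.robotparser to model the two rule sets described above (the URLs are the example hostnames from this question):

```python
from urllib.robotparser import RobotFileParser

# Rules currently served by the old host (no WWW): blocks everything.
old_rules = RobotFileParser()
old_rules.parse(["User-agent: *", "Disallow: /"])

# Rules proposed for the new host (with WWW): allows everything.
new_rules = RobotFileParser()
new_rules.parse(["User-agent: *", "Allow: /"])

# Each host is evaluated only against its own robots.txt.
print(old_rules.can_fetch("*", "http://mysite.com/page"))      # False
print(new_rules.can_fetch("*", "http://www.mysite.com/page"))  # True
```

Note that a host with no robots.txt at all (a 404 response) is also treated as allow-all, so the explicit Allow rule mainly serves to document the intent.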