I need dev and beta sites hosted on the same server as the production environment (let's let that fly for practical reasons).

To keep things simple, I can accept the same protections on both dev and beta -- basically, don't let them get spidered, and put something short of user names and passwords in place to prevent everyone and their brother from gaining access (again, there's a need to be practical). I realize that many people would want different permissions on dev than on beta, but that's not part of the requirements here.

Using a robots.txt file is a given, but then the question: should the additional host(s) (aka "subdomain(s)") be submitted to Google Webmaster Tools as an added preventive measure against inadvertent spidering? It should go without saying, but there will be no linking into the dev/beta sites directly, so you'd have to type in the address perfectly (with no augmentation by URL Rewrite or other assistance).
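For reference, the blanket robots.txt that keeps well-behaved spiders off the entire dev/beta host is just:

    User-agent: *
    Disallow: /

Keep in mind it only deters crawlers that choose to honor it.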

How could access be restricted to just our team? IP addresses won't work because of the various methods of internet access (meetings at lunch spots with wifi, etc.).

Perhaps the dev/beta and production sites could INCLUDE a small file (or call a component) that looks for a URL variable to be set (on the dev/beta sites) or does not look for it at all (on the production site). That way you could leave a different INCLUDE or component (named the same) on each site, and the source would otherwise not require a change when it moves from development to production.
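A minimal sketch of that idea, written here in Python/WSGI purely for illustration (the file name, URL variable, and value are all hypothetical; the real sites would use whatever include or component mechanism they already have):

    # gate.py -- the copy deployed to dev/beta; production carries a
    # same-named file whose require_key() simply returns True, so the
    # pages themselves never change between environments.
    from urllib.parse import parse_qs

    DEV_KEY = "devkey"       # hypothetical URL variable name
    DEV_VALUE = "letmein"    # hypothetical value shared with the team

    def require_key(environ):
        """Allow the request only when the expected URL variable is set."""
        qs = parse_qs(environ.get("QUERY_STRING", ""))
        return DEV_VALUE in qs.get(DEV_KEY, [])

Every page (or a shared wrapper) would call require_key() before rendering; swapping in the production copy of the gate file is the only per-environment difference.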

I really want to avoid full-on user authentication at any level (app or web server), and I realize that leaves things pretty open, but the goal is really just to prevent inadvertent browsing of pre-production sites.

+2  A: 

Usually I see web-server-based authentication with a single shared username and password for all users; this should be easy to set up. An interesting trick might be to check for a cookie instead, and then just have a better-hidden page to set that cookie. You can remove that page once everyone has visited it, or implement authentication just for that file, or allow access to it only from the office and require people working from home to use a VPN or visit the office if they clear their cookies.
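A rough sketch of the cookie gate as WSGI middleware in Python, just to make the flow concrete (the cookie name, token, and hidden path are all hypothetical):

    # The hidden page sets the cookie; every other request on the dev/beta
    # host is refused until that cookie is presented.
    from http.cookies import SimpleCookie

    GATE_COOKIE = "devaccess"         # hypothetical cookie name
    GATE_TOKEN = "long-random-token"  # hypothetical shared secret
    HIDDEN_PATH = "/team-gate"        # the "better hidden" page

    def cookie_gate(app):
        def middleware(environ, start_response):
            if environ.get("PATH_INFO") == HIDDEN_PATH:
                # Visiting the hidden page once plants the cookie.
                start_response("200 OK", [
                    ("Content-Type", "text/plain"),
                    ("Set-Cookie",
                     "%s=%s; Path=/; HttpOnly" % (GATE_COOKIE, GATE_TOKEN)),
                ])
                return [b"Cookie set; the dev site is now browsable."]
            cookies = SimpleCookie(environ.get("HTTP_COOKIE", ""))
            morsel = cookies.get(GATE_COOKIE)
            if morsel is not None and morsel.value == GATE_TOKEN:
                return app(environ, start_response)
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Not available."]
        return middleware

The same check could just as easily live in whatever shared include the sites already use; the point is that visiting the hidden page once replaces a login prompt.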

jjrv
+1, I like the cookie idea to eliminate the logins (good for exec-like team members), but still use web-server-based authentication. Good idea. Thanks.
Chris Adragna
+1  A: 

I have absolutely no idea if this is the "proper" way to go about it, but we place all Dev and Beta sites on very high port numbers that crawlers/spiders/indexers never visit (in fact, I can't think of any off the top of my head that go beyond port 80 unless they're following a direct link).

We then have a reference index page listing all of the sites with links to their respective port numbers, and only that page is password-protected. For sites involving real money transactions or other sensitive data, we display a short red bar at the top of the site explaining that it is just a demo server, on the off chance that someone goes directly to a dev URL and port.

The index page is also on a non-standard (!= 80) port. But even if a crawler were to reach it, it wouldn't get past the password input and would never find the direct links to all the other ports.

That way your developers can access the pages with direct URLs and ports, and they have the password-protected index as a backup should they forget.
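For what it's worth, here is a toy illustration of the non-standard-port idea in Python (the port number is arbitrary; a real deployment would simply bind the dev site to a high port in the web server's configuration):

    # Serve the current directory on a high port that crawlers never probe
    # on their own -- they generally only follow links, and nothing links here.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 48123), SimpleHTTPRequestHandler).serve_forever()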

Pewpewarrows
+1, true that it doesn't sound "proper," but it sure is a simple method. I think there is a lot of port-scanner activity, though. The bots might not hit it, but it probably gets some hack attempts (our IIS logs show a lot of attempts to administer phpMyAdmin, etc. -- even though we don't run PHP or MySQL at all).
Chris Adragna
I agree. As far as I know, the ports we use aren't used by any other mainstream product like phpMyAdmin, so unless someone is actively port-scanning, they wouldn't know to look there.
Pewpewarrows