Usually, we install VS.NET on our production server, to solve problems easily with our product, if necessary.
Is this a good or bad idea?
Both. One of the upsides is that it can sometimes make diagnosing problems easier. One of the downsides is that sometimes the install can break a working web application. I would only do it if I had to.
Edit: another potential downside to this occurs when a co-worker decides to debug a live process on the production server, stops the app on a breakpoint, and then goes to bed without realizing he has left the app unavailable. Yes, this has happened to me.
Debugging and development should be done in a "safe" environment - something that is not mission critical. For example, you should have a development and/or QA server that you use for development and debugging.
EDIT: Your QA server should mirror your production server, so that you are able to debug in an environment similar, if not identical, to your production environment.
Depends on how you're using it. Most of the heavy work it's designed for shouldn't be necessary on a production server. I usually install Notepad++ on the production server for editing XML, config files, and the like. I'd say if you want to install VS, go for it.
The recommended way is to have a mirror of the production server for testing/debugging purposes. To keep the mirror current, you need to install all application updates on both servers at the same time, and back up the production database and restore it on the mirror nightly or on demand.
There are still some downsides, like errors that only happen under heavy load. In that case you need some kind of logging to trace them. You might also need to purchase additional licenses for third-party components installed on the production mirror.
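As a sketch of the nightly refresh, assuming SQL Server and hypothetical database and share names (MyAppDb, \\backupshare), the backup/restore pair could look like this in T-SQL:

```sql
-- On the production server (scheduled nightly):
BACKUP DATABASE MyAppDb
    TO DISK = N'\\backupshare\MyAppDb.bak'
    WITH INIT;          -- overwrite the previous backup set

-- On the mirror server, after copying or sharing the file:
RESTORE DATABASE MyAppDb
    FROM DISK = N'\\backupshare\MyAppDb.bak'
    WITH REPLACE;       -- overwrite the existing mirror copy
```

In practice you may also need `WITH MOVE` clauses on the restore if the mirror's data and log file paths differ from production's.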
I would never do anything directly on the production server that would require Visual Studio. It's too risky to make a change on production that doesn't make it back into the code base and thus into version control. Eventually, you'll end up reintroducing a bug that you thought you had solved, because you only changed it on the production server. Occasionally I will update markup or XML files on the production server, but only after having made the change in development and tested it on the QA box, and only when no actual code is involved.
Of course it depends on the product and what you're debugging. Generally speaking you should try to avoid it, but there are some cases where you can't exactly duplicate a scenario on a testing server, and it might be useful to attach a debugger to a running process to quickly narrow down a problem. For example, if you're running an MMORPG server and there's an intermittent bug that occurs under certain load conditions, you can spend weeks or months trying to figure it out from log files and/or simulated connections on a testing server, or you can attach a debugger while the problem is occurring in real time on the production server and figure it out in an hour.
I would tend to treat this as an exceptional case though, and do as much debugging off of the production server as is possible and reasonable.
I would not think this is a good idea. Your code should have sufficient logging so that if there is an issue in production you can go back through the logs and determine what has happened and fix it in a development environment, then test it in a staging/uat environment before being pushed to production.
At the place I work, developers are not allowed access to any production environment; that is handled by the server/network teams, but that is because it is a large business. At smaller firms the developers will have access, but that doesn't mean you should use it for debugging.
Have a look at Sasha Goldshtein's Production Debugging series on his blog. He has some great walkthroughs and screencasts about what can be done to debug without Visual Studio.
It really depends on whether you accept the risks that the installation brings with it. You should also ask yourself what the most difficult part of debugging the issue without VS on the server actually is.
That is definitely not acceptable. First of all, for debugging you can always use the Debugging Tools for Windows (WinDbg). It supports .NET debugging as well, via the SOS ("Son of Strike") extension, and with a cheat sheet it is not that hard to use. WinDbg can run from a USB stick without installation.
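As a rough illustration of that cheat sheet, assuming .NET 4 or later (the module name differs for older CLRs, e.g. `mscorwks` on .NET 2), a typical SOS session after attaching WinDbg to the worker process might start like:

```
0:000> .loadby sos clr       ; load the SOS extension from the loaded CLR
0:000> !threads              ; list managed threads
0:000> !clrstack             ; managed call stack for the current thread
0:000> !pe                   ; print the current exception, if any
0:000> !dumpheap -stat       ; summarize the managed heap by type
```

The same commands work against a memory dump taken on the server, which avoids pausing the live process at all.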
Second, and more importantly: whenever you install any of the newer versions of VS, it registers its debugger for interactive mode - you get a message box when an unhandled exception occurs. You need to manually edit the registry to revert to the default behavior after the installation - and nobody ever remembers that.
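The registration in question lives under the AeDebug key (the exact default `Debugger` value varies by Windows version, so export the key before changing anything). A sketch of inspecting and disabling it from an elevated command prompt:

```
rem Inspect the just-in-time debugger registration (on 64-bit Windows a
rem second copy exists under SOFTWARE\Wow6432Node for 32-bit processes):
reg query "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug"

rem Stop the registered debugger from being invoked automatically:
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug" /v Auto /t REG_SZ /d 0 /f
```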
Don't do it. There are better ways for diagnosing issues.
It is generally considered not to be a "Best Practice" to install Visual Studio on production servers. It may introduce security risks - but one big concern I would have is performance. Visual Studio uses a tremendous amount of resources, and running your debugging there can significantly impact the performance of production applications.