I didn't upgrade to Vista until May or so, and one of the things I've always heard from developers I know in real life is, "The first thing you should do is turn off that UAC crap."
Well, I've left it on this whole time for a few reasons. First, as a failsafe in case I do something idiotic like have a momentary lapse of reason and run an attachment from an email, or in case I visit a site that hits some unpatched exploit. Second, as a bit of an experiment to see how good or bad it really is.
Finally, I figure it enforces some better practices. I used to develop every website in Windows directly in inetpub\wwwroot (Visual Studio .NET 2003 more or less required this), but now I develop them elsewhere because the UAC clickfest is a nightmare. I figure this is Microsoft's way of saying, "You should really be doing it this way."
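For what it's worth, "elsewhere" in my case just means pointing an IIS application at a folder under my own profile, which is a one-time elevated command. The site name and paths below are specific to my setup, and I'm going from memory on the appcmd syntax:

    :: one-time, from an elevated prompt: map /mysite to a folder I own
    %windir%\system32\inetsrv\appcmd add app /site.name:"Default Web Site" /path:/mysite /physicalPath:C:\dev\mysite

After that, day-to-day edits happen in C:\dev\mysite with no prompts at all.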
By way of another analogy: if you wrote a web app that runs just fine on XP and 2000 but requires turning off 50 different security features on Server 2003, the real solution might be to fix the application so it doesn't require those features to be turned off in the first place.
But now I'm working with an app that is really NOT designed to be developed outside of inetpub\wwwroot, so UAC is a genuine nuisance, and rectifying this is beyond the scope of the project. I want to stick to my guns and leave UAC on, but I'm also worried about going on autopilot and clicking "Yes" or "Allow" three times every time I need to modify a file.
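The least-bad workarounds I've found so far, and I'm honestly not sure either is the intended one, both amount to paying the UAC toll once from an elevated prompt instead of three clicks per file save. The myapp name and C:\dev path below are just placeholders for my setup:

    :: option 1: grant my own account modify rights on the app's folder,
    :: so ordinary file edits there no longer trigger elevation
    icacls C:\inetpub\wwwroot\myapp /grant %USERNAME%:(OI)(CI)M

    :: option 2: keep the source in an unprotected folder and junction it
    :: into wwwroot, so the app still thinks it lives where it insists on living
    mklink /J C:\inetpub\wwwroot\myapp C:\dev\myapp

The junction feels cleaner to me since the files never actually sit in a protected location, but I'd love to hear if there's a better way.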
Am I just being hard-headed? Do most developers on Vista leave UAC on or off? And for the situation described above, is there a better/easier way?