It is very attractive to run Windows Server 2008 R2 with Hyper-V as your desktop / workstation OS, because you can then host test servers on the same machine. If you are developing for an x64 server environment, this setup looks especially appealing.
But there is a serious problem: Hyper-V causes certain video driver operations in the host OS to run dramatically more slowly. Windows Server 2008 R2, with any remotely modern graphics card, uses accelerated video operations for lots of simple things like scrolling, moving, and resizing windows, and those operations crawl when Hyper-V is enabled. Not just a little slower, but slow enough to be genuinely aggravating.
That makes Windows Server 2008 R2 with Hyper-V a poor choice as a desktop / workstation OS. With Hyper-V enabled, it should really be used only as a server OS whose console is touched rarely, because the console's video performance will be horrific.
Microsoft is aware of this problem, but it isn't widely known. Here's a blog entry with some information, which links to other sources: http://blogs.msdn.com/virtual_pc_guy/archive/2009/08/21/hyper-v-versus-desktop-computing.aspx
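If you only need Hyper-V occasionally, a widely circulated workaround is to dual-boot the same installation: keep a second boot entry with the hypervisor disabled, and boot into it when you want full video performance. A sketch using the standard bcdedit commands, run from an elevated command prompt; the GUID placeholder stands for whatever identifier the /copy command prints:

```
rem Clone the current boot entry; bcdedit prints the new entry's GUID.
bcdedit /copy {current} /d "Windows Server 2008 R2 - no Hyper-V"

rem Disable the hypervisor for that entry (substitute the GUID printed above).
bcdedit /set {GUID-from-previous-command} hypervisorlaunchtype off
```

Rebooting into the "no Hyper-V" entry restores normal video performance; your VMs simply won't be able to start until you boot back into the original entry.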
(Windows Server 2008 (i.e., not R2) doesn't suffer from this problem to the same degree, because it does not use accelerated video operations for common tasks like window scrolling, moving, and resizing. The underlying problem is still present, but it isn't triggered as easily or as often, so it is much less of a hindrance.)