One reason we have Windows development platforms (even though our production is on Linux or Solaris) is to have a common environment for everyone.
That means all the different populations involved in building a piece of software:
- are not all developers (business and functional people also need a working environment)
- are all on the same platform (Windows)
- all use the same tools to write and communicate (Word, PowerPoint, ...)
- can have that same environment on their laptops
In short: uniformity of environment for all (developers and non-developers alike).
The other reason is depreciation: it is easier to manage depreciation for PCs, where the support services are lighter than for a full-scale Unix server (like a Sun Fire, an F15K or F50K, ...): the latter needs expensive support contracts (like "bronze", "silver" or "gold", depending on the level needed). A PC is easier to fix or replace, and it is not as critical if a developer "messes up" on it and crashes it utterly ;)
That being said, the downside is that you do not change PCs every day: it means managing a large fleet of desktops, and you cannot just decide to upgrade on a whim (and that goes for the OS too).
So where the other answers are all about "virtual machines", when your fleet of PCs dates from 2003, with only 40 GB of hard drive and 1, maybe 2 GB of memory..., you realize "virtualization" is not always an obvious solution.
Hence, some Unix "integration" servers are required for developers to test their products in an environment closer to the target. In a way, this is better, since those integration servers are managed in a uniform way, avoiding the "it works for me™" syndrome, as opposed to virtual machines, where each developer is the root/administrator of his or her own little world/server ;).
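To give an idea of the workflow (a minimal sketch, not our actual tooling: the host name, paths and test command are made up, and it assumes rsync/ssh are available from the Windows PC, e.g. through Cygwin), a developer pushes a build to the shared integration server and runs the tests there rather than locally:

```python
#!/usr/bin/env python3
"""Sketch: sync a local build to a shared Unix integration server and run
the test suite there, so tests execute in an environment close to the
production target instead of on the developer's Windows PC.
Host, paths and commands below are placeholders, not real infrastructure."""
import subprocess
import sys

INTEGRATION_HOST = "integ01.example.com"   # hypothetical shared server
REMOTE_DIR = "/opt/builds/myapp"           # hypothetical staging directory

def deploy_and_test(local_build_dir: str) -> int:
    # Copy the build output to the integration server (rsync over ssh).
    subprocess.run(
        ["rsync", "-az", "--delete", local_build_dir + "/",
         f"{INTEGRATION_HOST}:{REMOTE_DIR}/"],
        check=True,
    )
    # Run the tests remotely; the exit code tells us pass/fail.
    result = subprocess.run(
        ["ssh", INTEGRATION_HOST, f"cd {REMOTE_DIR} && ./run_tests.sh"]
    )
    return result.returncode

if __name__ == "__main__":
    sys.exit(deploy_and_test(sys.argv[1] if len(sys.argv) > 1 else "build"))
```

Because everyone runs against the same centrally managed servers, the results are reproducible across the team, which is exactly the point.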