Supporting older operating systems costs money. It isn't necessarily a push to spur sales of new systems so much as a way to avoid the cost of making things work on old systems Microsoft has already ceased supporting. Just as Windows 2000 support has ended, so will support for Windows XP, Vista, Windows 7, and so on. Continuing to support the .NET Framework on operating systems that are no longer supported in any other way does not seem prudent.
EDIT: To address the notion that the restriction was artificial because the CLR is the same for .NET 2.0 and the newer framework versions: although the newer versions still run on the same CLR, that doesn't mean everything added on top of it would work well on Windows 2000. There are performance and hardware considerations, and given the age of Windows 2000 and some of the more intensive features added in the 3.0 and 3.5 frameworks, I think abandoning Windows 2000 was a reasonable decision.
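To be clear about what an "artificial" restriction amounts to in practice, here is a minimal sketch (not Microsoft's actual installer or runtime logic) of a setup-time OS version gate. The class name and the chosen minimum version are my own assumptions for illustration; the version numbers are simply the values Windows reports (5.0 for Windows 2000, 5.1 for XP):

    // A minimal sketch of an "artificial" restriction: nothing deep in the
    // runtime, just an explicit OS version check before setup proceeds.
    // Illustrative only; the minimum version chosen here is an assumption.
    using System;

    class OsVersionGate
    {
        static void Main()
        {
            // Windows 2000 reports version 5.0; Windows XP reports 5.1.
            Version minimum = new Version(5, 1);
            Version current = Environment.OSVersion.Version;

            if (current < minimum)
            {
                Console.WriteLine("This framework version is not supported on this operating system.");
                return;
            }

            Console.WriteLine("OS version check passed; continuing with setup.");
        }
    }

The point of the sketch is that blocking an OS is a deliberate, cheap check, but removing it wouldn't make the heavier 3.0/3.5 features run well on decade-old hardware and an unpatched kernel.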
Whenever we as developers consider supporting a particular user base, we have to weigh the resources needed to support it against the benefits of doing so. Testing, bug fixing, and support costs all have to be factored in. Since Windows 2000 no longer receives security updates, Microsoft would need to maintain an update mechanism just for .NET fixes. I suspect the benefits do not outweigh the costs in this scenario, so it makes sense to me that Microsoft would artificially prevent the newer frameworks from running on Windows 2000 and save itself these additional costs.