You may be missing a few things from what you are being taught:
1) Commonalities. How close is the OO code of Java, C++, C#, and VB.Net? Do you notice patterns in how the syntax changes from one language to another? Do you see how one language may do some things better than another, e.g. how Java's virtual machine or .Net's intermediate language brings both benefits and drawbacks? Do you see differences in how each language handles pointers?
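As one concrete example of the kind of difference worth noticing: Java makes instance methods virtual by default, while C# and C++ require explicit keywords. This is a minimal sketch in Java (the `Animal`/`Dog` classes are just illustrative names, not from any real codebase):

```java
// In Java, instance methods are virtual by default: the call below
// dispatches to Dog.speak() even through an Animal-typed reference.
// C# would need 'virtual' on Animal.speak and 'override' on Dog.speak;
// C++ likewise needs the 'virtual' keyword on the base method.
class Animal {
    String speak() { return "..."; }
}

class Dog extends Animal {
    @Override
    String speak() { return "Woof"; }
}

public class Dispatch {
    public static void main(String[] args) {
        Animal a = new Dog();          // base-type reference, derived object
        System.out.println(a.speak()); // prints "Woof", not "..."
    }
}
```

Porting this one snippet across the four languages is a quick way to see the syntax-translation patterns the point above describes.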
2) Different paradigms. While you may be taking a few different OO or procedural languages, there are also functional languages and scripting languages worth learning well enough to see where they are useful. There is also database and networking work where language choice matters: one can use C# for some of it, but I'm not sure I'd want to write my own database server from scratch in C#.
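To give a taste of what the functional paradigm looks like without leaving the languages already mentioned, here is a hedged sketch using Java's stream API: the same "sum of squares of the even numbers" written imperatively and declaratively. In a functional language like Haskell or F#, the second style is the default rather than an add-on.

```java
import java.util.List;

public class Paradigms {
    // Imperative style: say *how* to compute it, step by step.
    static int imperative(List<Integer> xs) {
        int sum = 0;
        for (int x : xs) {
            if (x % 2 == 0) sum += x * x;
        }
        return sum;
    }

    // Functional style: say *what* to compute as a pipeline of
    // filter / map / fold, with no mutable accumulator.
    static int functional(List<Integer> xs) {
        return xs.stream()
                 .filter(x -> x % 2 == 0) // keep the evens
                 .mapToInt(x -> x * x)    // square each one
                 .sum();                  // fold into a single value
    }

    public static void main(String[] args) {
        List<Integer> xs = List.of(1, 2, 3, 4, 5, 6);
        System.out.println(imperative(xs)); // 56 (4 + 16 + 36)
        System.out.println(functional(xs)); // 56
    }
}
```

Writing the same small task both ways is a cheap exercise for seeing what each paradigm buys you.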
3) Technology churn. .Net is hot now, but there was a time when Win32 was hot, and something else before that, and something else before that. My point is that new technologies will keep coming, and knowing how to pick one up quickly is a great skill, one I think you may be selling short by not seeking out diverse experience.
4) Even if you stick to one language, you may not get all the way through it. Take C# and all the different ways it can be used, e.g. WinForms, WebForms, ASP.Net MVC, Console Applications, WPF, and WCF, to name a few. Does anyone really know all of these thoroughly? OK, maybe Jon Skeet or Scott Guthrie, but are there mere mortals who know all of the powers of the language?