Let's try a thought experiment and flip this around.
Suppose your university taught you the very latest technologies, methodologies, and tools -- the exact set of things that employers are looking for today. Where would this lead?
First, you'd be making a huge leap of faith that the things you learned in freshman year would still be marketable by the time you graduated three or four years later. Software technologies move very quickly. If you were taught, say, jQuery today, would that specific skill be useful in four years? To put this in perspective: was it useful four years ago?
How could your university possibly predict what will be the hottest technology at the time you graduate?
Second, let's say by some miracle both you and your university have timed things perfectly and you graduate with a set of skills and technologies that allow you to get a terrific first job right out of the gate. Because of your up-to-date skills, you can hit the ground running. A few years later, you start looking for another job. Will the skills you used in your first job be the same ones your next employer is seeking?
Maybe, but most likely not. And it's not just because technology marches relentlessly forward. Employers have different needs, different methodologies. Some use the latest tools and languages, others use ones that were popular ten or more years ago, and still others use ones that you've probably never heard of. Now you might be really steamed at your university.
Would you be willing to go back to school to learn the old stuff that you and your university disregarded and the new things that have come out since you graduated?
Or, as I suspect you might be thinking now, would you rather have learned that:
Despite their significant syntactic ("surface") differences, most programming languages in professional use are pretty much the same. The words and symbols are different, but the concepts are similar. If you learn two or three of them in school, you'll see the similarities among them and can apply that knowledge to the new ones that will become popular in the future.
Basic algorithms and data structures haven't changed in twenty years, so it doesn't matter what specific language you use to study them. If you study them using the universal language of mathematics, you'll be well prepared to implement them in any language of your choosing (there's a short sketch after these points that shows what I mean).
New technologies are never created out of thin air. They are always based on what came before. To be able to create something new, you must have a foundation in prior art. For example, to really understand Haskell (one of the hot languages du jour), you need to know about ML. But to understand ML, you probably need to study Lisp and Scheme. (Is Scheme the language your university used?) To really understand how Lisp/Scheme works, you'll need some assembly experience.
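To make that concrete, here's a minimal sketch of binary search, written straight from its mathematical description: maintain the invariant that the target, if it's present at all, lies in the half-open interval [lo, hi), and halve that interval on each pass. I happen to use Python below, and the function name and sample data are mine, purely for illustration -- but notice that nothing in the logic depends on the language. The same invariant translates almost line for line into C, Java, Scheme, or whatever your next employer happens to use.

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent.

    Invariant: if target is in items, its index lies in the half-open
    interval [lo, hi). Each pass through the loop halves that interval.
    """
    lo, hi = 0, len(items)
    while lo < hi:
        mid = (lo + hi) // 2
        if items[mid] < target:
            lo = mid + 1   # target, if present, is to the right of mid
        elif items[mid] > target:
            hi = mid       # target, if present, is to the left of mid
        else:
            return mid     # found it
    return -1              # the interval is empty; target isn't there


# Hypothetical sample data, just for illustration:
print(binary_search([2, 3, 5, 7, 11, 13], 7))   # 3
print(binary_search([2, 3, 5, 7, 11, 13], 4))   # -1
```

That's the payoff of studying the idea rather than the syntax: the hard part is the invariant, and the invariant doesn't care which language you write it in.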
My point is this: if all you know is "new" things, you might not be able to learn even newer things. But if you know what came before, you can see how we got to where we are today, and you'll be in a much better position to learn about where we're going tomorrow.