Practicality vs. Theory would be high on my agenda.
I would talk to them about how to write descriptions, draw diagrams on whiteboards, and file bugs/questions.
All the rest is just typing ;)
Seriously, the ability to express oneself coherently makes all the difference in real-world software engineering. If you can write clean and concise design documents, judge the level of documentation an API needs to be easy to use without being daunting, and get answers quickly on sites like this one, you'll be streets ahead of those who can't string two sentences together.
To generalize some of the other questions, I'd teach less of Computer Science, and more of Software Engineering. Most of these CS majors will be doing Software Engineering for most of their careers, and it would be nice if the colleges got them started (and didn't wait for Business to do so).
More focus on process that is applicable in today's world. Covering waterfall in great depth is likely not as useful as it once was.
It's important to actually develop functioning applications aside from class projects, in whatever domain interests you. Develop a mobile application, a web-based game, a music player, or whatever else you find appealing. This will allow you to use source control and to begin to learn how to organize a larger code base.
I'd like to impress upon them that they don't know everything, neither do I, and none of us ever will. In short, your code sucks, my code sucks, and that is never going to change. Once they get off their high horse and realize their inefficiencies while kicking their ego out the door, development will be so much easier for everyone involved.
Near the end of their studies, I would put them in a year-long course where they build a big application together, with two releases.
The teacher would act as lead developer, guiding the design and teaching them how to use version control and issue trackers, how to polish an application for release, and so on.
Not that I think any university would actually do such a thing. I know I would've liked it.
The contents of great books such as The Mythical Man Month, The Pragmatic Programmer and Clean Code. Those were more useful to me than half of the subjects in CS.
Debugging (for a first course in CS).
I am still constantly amazed that the vast majority of students are not taught what to do when a program they've written doesn't work. You see a lot of "twist this" and "adjust that", followed by a prayer that it magically works now.
Basic debugging is not rocket science, but it's a specific methodology, and it's not intuitive. Yet teachers and professors seem to assume that working out on paper what the program should do, observing what it actually does, and then tracking down the reason for any discrepancy between the two should somehow come naturally to students. (And it does, for about 10% of them.)
Spending an hour or two teaching how to debug would save students dozens of hours and probably decrease the drop rate substantially. (Note that debugging theory doesn't require a sophisticated debugger: print statements do just as well for basic debugging.)
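To make that concrete, here's a minimal sketch of the expected-vs-actual method using nothing but print statements. The average function and its bug are invented purely for illustration:

```python
# Suppose a student's average() "magically" gives the wrong answer.
def average(grades):
    total = 0
    for g in grades:
        total += g
    return total / len(grades) - 1   # the bug: a stray "- 1"

grades = [80, 90, 100]

# Step 1: work out on paper what the program SHOULD produce: (80+90+100)/3 = 90.
expected = 90

# Step 2: look at what it ACTUALLY produces.
actual = average(grades)
print("expected:", expected, "actual:", actual)   # expected: 90 actual: 89.0

# Step 3: print intermediate values until you find where the two diverge.
total = 0
for g in grades:
    total += g
    print("after adding", g, "total =", total)    # 80, 170, 270 -- all correct,
                                                  # so the bug must be in the return line
```

Nothing fancy, but it's exactly the calculate/observe/localize loop that 90% of students never get shown.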
If you can't figure out HOW to use source control, you picked the wrong major. If you don't think you need source control, that's why you're still in school.
Formatting code is really important.
In many situations, you will be the most technically savvy person at the table, so don't abuse the Technical Trump Card too often to shoot down other ideas. You don't like it when the CFO keeps saying everything is too expensive, either.
Go back to your code every 6 months, and if you don't ask yourself "WTF was I thinking?", you've stopped learning, so get started. And when you're taking over the other guy's code, remember that you're probably not under the same time pressure he had to endure, because it was imperative that the feature be available the next morning at 7:00 AM for the VC firm balancing their $100 million decision on it (at least, that's what they told him).
Most programmers in real life do not wear costumes to work like they do on TV/movies. If you get credited with programming some super successful application, you can get away with being eccentric; otherwise, you're just an unemployed slob.
Having taught at college for several years, I'm sure that 1) I have a very different view than most people here, and 2) my vote probably doesn't count for much, being seen as "part of the problem".
However, my view is that the thing that is most needed is flexibility. It's not important to teach specific tools and specific languages, because in 5 years those will all be outdated. It's important to teach general techniques and ideas.
So: explain why version control is important, and maybe show them how to use a version control system. Teach them SEVERAL different languages, so that when the next Python or Ruby or Sequoia comes along, they're ready. Learning how to pick up a book and learn what you need in an hour is vital, because the thing you ABSOLUTELY need to know 5 years from now may not exist yet.
The alternative is to turn CS programs in college into technical schools, the equivalent of car mechanics.
(1) Working in a team... tough to do in school because of the whole how-do-you-calc-a-grade problem, but it certainly needs to be done
(2) Working with REAL APIs that don't do what you expect, or that are buggy (see the sketch after this list). I understand that MIT does this sort of thing by teaching students how to control a robot with existing Python libraries
(3) Handling the Awkward Squad, which might be better framed as learning what purity is, why it matters, and where you want it
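On point (2): classroom APIs always behave, so students never learn to write the defensive wrapper that real ones force on you. Here's a purely hypothetical sketch of what that looks like; FlakySensor and read_distance are invented names standing in for a real driver, not any actual MIT robot library:

```python
import random
import time

class FlakySensor:
    """Stand-in for a real device driver: sometimes errors out, sometimes lies."""
    def read(self):
        roll = random.random()
        if roll < 0.2:
            raise IOError("bus timeout")      # real hardware drops off the bus
        if roll < 0.4:
            return -1                         # "impossible" value on a bad read
        return random.uniform(0.0, 200.0)     # a plausible distance in cm

def read_distance(sensor, retries=10):
    """Retry and validate until we get a reading we can actually trust."""
    for _ in range(retries):
        try:
            value = sensor.read()             # the documented call
        except IOError:
            time.sleep(0.05)                  # back off and try again
            continue
        # The docs promise centimeters >= 0; the driver doesn't always deliver.
        if value is not None and value >= 0:
            return value
    raise RuntimeError("no usable reading after %d tries" % retries)

print(read_distance(FlakySensor()))
```

None of the retry/validate boilerplate shows up in a textbook exercise, and all of it shows up on the job.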