As mentioned before, knowing how to learn is likely the number one skill I'd note; be aware of how you use it to pick up new things over the next ten years. If you look at the past ten years as a guide: who in 1998 would have predicted that Google would become the behemoth it is, that Microsoft would stagnate for so long making Vista, or that Y2K wouldn't really be the end-of-the-world day it was hyped to be? Problem-solving skills and design patterns are a couple of other things worth knowing that can help bring someone to the next level.
Additionally, here is my list of things to know or watch:
Concurrency and parallelism. Multi-core CPUs are becoming more common, and how software gets written to take advantage of them will be something to explore over the next decade (there's a small sketch of splitting work across cores after this list). I also wonder how close the 80-core chip Intel showed off within the past year is to reality.
Netbooks. These are still in their infancy, but it will be interesting to see how much of a smartphone and how much of a laptop get put into future netbooks, and whether Microsoft will make a special Windows operating system for something between Windows Mobile and Windows Vista. Will Apple enter this market, and how close to an iPod would its entry be? How well can these little machines run resource-hungry sites, or a browser with lots of tabs eating up memory?
Rich Internet Applications. Flex and Silverlight are a couple of things starting to get some attention, and they will likely mature over the next ten years, but they could just as easily end up as roadkill under something else from Google, or even from Microsoft itself if it decides to put more into ASP.Net instead of Silverlight or to combine the two somehow.
Cloud computing/SaaS. This sort of ties into the previous one, but I think it will be interesting to see what Microsoft, Oracle, Google, and SAP do with the idea of "cloud computing" or "Software as a Service"; it is still very early to tell how these will evolve.
CPU/GPU fusion and the GPGPU. nVidia, AMD, and Intel are all spending tons of money in their graphics divisions, and a couple of projects carry high expectations: AMD's Fusion and Intel's Larrabee. nVidia's CUDA shouldn't be ruled out as something that could become a bigger deal either. Lastly, there is whatever Microsoft cooks up in DirectX 11 and future generations of graphics processing; it may be interesting to see what develops there.
Virtualization. Companies are replacing 4-5 servers with one physical machine that runs 4-5 virtual ones. This is still somewhat in its infancy, and companies are still working out how to harness it in the server space, but there may be desktop uses too; I'd be curious to see whether some developers want this so they can simulate a full client/server environment on one physical machine.
Grid computing. This is sort of like the cloud, only with many, many computers volunteered to chip away at hard computational problems for other fields. The World Community Grid has a few projects going on now, and I wonder what kinds of results they will have in another 4-5 years as the grid grows.
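To make the concurrency and parallelism item a bit more concrete, here is a minimal sketch of spreading CPU-bound work across however many cores a machine has, using Java's ExecutorService. The workload (summing chunks of an array) and the class name are purely illustrative, not any particular product's API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSum {
    public static void main(String[] args) throws Exception {
        // One worker thread per core the JVM can see.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        final long[] data = new long[10000000];
        for (int i = 0; i < data.length; i++) {
            data[i] = i;
        }

        // Split the array into one chunk per core and sum each chunk on its own thread.
        int chunkSize = (data.length + cores - 1) / cores;
        List<Future<Long>> partials = new ArrayList<Future<Long>>();
        for (int start = 0; start < data.length; start += chunkSize) {
            final int from = start;
            final int to = Math.min(start + chunkSize, data.length);
            partials.add(pool.submit(new Callable<Long>() {
                public Long call() {
                    long sum = 0;
                    for (int i = from; i < to; i++) {
                        sum += data[i];
                    }
                    return sum;
                }
            }));
        }

        // Combine the partial results into the final answer.
        long total = 0;
        for (Future<Long> f : partials) {
            total += f.get();
        }
        pool.shutdown();
        System.out.println("Sum = " + total);
    }
}
```

The interesting part over the next decade is less this mechanical splitting and more whether languages and libraries make that decomposition automatic instead of something you hand-roll.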
This is of course going to be out of date in 2-3 years as other new technologies come to light. Just thinking about how much memory has expanded in the past decade may scare some people. I remember in 1998, on my first job out of university, having a Pentium II 266 MHz with 64 MB of RAM and a 4 GB hard drive that I had to split into 2 partitions since NT 4.0 wouldn't take all 4 GB in one partition. Now I've got a Core 2 Duo e6750 2.66 GHz machine with 2 GB of RAM and a 160 GB hard drive in a single partition. Memory grew roughly thirty-fold in that decade, from 64 MB to 2 GB, so the question isn't absurd: will I have a machine with 80 GB of RAM in 2018? Who knows....