Hi all. Things in other countries are probably like they are here in Brazil: at the end of a Computer Science degree you have to develop, document and present a final project in some area. I'm about to finish the course and I still haven't settled on a project. I've actually started several, but I could never decide which one to stick with, since none of them was interesting enough. I'm having trouble finding a nice, interesting and emerging area in CS. I wouldn't like to make a project that is just an exercise in using a language, like a portal or an e-commerce site; the language must be the tool, not the point. So I keep coming back to this question: which areas in CS have been emerging in recent years? I'd like to hear some suggestions! Thanks.
An emerging area is ontology language development and practice. An ontology is a language that represents a web of concepts organized into a series of relationships, where those relationships may even be hierarchical. The objective is to identify and represent the context of language or expression so that a computer can interpret and respond to human communication as a human might.
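To make the "web of concepts" idea concrete, here is a minimal sketch of an ontology as a graph of concepts linked by "is-a" relations. The concepts and hierarchy are entirely made up for illustration; real ontology work uses languages like OWL and RDF rather than a Python dict.

```python
# Hypothetical toy ontology: each concept maps to its relationships.
# Only the "is-a" relation is modelled here.
ONTOLOGY = {
    "poodle": {"is-a": "dog"},
    "dog": {"is-a": "mammal"},
    "mammal": {"is-a": "animal"},
    "animal": {},
}

def is_a(concept, ancestor):
    """Walk up the is-a hierarchy to test whether concept is a kind of ancestor."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = ONTOLOGY.get(concept, {}).get("is-a")
    return False

print(is_a("poodle", "animal"))  # True  (poodle -> dog -> mammal -> animal)
print(is_a("mammal", "poodle"))  # False (the relation is not symmetric)
```

The interesting research problems start once relationships other than "is-a" are added and the machine has to infer context from them, but the data structure above is the seed of the idea.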
Cloud computing is hot right now and still largely undefined, unproven.
What about quantum computing?
http://en.wikipedia.org/wiki/Quantum%5Fcomputer
A prof at my old university was researching this area pretty heavily. Might be a tad advanced but definitely interesting.
Genetic algorithms continue to be big IMHO. They are fairly easy to implement (though tweaking the fitness function can be a long and arduous process), and can be applied to nearly any project. I know of cases where it's been used to predict the stock market and also to find the best solutions for manufacturing processes.
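To show how little code the basic loop takes, here is a minimal sketch of a genetic algorithm on the classic "OneMax" toy problem (evolve a bitstring toward all ones). All the parameters and the fitness function are illustrative choices, not recommendations; as noted above, designing a good fitness function for a real problem is where the work is.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

GENOME_LEN = 20
POP_SIZE = 30
GENERATIONS = 40

def fitness(genome):
    # OneMax: fitness is simply the number of 1 bits.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with a small probability.
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def crossover(a, b):
    # Single-point crossover between two parents.
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]  # truncation selection: keep the best half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(fitness(best))
```

Swap in a fitness function that scores trading strategies or production schedules and the same loop applies, which is exactly why GAs turn up in so many domains.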
The death of the RDBMS and what will supersede it.
e.g.
1. map-reduce
2. key/value store
3. Document databases
4. Column Databases
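The map-reduce item on that list is easy to demonstrate in miniature. Below is a sketch of the classic word-count example in plain Python; real frameworks like Hadoop distribute the map, shuffle and reduce phases across machines, but the data flow is the same.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in the document.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Shuffle: group all values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine the values for each key.
    return {key: sum(values) for key, values in groups.items()}

docs = ["the cat sat", "the cat ran", "a dog ran"]
pairs = list(chain.from_iterable(map_phase(d) for d in docs))
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'the': 2, 'cat': 2, 'sat': 1, 'ran': 2, 'a': 1, 'dog': 1}
```

Note that the map calls are independent of each other, and so are the reduce calls per key; that independence is what lets the model scale out, and it is a big part of why these systems are challenging the RDBMS.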
I have this awful feeling that a lot of people are going to confuse computer science with computer technology
Remember the quote of Edsger Dijkstra: "Computer science is no more about computers than astronomy is about telescopes."
Anything that makes concurrent programming easier.
While concurrent programming as such has been researched and understood for decades (i.e. the theory is there), practical application has lagged behind a lot until recently, because single-core CPUs were the norm.
Only recently have multi-core CPUs become widespread, and languages like Erlang that support concurrent programming without the headaches caused by shared memory have lost their niche status and come into the limelight.
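The Erlang style mentioned above can be sketched in any language: each "process" owns a private mailbox and communicates only by sending messages, never by touching shared mutable state. Here is a minimal illustration using Python threads and queues (the names and the squaring task are made up for the example; Erlang processes are far lighter-weight than OS threads).

```python
import threading
import queue

def worker(mailbox, results):
    # The worker owns no shared state: it only reads from its mailbox
    # and replies via the results queue.
    while True:
        msg = mailbox.get()
        if msg is None:          # sentinel message: shut down
            break
        results.put(msg * msg)   # reply with the squared value

mailbox = queue.Queue()
results = queue.Queue()
t = threading.Thread(target=worker, args=(mailbox, results))
t.start()

for n in range(5):
    mailbox.put(n)
mailbox.put(None)                # tell the worker to stop
t.join()

squares = [results.get() for _ in range(5)]
print(squares)  # [0, 1, 4, 9, 16]
```

Because the only communication channel is the queue, there are no locks to forget and no shared-memory races to debug, which is the headache the answer above is referring to.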
Machine learning algorithms. There are quite a few unsolved problems in computer science, bioinformatics, etc. just waiting to be solved with them. Getting a computer to recognize patterns, or even to generate data resembling what it was trained on, is a powerful concept with a wide range of applications.
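As a taste of what "recognizing patterns" means at its simplest, here is a sketch of a 1-nearest-neighbour classifier, one of the most basic machine learning algorithms. The training points and labels are invented for the example.

```python
def distance(a, b):
    # Squared Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(train, labels, point):
    # Classify a point with the label of its closest training example.
    best = min(range(len(train)), key=lambda i: distance(train[i], point))
    return labels[best]

# Hypothetical training data: two clusters of 2-D points.
train = [(1.0, 1.0), (1.2, 0.8), (5.0, 5.0), (5.2, 4.9)]
labels = ["small", "small", "large", "large"]

print(predict(train, labels, (1.1, 1.0)))  # small
print(predict(train, labels, (4.8, 5.1)))  # large
```

A final-year project would of course go well beyond this, but the same recognize-by-similarity idea underlies plenty of real systems.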
Automated testing methods are an interesting topic.
There are some emerging companies experimenting with model-based testing. They say it's the cat's whiskers; I say it's the cat's turd on the lawn.
The truth is probably somewhere in the middle, though nobody really knows right now where in the middle.
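For anyone unfamiliar with the term, here is a minimal sketch of the model-based testing idea: describe the system as a state machine, generate action sequences from the model, and check that the real implementation stays in step with it. The turnstile model and implementation below are both hypothetical toys.

```python
import random

random.seed(0)  # fixed seed so the generated test sequence is repeatable

# The model: a (state, action) -> next-state table for a coin turnstile.
MODEL = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

class Turnstile:
    """The system under test, implemented independently of the model."""
    def __init__(self):
        self.locked = True
    def act(self, action):
        if action == "coin":
            self.locked = False
        elif action == "push":
            self.locked = True
    @property
    def state(self):
        return "locked" if self.locked else "unlocked"

def run_test(n_steps=100):
    # Drive the implementation with random actions and compare each
    # resulting state against what the model predicts.
    sut, state = Turnstile(), "locked"
    for _ in range(n_steps):
        action = random.choice(["coin", "push"])
        state = MODEL[(state, action)]
        sut.act(action)
        if sut.state != state:
            return False  # divergence found: the implementation is buggy
    return True

print(run_test())  # True: implementation agrees with the model
```

The selling point is that the tool generates the test cases from the model; the catch, and perhaps the source of the scepticism above, is that building and maintaining an accurate model of a non-trivial system is hard.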