views: 36

answers: 2

I'm looking at working with a neuroscience team next year. Some of the applications will involve signal processing, image processing, and data-mining. (I do know there will be some MATLAB work, but otherwise, much is left uncharted.)

What development experience have you had in these areas, and what techniques, domain-specific terms, methods, and software engineering models should I read up on?

Many thanks in advance.

A: 

There was a Mac Classic tool called NIH Image, now horribly obsolete, but the about page for that software points to Image/J, which appears to be under active development.

dmckee
+1  A: 

This is really organization-dependent. In academia, people mostly use FORTRAN, Matlab, and R for the things you describe, at least where I worked and studied.

However, some do C and C++, others do Java.

In corporate settings, you may see some anachronistic languages like SAS (popular in risk management at some banks).

Matlab is a sure bet though, since it has a huge base of happy users. The other things you want to know are too field-dependent: different communities use different tools, and even within the same field of study you will see a lot of variation.

Ask them, or wait until you're working with them and see.

Alexandre C.
Getting a little FORTRAN exposure right now. One more reason to take good notes.
Old McStopher
Thanks for the R and SAS mentions. I've been told I need to explore those as well.
Old McStopher
Don't waste your time with SAS if you're not 100% sure you will have to use it; it is really old and sh*tty. As for R, it is a pretty language and very enjoyable to program in. It also has a good user base and a lot of third-party modules.
Alexandre C.