Scientific computing has two distinct sides.
Quick prototyping
On one hand, you may need to write a lot of prototype code, and you need to write it fast. Often this code is used just once, so there is a need for simple, expressive languages with solid library support. In my opinion, Python is best suited for this purpose, and I hope it will finally dethrone MATLAB in this area. I don't know of any functional language that can compete with Python here right now.
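To illustrate the kind of throwaway prototype I mean, here is a minimal sketch (the task and all numbers are made up): a one-off Monte Carlo estimate of pi, written in a few minutes with only the standard library.

```python
# Throwaway prototype: estimate pi by Monte Carlo sampling.
# The kind of run-once script Python makes cheap to write.
import random

def estimate_pi(samples=100_000, seed=42):
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    inside = sum(
        1
        for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / samples

print(estimate_pi())  # roughly 3.14
```

The point is not the algorithm but the turnaround: no build step, no type declarations, and the whole thing fits on one screen.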
Performance computing
On the other hand, you may need to solve computationally intensive problems where performance matters. Then you need an optimizing compiler and parallel computation (both multi-core and multi-machine). You also need it to run on clusters (which usually means Linux) and to support the standard parallel APIs (MPI and OpenMP).
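MPI and OpenMP are C/Fortran-level APIs, but the basic shared-memory pattern they enable (split the data, map a kernel over the chunks, reduce the results) can be sketched with Python's standard library alone. This is only an illustration of the pattern, not a substitute for real MPI/OpenMP code:

```python
# OpenMP-style split/map/reduce over CPU cores, stdlib only.
from multiprocessing import Pool

def partial_sum(bounds):
    # Kernel run in a worker process: sum of squares over one chunk.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    # Split [0, n) into one chunk per worker.
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    # Map the kernel over the chunks in parallel, then reduce.
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(10_000))  # same as sum(i*i for i in range(10_000))
```

For multi-machine work on a cluster you would distribute the chunks with MPI instead of a process pool, but the decomposition itself looks the same.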
From your list, probably only Scheme is not suitable for performance computing. The others may or may not be OK; I don't know. In any case, the result will usually be 2 or 3 times slower than hand-optimized C/C++/Fortran/Java.
I know that Haskell used to lag behind in this area, but with Data Parallel Haskell this may change. Its status is technology preview right now (and stable in GHC 6.12?).
There is also the field of symbolic computing, which I am fairly remote from. I expect that some functional languages may really shine in this area if suitable libraries exist.
Shootout
I think you can also consult shootout.alioth.debian.org to see the performance limits on similar number-crunching tasks on a multi-core CPU. Sure, pure C rules them all, but most of the compiled functional languages come close enough:
Double-precision N-body simulation
Eigenvalue using the power method
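To give a sense of what the power-method benchmark computes, here is a minimal pure-Python power iteration (a sketch of the algorithm, not the shootout code itself): repeatedly apply the matrix to a vector, normalize, and read off the dominant eigenvalue via the Rayleigh quotient.

```python
# Minimal power iteration for the dominant eigenvalue of a matrix.
def mat_vec(matrix, vec):
    # Plain matrix-vector product over lists of lists.
    return [sum(a, x * 0) + sum(a * x for a, x in zip(row, vec)) * 0 + sum(a * x for a, x in zip(row, vec)) for row, a, x in []] or \
           [sum(a * x for a, x in zip(row, vec)) for row in matrix]

def power_method(matrix, iterations=100):
    vec = [1.0] * len(matrix)          # arbitrary starting vector
    for _ in range(iterations):
        new = mat_vec(matrix, vec)
        norm = max(abs(x) for x in new)
        vec = [x / norm for x in new]  # renormalize to avoid overflow
    # Rayleigh quotient: (v . Av) / (v . v) estimates the eigenvalue.
    av = mat_vec(matrix, vec)
    return sum(a * v for a, v in zip(av, vec)) / sum(v * v for v in vec)

a = [[2.0, 1.0], [1.0, 2.0]]
print(power_method(a))  # 3.0, the dominant eigenvalue of a
```

The shootout version of this (the "spectral-norm" style task) is exactly this inner loop, which is why it is a good stress test of a compiler's numeric code generation.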
Libraries
Scientific computing depends on the existence of libraries in your domain (unless you are ready to write your own). Just for reference:
Numeric and scientific libs for Python
Haskell math libraries
OCaml math libraries