Data Reader
Just about the fastest access you will get to SQL Server from .NET is with the SqlDataReader: it streams rows forward-only, one at a time, without buffering the whole result set the way a DataSet does.
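As a rough sketch (the connection string, table, and column names below are placeholders, not anything from your code), a forward-only read looks like this. Note the ordinal GetInt32/GetString calls, which avoid the per-row column-name lookup that reader["Name"] incurs:

```csharp
using System;
using System.Data.SqlClient;

class ReaderSketch
{
    static void Main()
    {
        // Placeholder connection string and query -- substitute your own.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";
        const string sql = "SELECT Id, Name FROM dbo.Customers";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Ordinal access skips the name-to-ordinal lookup per row.
                    int id = reader.GetInt32(0);
                    string name = reader.GetString(1);
                    Console.WriteLine("{0}: {1}", id, name);
                }
            }
        }
    }
}
```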
Profile it
It's worth actually profiling to find out where your performance issue really is. More often than not, the place you think the problem is turns out to be wrong once you've profiled it.
For example, it could be:
- The time the query takes to run
- The time the data takes to copy across the network/process boundary
- The time .NET takes to load the data into memory
- The time your code takes to do something with it
Profiling each of these in isolation will give you a much better idea of where your bottleneck is. For profiling your code, there is a great article from Microsoft on the subject.
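As a crude but effective first pass, a Stopwatch can split the total into "time to first row" (roughly query execution plus latency) and "time to drain the reader" (roughly transfer plus .NET materialisation); anything after that is your own processing. The connection string and query here are placeholders:

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;

class ProfileSketch
{
    static void Main()
    {
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";
        var stopwatch = Stopwatch.StartNew();

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM dbo.Customers", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // Time to first row ~= query execution + network latency.
                reader.Read();
                Console.WriteLine("First row: {0} ms", stopwatch.ElapsedMilliseconds);

                // Time to drain the reader ~= data transfer + .NET load.
                while (reader.Read()) { /* consume remaining rows */ }
                Console.WriteLine("All rows:  {0} ms", stopwatch.ElapsedMilliseconds);
            }
        }
        // Anything measured after this point is the cost of your own code.
    }
}
```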
Cache it
The thing to look at to improve performance is to work out whether you need to load all of that data every time. Can the list (or part of it) be cached? Take a look at the new System.Runtime.Caching namespace.
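A minimal sketch of the load-once, serve-from-cache pattern using MemoryCache from that namespace (LoadCustomersFromDatabase is a stand-in for your existing data access; the key and expiry are arbitrary):

```csharp
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

class CacheSketch
{
    static readonly MemoryCache Cache = MemoryCache.Default;

    static List<string> GetCustomers()
    {
        // Serve from the cache when present; otherwise load and store.
        var customers = Cache.Get("customers") as List<string>;
        if (customers == null)
        {
            customers = LoadCustomersFromDatabase();

            // Keep the list for five minutes; tune this to how stale
            // your application can tolerate the data being.
            var policy = new CacheItemPolicy
            {
                AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5)
            };
            Cache.Set("customers", customers, policy);
        }
        return customers;
    }

    static List<string> LoadCustomersFromDatabase()
    {
        // ... your SqlDataReader code from above ...
        return new List<string>();
    }
}
```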
Rewrite as T-SQL
If you are doing purely data operations (as your question suggests), you could rewrite the code that uses the data as T-SQL and run it natively on SQL Server. This has the potential to be much faster, as you will be working with the data directly rather than shifting it about.
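For example, a read-modify-write loop over every row in C# can often collapse into a single set-based statement executed on the server (the table, columns, and business rule here are made up for illustration):

```csharp
using System.Data.SqlClient;

class SetBasedSketch
{
    static void ApplyDiscount(string connectionString)
    {
        // One set-based UPDATE replaces fetching every row, recalculating
        // in C#, and writing each row back; the data never leaves the server.
        const string sql = @"
            UPDATE dbo.Orders
            SET    Total = Total * 0.9
            WHERE  CustomerType = 'Wholesale';";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```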
If your code has a lot of necessary procedural logic, you could try mixing T-SQL with CLR Integration, giving you the benefits of both worlds.
Which way to go very much comes down to how complex (or how procedural) your logic is.
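A minimal sketch of a CLR scalar function (the string-normalising logic is a placeholder; the assembly still has to be registered on the server with CREATE ASSEMBLY and exposed with CREATE FUNCTION):

```csharp
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class ClrFunctions
{
    // Callable from T-SQL once the assembly is registered, so procedural
    // .NET logic can run inside set-based queries.
    [SqlFunction(IsDeterministic = true)]
    public static SqlString NormaliseName(SqlString input)
    {
        if (input.IsNull)
            return SqlString.Null;

        // Placeholder for procedural logic that is awkward in T-SQL.
        return new SqlString(input.Value.Trim().ToUpperInvariant());
    }
}
```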
If all else fails
If all areas are as optimal as they can be and your design is without fault, I wouldn't even get into micro-optimisation; I'd just throw hardware at it.
Which hardware? Use the Reliability and Performance Monitor to find out where the bottleneck is. For the problem you describe, the most likely candidates are the HDD or RAM.