I am wondering what kinds of optimization techniques people commonly use nowadays. I have seen people do caching all the time with dictionaries and the like. Is trading space for speed the only way to go?
Depends on a lot of things, really.
As an example, when memory becomes an issue and a lot of temporary objects are being created, I tend to use object pools. (Having a garbage collector is not a reason to ignore memory allocation.) If speed is what matters, then I might use unsafe pointers to work with arrays.
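As a rough illustration of the pooling idea (my own sketch, not code from the answer; the names are made up and a production pool would also need thread safety), the point is simply to reuse instances instead of allocating new ones:

```csharp
using System;
using System.Collections.Generic;

// Minimal, single-threaded object-pool sketch: reuse instances instead of
// creating short-lived objects that the garbage collector has to clean up.
public class ObjectPool<T> where T : class
{
    private readonly Stack<T> _items = new Stack<T>();
    private readonly Func<T> _factory;

    public ObjectPool(Func<T> factory)
    {
        _factory = factory;
    }

    // Hand out a pooled instance if one is available, otherwise create a new one.
    public T Rent()
    {
        return _items.Count > 0 ? _items.Pop() : _factory();
    }

    // Return an instance to the pool so it can be reused instead of collected.
    public void Return(T item)
    {
        _items.Push(item);
    }
}

// Usage: rent a buffer, use it, and give it back instead of allocating each time.
// var pool = new ObjectPool<byte[]>(() => new byte[4096]);
// byte[] buffer = pool.Rent();
// /* ... fill and process the buffer ... */
// pool.Return(buffer);
```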
Either way, if you find yourself struggling too much with optimization techniques in a C#/.NET application, you probably chose the wrong language/platform.
In general, make sure you understand the time complexity of different algorithms, and use that knowledge to choose your implementations wisely.
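As a small illustration (mine, not from the answer), the same lookup can be O(n) or O(1) depending on which collection you pick:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class ComplexityDemo
{
    static void Main()
    {
        var ids = Enumerable.Range(0, 100_000).ToList();

        // O(n) per lookup: List<T>.Contains scans the list element by element.
        bool slow = ids.Contains(99_999);

        // O(1) average per lookup: HashSet<T>.Contains hashes the key.
        var idSet = new HashSet<int>(ids);
        bool fast = idSet.Contains(99_999);

        Console.WriteLine($"{slow} {fast}");
    }
}
```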
For .NET in particular, this article goes into great detail about optimizing code deployed to the CLR (though it's also relevant to Java or any other modern managed platform), and is one of the best guides I've ever read:
http://msdn.microsoft.com/en-us/library/ms973852.aspx
To distill the article into one sentence: nothing affects the speed of a .NET application (with sensible algorithms) more than the memory footprint of its objects. Be very careful to minimize your memory consumption.
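To make that concrete with a sketch of my own (not an example from the article), the per-object overhead adds up quickly when you have many small objects; a value type stored in an array keeps the same data in one contiguous allocation:

```csharp
using System;

// Illustrative only: small value types in an array live in one contiguous block,
// while the class version allocates a separate heap object (with its own header)
// per element and gives the garbage collector far more work to do.
struct PointStruct { public double X; public double Y; }
class PointClass   { public double X; public double Y; }

class FootprintDemo
{
    static void Main()
    {
        // One allocation: a million 16-byte payloads laid out contiguously.
        var structs = new PointStruct[1_000_000];

        // 1,000,001 allocations: the array plus one small object per element,
        // each with per-object overhead, scattered across the heap.
        var classes = new PointClass[1_000_000];
        for (int i = 0; i < classes.Length; i++)
            classes[i] = new PointClass();

        Console.WriteLine(structs.Length + classes.Length);
    }
}
```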
There are often problems with algorithms as well, usually when something expensive is done inside a loop. The first thing to do is profile your application, which will tell you where the slowest parts are; what you do to speed things up then depends on what you find. For example, if your application mimics a file system, it may be that you're calling the database recursively to walk up the tree. You might optimise that case by replacing those recursive calls with a single flattened database call that returns all of the data at once.
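A hedged sketch of that "flatten the recursion" idea (the Folder class and LoadAllFolders method are hypothetical stand-ins, not a real API): fetch every row in one call, then resolve the hierarchy in memory instead of issuing a query per level.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Folder
{
    public int Id;
    public int? ParentId;
    public string Name;
}

class TreeDemo
{
    // Stand-in for the single database round-trip that returns every row at once.
    static List<Folder> LoadAllFolders() => new List<Folder>
    {
        new Folder { Id = 1, ParentId = null, Name = "root" },
        new Folder { Id = 2, ParentId = 1,    Name = "docs" },
        new Folder { Id = 3, ParentId = 2,    Name = "reports" },
    };

    static void Main()
    {
        var all = LoadAllFolders();                // one call instead of one per level
        var byId = all.ToDictionary(f => f.Id);

        // Walking up the tree is now an in-memory lookup, not another query.
        var node = byId[3];
        var path = new Stack<string>();
        while (node != null)
        {
            path.Push(node.Name);
            node = node.ParentId.HasValue ? byId[node.ParentId.Value] : null;
        }
        Console.WriteLine(string.Join("/", path)); // root/docs/reports
    }
}
```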
Again, the answer is, as always, 'it depends'. However, more examples and advice can be found on Rico Mariani's blog (browse back a few years, as his focus has shifted).
Really it's about your choice in algorithms. Usually there is no "silver bullet" for optimization.
For example, using a StringBuilder instead of concatenation can make your code significantly faster, but there is a tradeoff. If you aren't concatenating huge sets of strings, the memory and time it takes to initialize a StringBuilder is worse than just using regular concatenation. There are a lot of examples of this throughout the framework, such as the dictionary caching you mentioned in your question.
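A small sketch of that tradeoff (my own illustration, not from the answer):

```csharp
using System.Text;

class ConcatDemo
{
    // Fine as-is: for a handful of pieces, plain concatenation is simple and
    // the compiler turns it into a single String.Concat call.
    static string Greeting(string name) => "Hello, " + name + "!";

    // Better with StringBuilder: concatenating in a loop would otherwise copy
    // the growing string on every iteration (roughly O(n^2) work overall).
    static string JoinLines(string[] lines)
    {
        var sb = new StringBuilder();
        foreach (var line in lines)
            sb.AppendLine(line);
        return sb.ToString();
    }
}
```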
The only general optimization you can really learn and apply throughout your day-to-day coding is avoiding the performance hit from boxing and unboxing (heap vs. stack). To do that, you need to understand when it happens and how to avoid it, or at least reduce the need for it.
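A minimal sketch of where boxing shows up and how generics sidestep it (my own example, not from the answer):

```csharp
using System;
using System.Collections;          // non-generic collections store everything as object
using System.Collections.Generic;

class BoxingDemo
{
    static void Main()
    {
        // Boxing: the int is copied into a new heap object to fit the object slot.
        ArrayList boxed = new ArrayList();
        boxed.Add(42);                       // box
        int fromBoxed = (int)boxed[0];       // unbox (cast back, plus a runtime type check)

        // No boxing: the generic list stores the ints directly in its internal array.
        List<int> unboxed = new List<int>();
        unboxed.Add(42);
        int fromUnboxed = unboxed[0];

        Console.WriteLine(fromBoxed + fromUnboxed);
    }
}
```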
Microsoft's MSDN documentation has two articles on performance that give a lot of good general-purpose techniques (they're really just different versions of the same article).
I would recommend Effective C# by Bill Wagner (first edition and second edition). He goes through a number of language constructs and techniques and explains which ones are faster and why. He touches on a lot of best practices as well.
More often than not, however, optimizing your algorithm will give you far better results than any language-level optimization technique.