Martinus gives a good example of code where the compiler optimizes it at run-time by computing the multiplication in advance:

Martinus' code:

int x = 0;
for (int i = 0; i < 100 * 1000 * 1000 * 1000; ++i) {
    x += x + x + x + x + x;
}
System.out.println(x);

His code after constant folding, a compiler optimization performed at compile-time (thanks to abelenky for pointing that out):

int x = 0;
for (int i = 0; i < 100000000000; ++i) {
    x += x + x + x + x + x;
}
System.out.println(x);

This optimization technique seems trivial in my opinion. I guess it may be one of the techniques Sun has recently come to treat as trivial.

I am interested in two types of optimizations made by compilers:

  1. optimizations which today's compilers treat as trivial, such as the one above in Java's compiler at run-time
  2. optimizations which are used by the majority of today's compilers

Please, put each optimization technique to a separate answer.

Which techniques did compilers use in the 90s (1), and which do they use today (2)?

+1  A: 

The optimization shown in your example, collapsing 100*1000*1000*1000 => 100000000000, is NOT a run-time optimization. It happens at compile-time. (And I wouldn't even call it an optimization.)
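As a small illustration of the compile-time nature of this (my own sketch, not part of the original answer; the class and field names are invented), javac folds constant expressions when it emits the class file, so the stored value is the already-computed literal:

// Hypothetical sketch of constant folding: the initializer below is a
// compile-time constant expression, so javac stores the literal 86400 in the
// class file rather than any multiply instructions (you can check with javap -v).
class ConstantFoldingDemo {
    static final int SECONDS_PER_DAY = 24 * 60 * 60; // folded to 86400 at compile time

    public static void main(String[] args) {
        System.out.println(SECONDS_PER_DAY);
    }
}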

I don't know of any optimizations that happen at run-time, unless you count VM engines that have JIT (just-in-time) compiling.

Optimizations that happen at compile-time are wide-ranging and frequently not simple to explain. But they can include inlining small functions, rearranging instructions for cache locality, rearranging instructions for better pipelining or hyperthreading, and many, many other techniques.
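To make one of these concrete, here is a rough sketch of what inlining a small function amounts to (my own illustrative code, not from the answer; the method names are invented):

// Hypothetical sketch of method inlining: the compiler (or JIT) may replace a
// call to a small method with the method's body, removing the call overhead.
class InliningDemo {
    static int square(int x) {
        return x * x;
    }

    // What the programmer writes:
    static int sumOfSquares(int[] values) {
        int total = 0;
        for (int v : values) {
            total += square(v);   // small call, a good inlining candidate
        }
        return total;
    }

    // Roughly what the optimizer produces after inlining:
    static int sumOfSquaresInlined(int[] values) {
        int total = 0;
        for (int v : values) {
            total += v * v;       // call overhead eliminated
        }
        return total;
    }
}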

EDIT: Some F*ER edited my post... and then down-voted it. My original post clearly indicated that collapsing multiplication happens at COMPILE TIME, not RUN TIME, as the poster suggested. Then I mentioned I don't really consider collapsing constants to be much of an optimization. The pre-processor even does it.

Masi: if you want to answer the question, then answer the question. Do NOT edit other people's answers to put in words they never wrote.

abelenky
100*1000*1000*1000 => 100000000000: that is very much an optimization. It's called constant folding.
Pod
If you wouldn't call that an optimization, then you're wrong.
mquander
+3  A: 

Just buy the latest edition of the Dragon Book.

Pod
@Pod: I have one. Could you recommend some pages?
Masi
+1  A: 

How about loop unrolling?

for (i = 0; i < 100; i++)
    g ();

To:

for (i = 0; i < 100; i += 2)
{
    g ();
    g ();
}

From http://www.compileroptimizations.com/. They have many more - too many for an answer per technique.

Check out Trace Trees for a cool interpreter/just-in-time optimization.

Corbin March
@Corbin: Thank you for the great links!
Masi
A: 

Compiler books should provide a pretty good resource.

If this is obvious, please ignore it, but you're asking about low-level optimizations, the only ones that compilers can do. In non-toy programs, high-level optimizations are far more productive, but only the programmer can do them.
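To illustrate the distinction (a sketch of my own, with invented names, not taken from the answer): a high-level optimization such as swapping a repeated linear search for an indexed lookup changes the algorithm, which is a decision only the programmer can make.

import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch of a high-level (programmer-only) optimization:
// replacing a repeated O(n) scan with a one-time index and O(1) lookups.
// A compiler will not make this change, because it cannot know that the
// lookups dominate the cost or that the data-structure swap is intended.
class HighLevelOptimizationDemo {
    // Before: every lookup walks the whole list.
    static String findNameSlow(List<String[]> records, String id) {
        for (String[] record : records) {
            if (record[0].equals(id)) {
                return record[1];
            }
        }
        return null;
    }

    // After: build an index once, then look up by key.
    static Map<String, String> buildIndex(List<String[]> records) {
        Map<String, String> index = new HashMap<>();
        for (String[] record : records) {
            index.put(record[0], record[1]);   // id -> name
        }
        return index;
    }
}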

Mike Dunlavey
Your answer suggests to me that compilers cannot do high-level optimizations. Is that right?
Masi
Right (I mean, maybe there is a little bit they can do), but to do high-level optimizations effectively you need something that identifies high-cost call instructions and lets you see the reason for what is being done, so you can tell whether there is a more direct way to do it. Only the programmer can see the reason. The link above shows a pretty good example.
Mike Dunlavey
I accept this answer, since I did not know about high-level optimizations explicitly.
Masi