Probably a decent optimizer will see that the loop is a no-op and optimize it out completely, so there will be almost no difference between start and end.
If it's not optimized out, the two loops are simply working around the fact that 30*640000000 is bigger than can be stored in a 32-bit integer. It runs the inner 640000000-iteration loop 30 times to attempt to magnify the delay.
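For reference, the construct being described presumably looks something like this (the variable names `delay` and `i` come from the question; everything else here is an assumed reconstruction):

```c
/* Assumed shape of the original busy-wait: the outer loop runs 30 times
   because 30 * 640000000 would overflow a 32-bit int. */
int delay, i;
for (delay = 0; delay < 30; delay++) {
    for (i = 0; i < 640000000; i++) {
        /* empty body: the "work" is just incrementing i */
    }
}
```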
EDIT:
So for each of 30 iterations (using variable `delay`), it creates another loop (using variable `i`) starting at 0. It then increments `i` 640000000 times, each increment taking a small fraction of time (if not optimized away). Then the inner loop completes, `delay` is increased by 1, and the inner loop starts over at 0 again.
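If the only reason for the nesting is the 32-bit overflow, a single loop with a 64-bit counter does the same job; a rough sketch (the `volatile` is just a crude way to discourage the compiler from deleting the otherwise empty loop, not something to rely on for real timing):

```c
#include <stdint.h>

/* Sketch: same total iteration count in one loop, assuming a 64-bit
   counter so 30 * 640000000 (19,200,000,000) no longer overflows. */
static void busy_delay(void)
{
    volatile uint64_t i;
    for (i = 0; i < 30ULL * 640000000ULL; i++) {
        /* empty busy-wait body */
    }
}
```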
EDIT2:
If you're just trying to add a delay, have you considered using `sleep` or `usleep` or the corresponding Windows function(s) rather than trying to implement a sleep by iteration?
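For example, a minimal sketch (the two-second figure is just a placeholder; use whatever delay you actually need):

```c
#ifdef _WIN32
#include <windows.h>
#else
#include <unistd.h>
#endif

/* Sleep for roughly two seconds using the platform's sleep facility
   instead of a busy loop. */
static void pause_two_seconds(void)
{
#ifdef _WIN32
    Sleep(2000);          /* Windows: argument is in milliseconds */
#else
    sleep(2);             /* POSIX: argument is in whole seconds */
    /* usleep(2000000); -- microseconds, for finer-grained delays */
#endif
}
```

Unlike the iteration approach, these don't burn CPU while waiting and don't depend on how aggressively the compiler optimizes.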