views: 2791
answers: 4
What is meant by "Constant Amortized Time" when talking about time complexity of an algorithm?

A: 

A very trivial search engine query for "Constant Amortized Time" turned that up in the first hit:

The special case of an amortized time of O(1) signifies that a sequence of n such operations takes only time O(n). One then refers to this as constant amortized time.

Tomalak
Please, guys... Are we the mechanical turk Google user interface? Where is that a down vote?
Tomalak
Since one of the goals with SO is to become _the_ repository for questions picked up by google and others, you create a kind of useless circle by just linking to a google query yourself.
Mats Fredriksson
Oh, right. Is SO going to replace Google and with it the need to think before you ask? But I will break the "endless circle" and remove the link.
Tomalak
Of course it will not replace google. If you think a question is too simple why don't you avoid posting a reply in the first place?
Mats Fredriksson
For the same reason someone put up a site named http://justfuckinggoogleit.com/. Sometimes I can't help it.
Tomalak
Very true. If a friend asks those questions, I also tell them to go to google, but this is not the same. Here people actually ask questions and expect answers. A time and place for everything...
Mats Fredriksson
And what about that fact entitles them to "expect" an answer to trivial questions that do not pose any real problem at all? Does that mean I can go through the index of a book on programming and ask every triviality that comes to mind? (Besides, I answered the question.)
Tomalak
Go ahead, post those questions. If they are relevant they will be modded up, if not, they will be deleted.
Mats Fredriksson
I agree with Tomalak here. This kind of question, at best, deserves an answer _pointing_ to good resources on the net and asking the OP to study up on it. The aim of SO is not to rebuild all the info in the Web. It is to help people find answers, assuming they are ready to put in efforts themselves.
sundar
+3  A: 

It basically means that in the worst case scenario the algorithm runs in constant time, averaged over a large number of operations. Which is nice.

Rik
+11  A: 

It means that over time, the worst case scenario will default to O(1), or constant time. A common example is the dynamic array. If we have already allocated memory for a new entry, adding it will be O(1). If we haven't, we allocate, say, twice the current amount and copy the existing elements over. This particular insertion is not O(1), but rather O(n), where n is the current number of elements.

What is important is that the algorithm guarantees that over a sequence of operations the expensive operations are amortised, so that the average cost per operation is O(1).

Or in more strict terms,

There is a constant c, such that for every sequence of operations (also one ending with a costly operation) of length L, the time is not greater than c*L (Thanks Rafał Dowgird)
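The doubling scheme described above can be sketched in Python. This is an illustrative toy (the class name `DynamicArray` and the `copies` counter are made up for this sketch, not part of the thread):

```python
# A minimal sketch of a dynamic array that doubles its capacity
# when full. The `copies` counter tracks the expensive work done
# during enlargements, so we can see it stay proportional to the
# number of appends.
class DynamicArray:
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None] * self.capacity
        self.copies = 0  # element copies performed during enlargements

    def append(self, item):
        if self.size == self.capacity:
            # Expensive case: allocate twice the space, copy everything over.
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
                self.copies += 1
            self.data = new_data
        # Cheap case: the slot is already allocated, O(1).
        self.data[self.size] = item
        self.size += 1

arr = DynamicArray()
for i in range(1000):
    arr.append(i)
print(arr.copies, arr.size)  # → 1023 1000
```

After 1000 appends the enlargements have copied 1 + 2 + 4 + ... + 512 = 1023 elements in total: less than 2·L for a sequence of length L = 1000, matching the c·L bound above with c a small constant.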

Mats Fredriksson
"after a large enough amount of operations" - constant amortized time doesn't need this condition. There is a constant c, such that for *every* sequence of operations (also one ending with a costly operation) of length L, the time is not greater than c*L.
Rafał Dowgird
+19  A: 

Amortised time explained in simple terms:

If you do an operation say a million times, you don't really care about the worst-case or the best-case of that operation - what you care about is how much time is taken in total when you repeat the operation a million times.

So it doesn't matter if the operation is very slow once in a while, as long as "once in a while" is rare enough for the slowness to be diluted away. Essentially amortised time means "average time taken per operation, if you do many operations". Amortised time doesn't have to be constant; you can have linear and logarithmic amortised time or whatever else.

Let's take mats' example of a dynamic array, to which you repeatedly add new items. Normally adding an item takes constant time (that is, O(1)). But each time the array is full, you allocate twice as much space, copy your data into the new region, and free the old space. Assuming allocates and frees run in constant time, this enlargement process takes O(n) time where n is the current size of the array.

So each time you enlarge, you take about twice as much time as the last enlarge. But you've also waited twice as long before doing it! The cost of each enlargement can thus be "spread out" among the insertions. This means that in the long term, the total time taken for adding m items to the array is O(m), and so the amortised time (i.e. time per insertion) is O(1).
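The "spread out" argument can also be checked numerically. Here is a small counter (the helper `total_copies` is hypothetical, assuming unit cost per element copy and per element write) showing that the total work for m appends forms a geometric series bounded by a constant multiple of m:

```python
# Count the total work of appending m items to a doubling array:
# enlargements happen at sizes 1, 2, 4, ..., and the copies they
# perform sum to less than 2*m, so the amortised cost per append
# is constant.
def total_copies(m):
    copies, capacity = 0, 1
    for size in range(m):
        if size == capacity:
            copies += size      # copy all existing elements to the new space
            capacity *= 2
        copies += 1             # write the new element itself
    return copies

for m in (10, 1000, 10**6):
    assert total_copies(m) < 3 * m   # O(m) total => O(1) amortised
```

The assertion holds for every m: the enlargement copies contribute at most 2m and the writes contribute m, so the time per insertion averages out to a constant, exactly as described above.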

Artelius