When programmers versed in a wide variety of languages are discussing the merits of an algorithm expressed in pseudocode, and the talk turns to efficiency, how much should be assumed about the performance of the eventual implementation language?
For example, when I see something like:
add x to the list of xs
I see an O(1) operation (cons), while someone else may see an O(n) array copy-to-extend or worse.
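To make the two readings concrete, here is a minimal sketch (the function names `cons` and `append_copy` are mine, chosen for illustration): the same pseudocode line can cost O(1) as a linked-list prepend or O(n) as a copy-to-extend on an immutable array.

```python
# Reading 1: O(1) cons, as in Lisp -- a cons cell is just a
# (head, tail) pair, so building one takes constant time.
def cons(x, xs):
    return (x, xs)

# Reading 2: O(n) copy-to-extend, as with an immutable array --
# a new array is allocated and all n existing elements are copied.
def append_copy(xs, x):
    return xs + [x]

lst = cons(3, cons(2, cons(1, None)))   # O(1) per "add"
arr = append_copy([1, 2], 3)            # O(n) per "add"
```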
The harder you look, the more such cases there are, at least potentially. For example,
add x to the running total
may look O(1) to most of us (integer addition, right?) but it could plausibly be O(ln(max(x,total))) if the numbers were very large (BigNums). Or it could be O(x) if add were implemented as
b = x
while b > 0
    inc total
    dec b
and I'll bet there are even worse ways to do it.
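For instance, the repeated-increment addition above can be sketched as a runnable function (the name `slow_add` is mine; it assumes a non-negative x so the loop terminates):

```python
# "Unary" addition: add x to total one increment at a time.
# O(x) rather than O(1), since the loop runs x times.
def slow_add(total, x):
    b = x
    while b > 0:   # assumes x >= 0; loops x times
        total += 1  # inc total
        b -= 1      # dec b
    return total
```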
So what's reasonable? Always assume the most efficient possible implementation (even though no extant language provides it) or...?
Edit:
To stir the pot some more here:
I note several people saying you should state your assumptions.
Fine, but suppose I assume (as I do) that advancing to the next line of code is free. Should I have to state that? Someone could, after all, be imagining some sort of mechanical implementation where moving to the next line is a time-consuming step. Until they bring it up, I might never have considered the possibility.
I'd contend for this reason that it isn't possible to state all your assumptions.
So my question is really: "since there are some assumptions you can make without having to state them, and some you can't, where do you draw the line?"