views: 181, answers: 3

It seems like none of the algorithm textbooks mention space efficiency very much, so I don't really understand what is being asked when I encounter questions asking for an algorithm that requires only constant memory.

What would be a few examples of algorithms that use constant memory, and algorithms that don't?

+2  A: 

Very easy example: counting the number of characters in a string. It can be iterative:

int length( const char* str )
{
    int count = 0;
    while( *str != 0 ) {   /* walk until the terminating '\0' */
        str++;
        count++;
    }
    return count;
}

or recursive:

int length( const char* str )
{
    if( *str == 0 ) {
        return 0;
    }
    return 1 + length( str + 1 );
}

The first variant only uses a couple of local variables regardless of the string length - its space complexity is O(1). The second, if executed without recursion elimination, requires a separate stack frame (holding the return address and local variables) for each depth level - its space complexity is O(n), where n is the string length.

sharptooth
You didn't explain what constant memory is and what isn't. It might not be obvious to a beginner.
the_drow
Very true. Updated the answer.
sharptooth
So basically constant-memory algorithms are just non-recursive algorithms?
john
Not necessarily. You need to look at how much memory is consumed depending on the input parameters. It's easy to imagine the string length computation rewritten to directly use a stack instead of recursive calls (see the sketch below). The example I provide is just a very easy one and my favourite.
sharptooth
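For illustration, here is a sketch of what such an explicit-stack rewrite could look like - non-recursive, yet still O(n) extra memory because the stack is allocated explicitly (error handling omitted):

#include <stdlib.h>

/* Sketch: string length computed with an explicitly managed stack
   instead of recursive calls. It avoids recursion, but still uses
   memory proportional to the string length. */
int length( const char* str )
{
    size_t capacity = 16;
    size_t top = 0;
    const char** stack = malloc( capacity * sizeof *stack );

    /* "Recursion" phase: push one entry per character, just as the
       recursive version creates one stack frame per character. */
    while( *str != 0 ) {
        if( top == capacity ) {
            capacity *= 2;
            stack = realloc( stack, capacity * sizeof *stack );
        }
        stack[top++] = str;
        str++;
    }

    /* Unwind phase: each pop corresponds to returning from one call. */
    int count = 0;
    while( top > 0 ) {
        top--;
        count++;
    }

    free( stack );
    return count;
}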
Recursive algorithms are non-constant in memory, but non-recursive algorithms can allocate memory too, so you can't say that a function uses constant memory just because it isn't recursive.
Dykam
Meant for @john.
Dykam
@john: a non-recursive algorithm can be linear-memory if for instance it takes a copy of the string. Obviously you wouldn't do that just to measure its length, but for instance there's a big difference in memory usage between quicksort, which requires O(log N) extra memory provided you do the "small half" first (see the sketch below), and mergesort for an array, which allocates a working array of size N/2 and hence requires O(N) extra memory.
Steve Jessop
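For illustration, a sketch of that "small half first" idea: recurse only into the smaller partition and loop on the larger one, so the recursion depth - and hence the extra stack space - stays O(log N). (Lomuto partitioning; a sketch, not a tuned implementation.)

/* Sketch: quicksort that recurses into the smaller partition and
   iterates on the larger one, bounding the recursion depth (and thus
   the extra stack space) at O(log N). Sorts a[lo..hi] inclusive. */
void quicksort( int* a, int lo, int hi )
{
    while( lo < hi ) {
        /* Lomuto partition around the last element. */
        int pivot = a[hi];
        int i = lo;
        for( int j = lo; j < hi; j++ ) {
            if( a[j] < pivot ) {
                int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                i++;
            }
        }
        int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;

        /* Recurse into the smaller side, iterate on the larger side. */
        if( i - lo < hi - i ) {
            quicksort( a, lo, i - 1 );
            lo = i + 1;
        } else {
            quicksort( a, i + 1, hi );
            hi = i - 1;
        }
    }
}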
-1 Stack frames are implementation details, not a property of algorithms. Some recursive algorithms (including your example) can be optimized through tail recursion to avoid needing multiple stack frames.
Michael Borgwardt
@Michael Borgwardt: True and I've specifically mentioned it. Just saying that using the system stack is an implementation detail is misleading - it makes people think that it's magic that makes recursion possible.
sharptooth
You can't really win either way - if you say that stack use is "not part of the algorithm", then you get a lot of misleading results where almost anything can be done in constant space by throwing enough recursion at it. If you say stack use is "part of the algorithm", then you leave open the question, "stack use in what language/compiler/system?". I'm not a computer scientist, and I don't know what's usual, but sharptooth's approach seems sensible to me, which is to make and state an assumption about the implementation, and analyse the algorithm subject to that assumption.
Steve Jessop
... obviously the recursive algorithm can be made non-recursive. That's precisely the grounds on which sharptooth chose it, so that he can demonstrate the difference!
Steve Jessop
+5  A: 

If an algorithm:

a) recurses a number of levels deep which depends on N, or

b) allocates an amount of memory which depends on N

then it is not constant memory. Otherwise it probably is: formally it is constant-memory if there is a constant upper bound on the amount of memory which the algorithm uses, no matter what the size/value of the input. The memory occupied by the input is not included, so sometimes to be clear you talk about constant "extra" memory.

So, here's a constant-memory algorithm to find the maximum of an array of integers in C:

#include <limits.h>  /* INT_MIN */

int max(int *start, int *end) {
    int result = INT_MIN;
    while (start != end) {
        if (*start > result) result = *start;
        ++start;
    }
    return result;
}

Here's a non-constant-memory algorithm, because it uses stack space proportional to the number of elements in the input array. However, it could become constant-memory if the compiler were capable of optimising it to a non-recursive equivalent - which C compilers don't usually bother with except sometimes via tail-call optimisation, and that wouldn't do the job here because the recursive call isn't in tail position (a tail-recursive rewrite is sketched after the code):

int max(int *start, int *end) {
    if (start == end) return INT_MIN;
    int tail = max(start+1, end);
    return (*start > tail) ? *start : tail;
}
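For comparison, a sketch of an accumulator-style rewrite whose recursive call is in tail position; a compiler performing tail-call elimination could run it in constant space, though that remains an implementation detail rather than a guarantee:

/* Sketch: the same maximum computation with an accumulator, so the
   recursive call is the last thing the function does (a tail call).
   With tail-call elimination this needs only constant stack space;
   without it, it is still O(n) like the version above. */
int max_acc(int *start, int *end, int best) {
    if (start == end) return best;
    if (*start > best) best = *start;
    return max_acc(start + 1, end, best);
}

int max(int *start, int *end) {
    return max_acc(start, end, INT_MIN);
}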

Here is a constant-space sort algorithm (in C++ this time), which is O(N!) time or thereabouts (maybe O(N*N!)):

#include <algorithm>  // std::next_permutation

void sort(int *start, int *end) {
    while (std::next_permutation(start,end));
}

Here is an O(N) space sort algorithm, which is O(N^2) time:

#include <algorithm>  // std::upper_bound, std::copy
#include <vector>

void sort(int *start, int *end) {
    std::vector<int> work;
    for (int *current = start; current != end; ++current) {
        work.insert(
            std::upper_bound(work.begin(), work.end(), *current),
            *current
        );
    }
    std::copy(work.begin(), work.end(), start);
}
Steve Jessop
Actually - wouldn't finding the maximum of n values require O(n) space, since you need to have the n values in the first place? A constant-space algorithm would need an input stream or so, I guess.
Tobias Langner
No. "The memory occupied by the input is not included, so sometimes to be clear you talk about constant "extra" memory."
Steve Jessop
+1  A: 

Take a sorting algorithm on an array, for example. You can either use a new array of the same length as the original and put the sorted elements into it (Θ(n)), or you can sort the array in place and use just one additional temporary variable for swapping two elements (Θ(1)) - see the sketch below.
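
For illustration, a sketch of such an in-place sort (a selection sort): apart from the input array itself, it only ever uses a handful of scalar variables, so its extra space is Θ(1).

/* Sketch: in-place selection sort. The only extra storage is a few
   indices and one temporary used for swapping two elements. */
void sort_in_place(int *a, int n) {
    for (int i = 0; i < n - 1; ++i) {
        int min = i;
        for (int j = i + 1; j < n; ++j) {
            if (a[j] < a[min]) min = j;
        }
        int tmp = a[i];    /* the single temporary used for swapping */
        a[i] = a[min];
        a[min] = tmp;
    }
}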

Gumbo