views: 570
answers: 14

vector<int>::iterator it;
vector<int> p;
p.push_back(4);
p.push_back(5);
p.push_back(6);
p.push_back(7);
it = p.begin() + 2;
cout << it << endl;

Is this O(N) or O(1)? And why?

+14  A: 

It is O(1) because it performs a constant number of operations.

Naveen
I can redefine the vector class and make this untrue.
Stefan Kendall
keep in mind that all of the standard containers have efficiency requirements, so if you did that, you would not have a conforming implementation.
Evan Teran
You could redefine the vector methods to something very inefficient but reasonable, like having push_back resize the vector's storage every time, or having the iterator do a linear search through the vector each time, making each of them O(n). But push_back is only called 4 times, not N times, and the other methods are only called once each, so this code snippet is still O(1).
Sean McCauliff
@Sean, if you did that, it would **not** be a std::vector anymore, because the standard mandates amortized constant time for all push_back operations. So no, it would not be reasonable to have it resize the storage every time.
Evan Teran
@Sean In a standards-conforming implementation, std::vector::push_back is required to have an amortized complexity of O(1). Resizing the vector for every push would not meet that requirement; that's what Evan was talking about.
Tyler McHenry
@Sean even if push_back were O(n), the snippet would still be O(1), because it executes a constant, finite number of operations (exactly 4 push_backs, always). I think that might have been the point you were trying to make, but I wasn't sure.
Joseph
It's a constant number of operations, but to talk about complexity you need to define an input here; it could even be exponential time, independently of the implementation.
Artem Barger
@Artem Barger: Then you don't have a standard-conforming C++ implementation. The time complexity of .push_back() is amortized O(1), or it isn't standard-conforming. We have a fixed number of O(1) operations, which is O(1).
David Thornley
@Evan, Tyler, Joseph, iftrue: I should have been explicit that I was replying to iftrue's redefine-vector argument; unless vector were being redefined by someone with malicious intent, it seems this would still be O(1). It's unlikely that the vector in question is anything other than std::vector, but the code does not say so explicitly.
Sean McCauliff
@David, actually I don't care about that, and I wasn't talking about changing the implementation. Example: suppose you have an input number n and then a loop which iterates n times. Since the representation of n takes only log(n) bits, you have an exponential algorithm, although at first sight it looks O(n). So you need to specify your input in order to compute the complexity (see the sketch after these comments).
Artem Barger
@Sean, David,...: I think iftrue was indeed thinking about redefining it with malicious intent. And with that in mind: the code snippet just uses some class called `vector`, not necessarily `std::vector`. So no efficiency requirements there.
sth
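
To illustrate Artem's point about needing to define the input, here is a hedged sketch (the variable n is hypothetical, not part of the original snippet): a loop that runs n times is linear in the *value* of n, but exponential in the *size* of the input, since the number n is written down using only about log2(n) bits.

#include <iostream>

int main() {
    long long n;
    std::cin >> n;                       // the input is the number n, encoded in about log2(n) bits

    long long sum = 0;
    for (long long i = 0; i < n; ++i)    // iterates n times: linear in the value of n,
        sum += i;                        // but exponential in the bit length of the input

    std::cout << sum << std::endl;
}
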
+2  A: 

Depends on the operations you're looking at, but it should be O(1).

Creating the vector and the iterator is constant time, as it's just memory allocation. Pushing onto the vector is constant time depending on the implementation, but I believe the STL implementation is (amortized) constant time. Four constant-time pushes are still constant. Setting the iterator is constant time, because getting the beginning of the vector is constant and the addition is constant.

Finally, printing is constant time, so the whole process is O(1).

Stefan Kendall
+1  A: 

It is O(1). For O(n) (or anything other than O(1)) to even be relevant, you need at least one variable that, when changed, impacts the performance of your algorithm by changing the total number of iterations or the amount of work that needs to be done.

Robert Venables
It's not "closest to" anything. It *is* O(1).
Tyler McHenry
+6  A: 

Each operation is (amortized) constant time, so the entire thing is constant time.

Platinum Azure
Important to note that the vector may have to resize itself. If it makes itself 1000 elements bigger each time it resizes, push_back becomes O(n) (amortized) as n goes towards infinity. If it makes itself twice as big each time, it is O(1) (amortized) as n goes towards infinity.
dnh828
Fair enough. I was assuming doubling (I believe the STL states this about vectors). Also, with regard to just four elements, it's quite likely a resize would happen either zero times (if the initial capacity is more than 4, and I think that's the case for STL vectors with 10 as the default), once (assuming a good enough resize), or, worst case, on pretty much every element added (assuming a starting capacity of one and an increase of one, or doubling, each time). In the case of this particular problem, that makes everything pretty close to constant time, in actuality if not in theory. :-)
Platinum Azure
The *O* notation is not about time but about complexity.
Gumbo
I was referring to time complexity, which is one particular kind of complexity and usually the one people care about the most. Space or memory complexity is a fairly close second. My answer is not wrong.
Platinum Azure
@dnh828: For exponential resizing, wouldn't it be O(lg n) (amortized) rather than O(1) (amortized)? Of course, that makes both of us wrong since I used your claim earlier. Silly me. :-)
Platinum Azure
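
As a rough illustration of the growth-strategy discussion above (a sketch only; the exact capacities printed are implementation-defined), the following prints each capacity change while pushing 1000 elements. With the usual geometric growth, only a logarithmic number of reallocations occur, which is why push_back is amortized O(1).

#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    std::size_t last_capacity = v.capacity();

    for (int i = 0; i < 1000; ++i) {
        v.push_back(i);
        if (v.capacity() != last_capacity) {          // a reallocation (and element copy) happened
            last_capacity = v.capacity();
            std::cout << "size " << v.size()
                      << " -> capacity " << last_capacity << '\n';
        }
    }
}
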
+8  A: 

It is O(1) since its number of operations is fixed. For something to be O(N) there has to be linear variability in how many operations are performed.

Matthew Manela
A: 

It is O(c), where c is a constant, because there is no non-constant-time operation anywhere. But is it O(1)? I guess it is assumed the cost of all these operations is 1.

Ian Buss
Big-O notation is asymptotic. O(c) is *exactly the same in every possible way* as O(1).
Tyler McHenry
you're absolutely right, my bad.
Ian Buss
That's not what O(1) means. O(1) means that the worst-case runtime is constant, not that the cost is "1". See http://en.wikipedia.org/wiki/Constant_time
Evan Teran
A: 

Really it depends on what push_back and begin do internally. If either (or both) is O(n), then the whole thing is O(n), because O(n) dominates O(1) and there is a constant number of calls. The same goes for anything greater than O(1), such as O(n log n). If both are O(1), then the whole thing is O(1), since there is a constant number of calls. Most likely it is O(1), since push_back and begin are usually very simple and written to be O(1).

Jwsonic
The standard says that vector::push_back is amortized constant time.
MSN
A: 

Pushing onto a vector does not happen in constant time in the worst case. If your vector is full, it will have to copy the contents into a larger vector before continuing. This is an O(n) operation.

Ryan Fox
I think the specification states the amortized complexity, so for the strict worst case you're right, it's O(n), but in the general case it's still O(1), since the worst case happens really rarely.
Artem Barger
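
If that occasional worst-case copy matters, the usual workaround is to reserve capacity up front. A hedged sketch of that (not something the original snippet does):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> p;
    p.reserve(4);                     // allocate once up front: no reallocation during the pushes below

    p.push_back(4);
    p.push_back(5);
    p.push_back(6);
    p.push_back(7);

    std::cout << p[2] << std::endl;   // prints 6
}
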
A: 

My guess is you're asking about p.begin() + 2; (most people care about searches more than, say, insertion).

It's simple pointer arithmetic, so yes, it's constant-time, O(1). If this were a linked-list traversal, then it would be O(n). Of course, this assumes that the implementation of the integer vector is akin to an array -- namely, the elements are all in one contiguous block of memory.

See: http://www.cprogramming.com/tutorial/stl/iterators.html

Iterators are often handy for specifying a particular range of things to operate on. For instance, the range item.begin(), item.end() is the entire container, but smaller slices can be used. This is particularly easy with one other, extremely general class of iterator, the random access iterator, which is functionally equivalent to a pointer in C or C++ in the sense that you can not only increment or decrement but also move an arbitrary distance in constant time (for instance, jump multiple elements down a vector).

Matt Rogish
As of C++98-TC1 std::vector is required to store elements contiguously.
Ozan
Yep, just covering my bases because someone will say "No, because FooBarBaz iterators are actually Binary Trees!" :D
Matt Rogish
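
For contrast with the linked-list case mentioned above, a small sketch (C++11 syntax, illustrative only): a vector iterator supports constant-time `+ n`, while advancing a std::list iterator by n steps has to walk the list, which is O(n).

#include <iostream>
#include <iterator>
#include <list>
#include <vector>

int main() {
    std::vector<int> v{4, 5, 6, 7};
    std::list<int>   l{4, 5, 6, 7};

    auto vit = v.begin() + 2;      // random-access iterator: constant-time jump
    auto lit = l.begin();
    std::advance(lit, 2);          // bidirectional iterator: walks 2 nodes, O(n) in general

    std::cout << *vit << " " << *lit << std::endl;   // prints "6 6"
}
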
+9  A: 

As everybody else said, it is O(1). However, if you meant to write something like this:

vector<int>::iterator it;
vector<int> p;
p.push_back(4);
p.push_back(5);
p.push_back(6);
p.push_back(7);
...
p.push_back(n);
it = p.begin() + 2;
cout << *it << endl;

i.e.:

for (int i=4;i<=n;i++)
  p.push_back(i);

then it would be O(n), since vector::push_back() and vector::begin() are O(1) and vector::push_back() is executed n-3 times.

Ozan
A: 

I think the answer depends on how the '+' operator works. If it is simple pointer arithmetic then it is O(1).

russoue
A: 

It can be anything from O(1) to exponential; it really depends on the implementation of push_back.

A: 

As long as vector's functions are implemented in a deterministic fashion (which they should be or something is seriously wrong), it's O(1). Period.

The entire discussion about the implementation of vector's functions is moot because the value of n is known. This snippet will always run the exact same code and take the exact same number of clock cycles on a given machine with a given vector implementation. Swap out the machine or the implementation and it will still be O(1); it will just take a different amount of constant time.

Remember, folks, O(1) doesn't necessarily mean "fast." It means "very scalable."

Evan Shaw
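
A toy sketch of that last point (the numbers are arbitrary): `constant_work` below is O(1) but always slow, while `linear_work` is O(n) yet cheaper for small n; big-O describes how the cost grows, not how large it is.

#include <cstddef>
#include <iostream>

// O(1): the amount of work never depends on the input, but the constant is large
long long constant_work() {
    long long sum = 0;
    for (int i = 0; i < 10000000; ++i)
        sum += i;
    return sum;
}

// O(n): grows with n, yet much cheaper than constant_work() when n is small
long long linear_work(std::size_t n) {
    long long sum = 0;
    for (std::size_t i = 0; i < n; ++i)
        sum += static_cast<long long>(i);
    return sum;
}

int main() {
    std::cout << constant_work() << " " << linear_work(10) << std::endl;
}
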
A: 

Are you asking about it = p.begin() + 2;?

Accessing elements of a vector in this way is constant time (a guarantee of random access iterators), just like writing p[2]. So, statements like it = p.begin() + n; are O(1).

It's not correct to talk about O(N) without telling us what N is, but I'm assuming you're talking about the array lookup with N being the number of elements. Also you probably mean cout << *it << endl; in the last line.

Sumudu Fernando
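
Putting it together, a corrected, self-contained version of the question's snippet (assuming `*it` is what was meant), showing that `*(p.begin() + 2)` and `p[2]` name the same element:

#include <iostream>
#include <vector>

int main() {
    std::vector<int> p;
    p.push_back(4);
    p.push_back(5);
    p.push_back(6);
    p.push_back(7);

    std::vector<int>::iterator it = p.begin() + 2;   // constant-time iterator arithmetic

    std::cout << *it << " " << p[2] << std::endl;    // prints "6 6": same element
}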