Say I have this:

Object *myObject;
std::vector<Object> objects(99);

// ...
myObject = &objects[4];

Would it be safe to assume that myObject's pointer will always remain valid, no matter how large objects grows or how many elements I remove, as long as I never erase() the actual object pointed to by myObject? Or does it work differently? If the above does not work, how else could I achieve this? In my design, I want a child to have a pointer to its parent, which is assigned by the parent when the child is added. However, the parents are in a std::vector, and I need random access. Would a std::list be better in my case?

Thanks

+5  A: 

No, it is definitely not safe to assume that.

The standard specifies what will and won't invalidate iterators on the standard containers; any time a vector iterator is invalidated, a pointer into the vector is too. In practice this means that whenever the vector reallocates its storage (which can happen implicitly when you call push_back()), every iterator and pointer into it is invalidated. Similarly, calling erase() invalidates pointers to the erased item and to everything after it, because those elements all move up to fill the erased space.
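To make the hazard concrete, here is a minimal sketch (using a hypothetical Object type) in which push_back() forces a reallocation and leaves an earlier pointer dangling:

#include <vector>

struct Object { int value; };

int main()
{
    std::vector<Object> objects(99);
    Object *myObject = &objects[4];

    // Grow past the current capacity; the vector may move all of
    // its elements to a fresh block of memory.
    for (int i = 0; i < 1000; ++i)
        objects.push_back(Object());

    // myObject may now point into freed storage; dereferencing it
    // here would be undefined behavior.
    return 0;
}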

A std::list would work better here; you could maintain, in parallel, a vector of pointers to the items, which would let you access them by index without the items themselves ever moving in memory.
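A sketch of that arrangement (again with a hypothetical Object type): the list owns the elements, whose addresses never change, while the vector provides access by index:

#include <list>
#include <vector>

struct Object { int value; };

int main()
{
    std::list<Object> objects;      // owns the elements; their addresses are stable
    std::vector<Object*> byIndex;   // parallel index for random access

    for (int i = 0; i < 99; ++i) {
        objects.push_back(Object());
        byIndex.push_back(&objects.back());
    }

    Object *myObject = byIndex[4];  // stays valid as the list grows or shrinks,
                                    // as long as this element itself isn't erased
    return 0;
}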

Peter
Okay, I think I'll use a list then; cache performance shouldn't be a concern for my needs.
Milo
Or if you want a `vector`, you could do `int myObject = 4` and then always reference it as `objects[myObject]`.
Brendan Long
No, that won't work, since index 4 might change if I remove element 3.
Milo
@Milo - `list` is not a great idea as I understand your requirements (no `operator[]`, for one thing). `list`'s main strength is fast insertion or removal at any point.
Steve Townsend
+1  A: 

Provided you keep the vector at or below a fixed maximum size, you can guarantee this: call reserve() with that maximum number of elements right after declaring the vector, or (as you have done) declare it with an initial element count, each element default-constructed.

If you add elements beyond that capacity, the underlying storage can get reallocated at any time; removing elements shifts everything after the removal point, which also invalidates pointers.
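As a sketch of this (hypothetical Object type): once reserve() has fixed the capacity, pointers to existing elements stay valid while further elements are appended, as long as the size never exceeds that capacity:

#include <cassert>
#include <vector>

struct Object { int value; };

int main()
{
    std::vector<Object> objects;
    objects.reserve(99);            // fix the capacity up front

    objects.push_back(Object());
    Object *first = &objects[0];

    // Appending within the reserved capacity never reallocates,
    // so `first` remains valid.
    while (objects.size() < 99)
        objects.push_back(Object());

    assert(first == &objects[0]);
    return 0;
}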

If you instead stored raw pointers in the vector (std::vector<Object*>), you could use NULL as an "empty element" flag and keep the storage layout invariant. Setting it up with 99 elements initially would make them all NULL (the default value for a pointer element), and reserve() would be redundant unless you plan to extend the list.

An option that would free you from worrying about the vector's storage entirely is to store the elements as boost::shared_ptr<Object>. Then removing a vector element only actually deletes the referenced Object instance if nobody else is still using it.

boost::shared_ptr<Object> myObject;
std::vector<boost::shared_ptr<Object> > objects(99);

objects[4].reset(new Object());  // populate the slot
myObject = objects[4];           // copy the shared_ptr, not its address

objects.clear();
// myObject is still valid: the Object instance stays alive because
// myObject's shared_ptr still references it
Steve Townsend
+2  A: 

If the vector fills up its allocated space and you try to insert more, it needs to reallocate its storage; that involves copying the objects over and destroying the old storage, which invalidates the pointer you're keeping around - so no, this is not good enough.

std::list would work fine, since it doesn't rely on contiguous storage, but you lose the ability to do quick random access. Alternatively, you can store pointers to your Objects in your collection, in which case you can just take out an element and the pointer will remain valid until you free that memory yourself - but you will need to handle that deletion at some point.
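A sketch of the pointer-storing variant (hypothetical Object type): the vector holds Object* values, so it can reallocate and shuffle freely without the Objects themselves ever moving, at the cost of manual cleanup:

#include <cstddef>
#include <vector>

struct Object { int value; };

int main()
{
    std::vector<Object*> objects;
    for (int i = 0; i < 99; ++i)
        objects.push_back(new Object());

    Object *myObject = objects[4];   // valid until the Object is deleted,
                                     // however the vector is resized

    delete objects.front();          // free an Object before dropping its pointer
    objects.erase(objects.begin());  // moves the pointers, not the Objects

    // Manual cleanup is your responsibility.
    for (std::size_t i = 0; i < objects.size(); ++i)
        delete objects[i];
    return 0;
}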

Another option might be a std::deque: while deque iterators are invalidated by push_back(), direct references and pointers to elements (like the pointer you're using here) remain valid.
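A sketch of the deque behavior (hypothetical Object type): push_back() invalidates deque iterators, but pointers and references to existing elements survive, because a deque never moves existing elements when it grows at either end:

#include <cassert>
#include <deque>

struct Object { int value; };

int main()
{
    std::deque<Object> objects(99);
    Object *myObject = &objects[4];

    // Unlike vector, growing a deque never relocates existing
    // elements, so the pointer stays valid (iterators do not).
    for (int i = 0; i < 1000; ++i)
        objects.push_back(Object());

    assert(myObject == &objects[4]);
    return 0;
}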

Michael Madsen
Since I don't do random insertion, would a deque be best?
Milo
@Milo: It depends on your requirements. If you need to delete stuff, then a deque is okay as long as you only delete the first or last element, but it's not okay if you delete elements in the middle of the collection - if you do that, you need something else. If that's not a concern, I'd go for a deque since you get random access in O(1), compared to O(n) for a std::list.
Michael Madsen
If that doesn't fit your needs, but you still want more efficient access, one more option *could* be a std::map or std::set, although it would work quite differently from your current solution: inserting into or erasing from either of those doesn't invalidate iterators or pointers (except to the erased element itself). Without knowing anything about your Objects, it's hard to say whether that would be an appropriate collection class for you (see the sketch below).
Michael Madsen
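A sketch of that node-based idea (hypothetical Object type, keyed by int): inserting into or erasing from a std::map leaves pointers to the other elements untouched:

#include <cassert>
#include <map>

struct Object { int value; };

int main()
{
    std::map<int, Object> objects;
    for (int i = 0; i < 99; ++i)
        objects[i] = Object();

    Object *myObject = &objects[4];

    objects.erase(50);                // erasing other keys is fine
    objects[200] = Object();          // so is inserting new ones

    assert(myObject == &objects[4]);  // node-based storage: nothing moved
    return 0;
}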
If I have around 100 elements at any given time, is O(n) really that noticeable?
Milo
@Milo: Probably not, but if you do random access frequently enough, it might add up. If you mainly iterate over the entire collection, there shouldn't be any real difference regardless of the number of objects. Again, it depends on your requirements and what you're trying to do.
Michael Madsen
It's a GUI API, so I will need to notify my widgets when I get messages
Milo
@Milo: In that case, my best guess is that "no, it doesn't make any noticeable difference". I think you should go with the one you think will be most useful: a deque if you think random access is best, or a list if you think random deletion is preferable. If neither of those properties matter, I would go for the deque because it seems slightly more likely that random access *might* matter at some point, compared to deletion (if only because you're using random access in your question).
Michael Madsen
alright thanks :) I'll upvote and accept your answer
Milo