I was advised a while ago that it was commonplace to use std::vector as an exception-safe dynamic array in C++, rather than allocating raw arrays... for example:
{
    std::vector<char> scoped_array (size);
    char* pointer = &scoped_array[0];
    //do work
} // exception safe deallocation
I have used this convention multiple times with no problems. However, I have recently ported some code to Win32 Visual Studio 2010 (previously it ran only on MacOS/Linux), and my unit tests are breaking (the standard library throws an assertion) when the vector size happens to be zero.
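Reduced to its essentials (the names here are mine, not from the actual unit test), the failing case looks like this in a debug build:

    std::vector<char> empty_buffer (0);
    char* pointer = &empty_buffer[0]; // the debug runtime asserts here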
I understand that writing to such an array would be a problem, but this restriction breaks the idiom as a replacement for raw pointers. Consider the following functions with n = 0:
void foo (int n) {
    char* raw_array = new char[n];
    char* pointer = raw_array;
    file.read ( pointer , n );
    for (int i = 0; i < n; ++i) {
        //do something
    }
    delete[] raw_array;
}
While arguably redundant, the above code is perfectly legal (I believe), whereas the code below triggers an assertion on Visual Studio 2010:
void foo (int n) {
    std::vector<char> scoped_array (n);
    char* pointer = &scoped_array[0];
    file.read ( pointer , n );
    for (int i = 0; i < n; ++i) {
        //do something
    }
}
Was I relying on undefined behavior all along? I was under the impression that operator[] did no error checking and that this was a valid use of std::vector<>. Has anyone else encountered this problem?
--edit: Thanks for all the useful responses. In reply to those saying it is undefined behavior: is there a way to replace the raw array allocation above that still works when n = 0?
Checking for n = 0 as a special case would certainly solve the problem. But there are many patterns (such as the raw-pointer example above) where no special case is needed, so perhaps something other than std::vector<> is required?
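For what it's worth, the best workaround I have come up with so far is a small helper that hands back a null pointer for an empty vector, so the calling code keeps the same shape as the raw-pointer version. This is just a sketch: buffer_ptr is a name I made up, and I have passed the stream in as a parameter only to keep the example self-contained.

    #include <cstddef>
    #include <istream>
    #include <vector>

    // Returns a pointer to the vector's storage, or a null pointer when the
    // vector is empty, so callers need no special case for n == 0.
    template <typename T>
    T* buffer_ptr (std::vector<T>& v) {
        return v.empty() ? NULL : &v[0];
    }

    void foo (std::istream& file, int n) {
        std::vector<char> scoped_array (n);
        char* pointer = buffer_ptr (scoped_array);
        // With n == 0 nothing is read, so the (possibly null) pointer is never
        // dereferenced -- the same situation as the raw-pointer version.
        file.read ( pointer , n );
        for (int i = 0; i < n; ++i) {
            //do something
        }
    }

I'm not sure this is idiomatic, though, so I'd still be interested in a more standard approach.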