tags:
views: 104
answers: 3

Possible Duplicate:
What's the difference between new char[10] and new char(10)

What is the difference between

char* t1 = new char;

and

char* t2 = new char[10];

Both allocate memory, and both t1[100]='m' and t2[100]='m' seem to be valid for them.

-----------after edit:

But why can we use t1[100] if t1 is a dynamically allocated char, not an array of char?

A: 
Armen Tsirunyan
+3  A: 

You need to delete these differently since arrays are allocated using a different variant of operator new:

delete t1;
delete [] t2;
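
For example, a minimal self-contained sketch (variable names are illustrative) showing each form of new paired with its matching delete:

#include <iostream>

int main() {
    char* t1 = new char;        // single char, allocated with operator new
    char* t2 = new char[10];    // 10 consecutive chars, allocated with operator new[]

    *t1 = 'a';
    t2[0] = 'b';
    std::cout << *t1 << t2[0] << std::endl;

    delete t1;                  // matches new
    delete [] t2;               // matches new[]; mixing these up is undefined behavior
    return 0;
}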
Steve Townsend
+4  A: 

Your first case creates a single char element (1 byte), whereas your second case creates 10 consecutive char elements (10 bytes). However, the access t1[100]='m' (and likewise t2[100]='m') is undefined in both cases: you are writing to the byte 100 positions past the pointer, which almost certainly lies outside the memory you allocated.

In other words, your assignment of 'm' will overwrite whatever is already there, which could be data from another array. Thus, you may encounter some bizarre errors at runtime.

C and C++ let programmers access arrays out of bounds because no bounds checking is performed: indexing is just pointer arithmetic over consecutive memory. The expression t1[100] simply refers to the location 100 bytes after the pointer, no matter what is stored there.
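
To make that concrete, here is a small sketch (illustrative only; both marked lines are undefined behavior):

int main() {
    char* t1 = new char;        // allocates exactly one byte

    // Subscripting is pointer arithmetic: t1[100] means *(t1 + 100),
    // i.e. the byte 100 positions past t1, whatever happens to be there.
    // Both lines compile, but both write outside the allocation (undefined behavior).
    t1[100]     = 'm';
    *(t1 + 100) = 'm';          // exactly equivalent to the line above

    delete t1;
    return 0;
}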

If you want "safe" arrays, use std::vector and call its at() member function, which throws std::out_of_range if the index is invalid.
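
For instance, with a plain std::vector (a minimal sketch):

#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
    std::vector<char> v(10);             // 10 char elements
    try {
        v.at(100) = 'm';                 // checked access: throws instead of writing out of bounds
    }
    catch (const std::out_of_range& e) {
        std::cerr << "out of range: " << e.what() << std::endl;
    }
    return 0;
}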

Stroustrup gives the following example:

#include <vector>
using namespace std;

template<class T> class Vec : public vector<T> {
public:
    Vec() : vector<T>() {}
    Vec(int s) : vector<T>(s) {}

    // this-> is required here because at() is a member of the dependent base vector<T>
    T& operator[] (int i) {return this->at(i);}
    const T& operator[] (int i) const {return this->at(i);}
};

This class is boundary-safe. I can use it like this:

Vec<char> t3(10);                    // vector of 10 char elements
try {
    char t = t3[100];                // access something we shouldn't
}
catch (const out_of_range&) {        // catch by const reference, not by value
    cerr << "Error!" << endl;        // now we can't shoot ourselves in the foot
}
chrisaycock