In his November 1, 2005 C++ column, Herb Sutter writes:
int A[17];
int* endA = A + 17;
for( int* ptr = A; ptr < endA; ptr += 5 )
{
    // ...
}
[O]n some CPU architectures, including current ones, the aforementioned code can cause a hardware trap to occur at the point where the three-past-the-end pointer is created, whether that pointer is ever dereferenced or not.
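For reference, the language rule that licenses this claim: pointer arithmetic is defined only within the array and up to one past its end, so anything beyond that is undefined behavior from the moment it is formed, and a trap is fair game. A minimal illustration of my own, not from the column:

int A[17];
int* pastEnd = A + 17;  // valid to form and to compare against, though not to dereference
int* tooFar  = A + 18;  // undefined behavior at the point of formation, no dereference needed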
With the stride of 5, the last increment takes ptr from A + 15 to A + 20, three past endA, and that value is never dereferenced. So how does a CPU trap on a mere bit pattern? What about the following cases?
int A[17];
// (i) will hardware trap this?
int *pUgly = A + 18;
// (ii) will hardware trap this, too?
int *pEnd = A + 17;
++pEnd;
// (iii) will this fool it?
int *precious = A + 17;
unsigned long tricksy = reinterpret_cast<unsigned long>(precious);
tricksy += sizeof(int);  // on a flat architecture, now the bit pattern of A + 18
int *pHobbits = reinterpret_cast<int *>(tricksy);
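To make the claim concrete, here is a toy model, entirely my own sketch and nothing from the column, of a machine that validates pointer arithmetic itself, the way a segmented or capability architecture might in hardware. The out-of-range pointer faults when it is formed, never mind dereferenced:

#include <cstdint>
#include <cstdio>
#include <cstdlib>

// A toy bounds-checked pointer: a software stand-in for hardware that
// carries bounds metadata with every pointer. All names are invented.
struct CheckedPtr
{
    int* base;   // start of the array
    int* limit;  // one past the end: the furthest pointer allowed to exist
    int* value;  // the current pointer

    CheckedPtr operator+( long n ) const
    {
        // Do the arithmetic in integer space first, so the out-of-range
        // pointer is never even formed before the check.
        std::uintptr_t raw = reinterpret_cast<std::uintptr_t>( value )
                           + static_cast<std::uintptr_t>( n ) * sizeof(int);
        if( raw < reinterpret_cast<std::uintptr_t>( base ) ||
            raw > reinterpret_cast<std::uintptr_t>( limit ) )
        {
            std::fprintf( stderr, "trap: out-of-range pointer formed\n" );
            std::abort();   // the "hardware trap": no dereference in sight
        }
        return CheckedPtr{ base, limit, reinterpret_cast<int*>( raw ) };
    }
};

int main()
{
    int A[17];
    CheckedPtr pA{ A, A + 17, A };

    CheckedPtr pEnd = pA + 17;   // one past the end: allowed to exist
    (void)pEnd;
    CheckedPtr pUgly = pA + 18;  // "traps" here, just like case (i)
    (void)pUgly;
}

Note that the sketch does its own arithmetic through an integer, which is exactly the (iii) trick: the reinterpret_cast back to int * is where the standard leaves the mapping implementation-defined, so the integer detour carries no guarantee of fooling such a machine.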
Bonus question: Should "some CPU architectures, including current ones" ordinarily be understood to refer to shipping products only, or does it also cover imaginary architectures, provided the work of fiction in which they are described or alluded to has a recent publication date?