Pointer arithmetic isn't the fundamental problem. GCs have to deal with pointers being reassigned all the time, and pointer arithmetic is just another example of that. (Of course, pointer arithmetic between pointers into different buffers would cause problems, but it isn't allowed: the only arithmetic you may perform on a pointer into an array A is arithmetic that repositions it within that array.)
The real problem is the lack of metadata. A GC has to know what is a pointer and what isn't.
If it encounters the value 0x27a2c230, it has to be able to determine whether it is
- a pointer (in which case it has to follow it and recursively mark the destination as "in use"),
- an integer (the same value is a perfectly valid integer; perhaps it's not a pointer at all),
- or something else entirely, say, part of a string.
It also has to be able to determine the extent of a struct. Assuming that value is a pointer and it points into another struct, the GC has to be able to determine the size and extent of that struct, so it knows which range of addresses to scan for more pointers.
GC'ed languages have a lot of infrastructure to deal with this. C++ doesn't.
Boehm's GC is the closest you can generally get, and it's conservative: it basically assumes every word that could be a pointer is one, which means some data is needlessly kept alive.
Of course, all this infrastructure could in principle be added to a C++ compiler; nothing in the standard forbids it. The problem is that it would be a major performance hit and would eliminate a lot of optimization opportunities.