I don't think Knuth objected to 64-bit systems. He just said that using 64-bit pointers on a system with less than 4 GB of RAM is idiotic (at least if you have lots of pointers, like the ones in a doubly-linked list). I can't say that I agree with him. Here are three different routes you can take; let's assume you have a 64-bit capable CPU that can also run in 32-bit mode, like an Intel Core 2 Duo.
1 - Everything is 32-bit: the OS, the apps, all of it. You have 32-bit pointers, but you cannot use the extra registers/instructions that are available in 64-bit mode.
2 - Everything is 64-bit: the OS, the apps, all of it. You have 64-bit pointers and you can use the extra registers/instructions that are available in 64-bit mode. But since you have less than 4 GB of RAM, using 64-bit pointers seems idiotic. But is it?
3 - The OS is 64-bit, but it makes sure that all code/data pointers fall in the 0x00000000 - 0xFFFFFFFF range (virtual memory!). The ABI works in an unusual way: all code/data pointers kept in memory or in files are 32 bits wide, but they are zero-extended when loaded into 64-bit registers. If there is a code location to jump to, the compiler/ABI does the necessary fix-ups and performs the actual 64-bit jump. This way, pointers are 32-bit but the apps are 64-bit, meaning they can make use of the 64-bit registers and instructions. The process is something like thunking, I think ;-P (see the sketch below the list).
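For what it's worth, option 3 is pretty much what the Linux x32 ABI aims at: 64-bit registers and instructions, but 32-bit pointers. Here is a minimal sketch of the data side of the idea in C, assuming Linux/x86-64 and glibc; the MAP_32BIT trick, the node layout and the pack/unpack helpers are just illustrations of the concept, not anything Knuth described:

```c
/* Sketch of option 3: pointers stored in memory are 32 bits wide and are
 * zero-extended to 64 bits before use. Assumes every allocation lives below
 * 4 GB -- here forced with Linux's MAP_32BIT mmap flag (Linux/x86-64 only). */
#define _GNU_SOURCE
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>

typedef uint32_t ptr32;                 /* a "compressed" 32-bit pointer     */

struct node {
    ptr32 prev;                         /* 4 bytes instead of 8              */
    ptr32 next;
    int   value;
};

/* Truncation is safe only because MAP_32BIT guarantees low addresses. */
static ptr32 pack_ptr(void *p)   { return (ptr32)(uintptr_t)p; }
static void *unpack_ptr(ptr32 p) { return (void *)(uintptr_t)p; }  /* zero-extends */

int main(void)
{
    /* MAP_32BIT asks the kernel for an address in the low 4 GB. */
    struct node *a = mmap(NULL, sizeof *a, PROT_READ | PROT_WRITE,
                          MAP_PRIVATE | MAP_ANONYMOUS | MAP_32BIT, -1, 0);
    struct node *b = mmap(NULL, sizeof *b, PROT_READ | PROT_WRITE,
                          MAP_PRIVATE | MAP_ANONYMOUS | MAP_32BIT, -1, 0);
    if (a == MAP_FAILED || b == MAP_FAILED) return 1;

    a->next = pack_ptr(b);              /* links are stored as 32-bit values */
    b->prev = pack_ptr(a);
    a->value = 1;
    ((struct node *)unpack_ptr(a->next))->value = 2;

    printf("sizeof(struct node) = %zu\n", sizeof(struct node)); /* 12, not 24 */
    printf("b->value = %d\n", ((struct node *)unpack_ptr(a->next))->value);
    return 0;
}
```

The compiler/ABI part (fixing up jumps and zero-extending on load) is exactly the part you cannot fake in plain C, which is why this needs toolchain support rather than just a clever library.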
My conclusion:
The 3rd option seems doable to me, but it is not an easy problem: it can work in theory, but I do not think it is practical. And I also think that his quote, "When such pointer values appear inside a struct, they not only waste half the memory, they effectively throw away half of the cache.", is exaggerated...
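On the "exaggerated" point: for a node that is almost nothing but pointers the numbers really do double, while for structs with more payload the overhead shrinks quickly, which is where I think the exaggeration lies. A quick way to check the raw numbers yourself, assuming a gcc multilib toolchain so the same file can be built with -m32 and -m64:

```c
/* Quick check of the "half the memory / half the cache" claim for a
 * pointer-heavy node. Compile twice (gcc -m32 and gcc -m64) and compare. */
#include <stdio.h>

struct list_node {
    struct list_node *prev;
    struct list_node *next;
    int payload;
};

int main(void)
{
    /* -m32: 4+4+4 = 12 bytes  -> 5 nodes per 64-byte cache line
       -m64: 8+8+4 (+4 padding) = 24 bytes -> 2 nodes per cache line */
    printf("sizeof(struct list_node) = %zu\n", sizeof(struct list_node));
    printf("nodes per 64-byte cache line = %zu\n",
           (size_t)64 / sizeof(struct list_node));
    return 0;
}
```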