I'm loading pixels from a 32 × 32 image in ARGB format via Java. Since ARGB is 4 bytes per pixel, when I bind this texture to the video card I'd expect it to use roughly 32 * 32 * 4 bytes = 4 KB.
Similarly, a 1024 × 1024 image would be 1024 * 1024 * 4 bytes = 4 MB.
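For reference, here's a small sketch of how I'm estimating it (the class and method names are just for illustration; it assumes an uncompressed 4-bytes-per-pixel texture with no mipmaps, padding, or driver overhead):

```java
import java.awt.image.BufferedImage;

public class TextureMemoryEstimate {
    // Rough estimate for an uncompressed ARGB texture:
    // width * height * 4 bytes (no mipmaps, padding, or driver overhead).
    static long estimateBytes(BufferedImage img) {
        return (long) img.getWidth() * img.getHeight() * 4;
    }

    public static void main(String[] args) {
        BufferedImage small = new BufferedImage(32, 32, BufferedImage.TYPE_INT_ARGB);
        BufferedImage large = new BufferedImage(1024, 1024, BufferedImage.TYPE_INT_ARGB);
        System.out.println(estimateBytes(small)); // 4096    (~4 KB)
        System.out.println(estimateBytes(large)); // 4194304 (~4 MB)
    }
}
```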
Is my understanding correct? Now I understand where all the memory goes!