There's no exact answer to this question, since the amount of memory allocated for two objects of the same type might not even be the same (e.g. one QSomething might be able to reuse some data from a shared cache, whereas another QSomething might have to allocate its own copy).
What you could do, I suppose, is write a trivial test program that starts up, allocates N of the objects in question, and then goes to sleep() for a long time. While the program is sleeping, use Task Manager (or whatever tool you prefer) to see how much RAM the process is using. Then Ctrl-C (or kill) the process, run it again with a larger value of N, and repeat the measurement. Do that a few times and you'll get an idea of how the process's RAM usage grows with the number of objects allocated, and a little algebra then gives you a ballpark figure for the average memory cost per object.
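Here's a minimal sketch of what such a test program could look like (assuming Qt 5 or later; QString is used purely as a stand-in for whatever class you actually want to measure):

```cpp
#include <QCoreApplication>
#include <QString>
#include <QThread>
#include <QVector>
#include <QtAlgorithms>
#include <cstdio>
#include <cstdlib>

int main(int argc, char **argv)
{
    QCoreApplication app(argc, argv);

    // N comes from the command line; default to 0 so you can also measure
    // the process's baseline overhead with no objects allocated.
    const int n = (argc > 1) ? std::atoi(argv[1]) : 0;

    QVector<QString *> objects;
    objects.reserve(n);
    for (int i = 0; i < n; ++i)
        objects.append(new QString(QStringLiteral("some representative contents")));

    std::printf("Allocated %d objects; check the process's RAM usage now.\n", n);
    std::fflush(stdout);

    // Sleep "forever" so there's plenty of time to inspect the process in
    // Task Manager / top / ps.  Ctrl-C (or kill) the process when done.
    QThread::sleep(100000);

    qDeleteAll(objects);  // never reached in practice; here for completeness
    return 0;
}
```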
(Keep in mind that there's a good bit of memory overhead just in starting the process, so subtract the memory used by the N=0 case from all the other measurements; that way you're measuring only the objects' cost and not the environmental overhead.)
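For example (made-up numbers), if the N=0 run sits at about 8 MB and the N=100,000 run sits at about 28 MB, the average cost works out to roughly (28 − 8) MB / 100,000 ≈ 200 bytes per object.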