For example, let's say I have an application that reads in a CSV file with piles of data rows. I give the user a summary of the number of rows based on the types of data, but I want to make sure that I don't read in so many rows that I cause an OutOfMemoryError. Each row translates into an object. Is there an easy way to find out the size of that object programmatically? Is there a reference that defines how large primitive types and object references are for a VM?
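For context, here is roughly the kind of object each row becomes; the class name, fields, and types below are made up for illustration, but the mix of primitives and object references is what I'm asking about:

```java
// Hypothetical row object; field names and types are illustrative only.
public class DataRow {
    private int id;            // primitive int
    private double amount;     // primitive double
    private boolean flagged;   // primitive boolean
    private String category;   // object reference to a String
    private String comment;    // object reference to a String

    public DataRow(int id, double amount, boolean flagged,
                   String category, String comment) {
        this.id = id;
        this.amount = amount;
        this.flagged = flagged;
        this.category = category;
        this.comment = comment;
    }
}
```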
Right now, I have code that says read up to 32,000 rows, but I'd also like to have code that says read as many rows as possible until I've used 32MB of memory. Maybe that is a different question, but I'd still like to know.
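To make the question concrete, my current approach looks roughly like the sketch below (the reader setup, the comma split, and the class/method names are simplified placeholders, not my actual code); the commented-out check is the kind of memory-based cutoff I'd like to use instead of the fixed row count:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class CsvLoader {
    private static final int MAX_ROWS = 32_000;               // current fixed cap
    private static final long MAX_BYTES = 32L * 1024 * 1024;  // the 32MB cap I'd prefer

    public static List<String[]> load(String path) throws IOException {
        List<String[]> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            String line;
            // Stop after a fixed number of rows today...
            while ((line = reader.readLine()) != null && rows.size() < MAX_ROWS) {
                rows.add(line.split(","));
                // ...but what I'd really like is something along these lines:
                // if (estimatedSizeOf(rows) > MAX_BYTES) break;
                // where estimatedSizeOf() is the part I don't know how to write.
            }
        }
        return rows;
    }
}
```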