I have a Tomcat instance which is exhibiting the following behaviour:
- Accept a single incoming HTTP request.
- Issue one request to a backend server and get back about 400 KB of XML.
- Pass that XML through, transforming it into about 400 KB of JSON.
- Return the JSON response (roughly as sketched below).
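In outline the handler does something like this (a simplified sketch only: the servlet name, the backend URL and the `xmlToJson` helper are illustrative placeholders, not my actual code):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class PassThroughServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Fetch ~400 KB of XML from the backend; buffering it into a String
        // is already one obvious source of short-lived garbage.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://backend.example.com/data").openConnection();
        StringBuilder xml = new StringBuilder(400 * 1024);
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            char[] buf = new char[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                xml.append(buf, 0, n);
            }
        }

        // Transform the XML into JSON (stand-in for the real transformation code).
        String json = xmlToJson(xml.toString());

        resp.setContentType("application/json");
        resp.getWriter().write(json);
    }

    private String xmlToJson(String xml) {
        // Placeholder: the real code parses the XML and writes equivalent JSON.
        return xml;
    }
}
```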
The problem is that in the course of handling this 400 KB request my webapp generates about 100 MB of garbage, which fills up the Eden space and triggers a young generation collection.
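One way to put a number on the per-request allocation is HotSpot's per-thread allocation counters. A minimal sketch (the filter name is illustrative, and it assumes the javax.servlet API and a HotSpot/OpenJDK JVM, where the bean returned by `ManagementFactory.getThreadMXBean()` also implements `com.sun.management.ThreadMXBean`):

```java
import java.io.IOException;
import java.lang.management.ManagementFactory;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

public class AllocationLoggingFilter implements Filter {

    // HotSpot-specific extension of the standard ThreadMXBean that reports
    // how many bytes each thread has allocated.
    private final com.sun.management.ThreadMXBean threadMXBean =
            (com.sun.management.ThreadMXBean) ManagementFactory.getThreadMXBean();

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        long threadId = Thread.currentThread().getId();
        long before = threadMXBean.getThreadAllocatedBytes(threadId);
        try {
            chain.doFilter(request, response);
        } finally {
            long allocated = threadMXBean.getThreadAllocatedBytes(threadId) - before;
            // Roughly how much this request allocated on the request thread.
            System.out.printf("Request allocated ~%d KB%n", allocated / 1024);
        }
    }

    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void destroy() { }
}
```

This only counts allocation done on the thread that handles the request, but that should cover the XML-to-JSON work here. It tells me how much garbage is produced, though, not where it is allocated, which is what I really need.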
I have tried to use the JVM's built-in hprof agent to do allocation-sites profiling, but Tomcat didn't seem to start up properly with it in place. It's possible I was just being impatient: I imagine allocation profiling has a high overhead, so Tomcat startup might take a very long time.
What are the best tools for doing Java memory profiling of very young objects/garbage? I can't use heap dumps, because the objects I'm interested in are already garbage by the time a dump is taken.