Analysing the memory consumption of a Java application should not be done with OS tooling, IMHO. A running JVM allocates a chunk of memory from the OS and is unlikely to release it, even when the JVM does not actually need all of it at a given point in time. An OS tool will therefore only report the amount of memory the JVM has reserved from the OS, not the amount that is actually in use inside the JVM.
The tooling shipped with the JDK (jstat, jconsole, jvisualvm) is much more reliable. When interpreting memory usage, the most important figure is the size of the heap actually in use. A Java application will typically display a sawtooth pattern: the amount of heap in use rises gradually over time, then drops off sharply when the GC frees heap space by removing objects that are no longer reachable.
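To see the difference between what the OS reports and what the application actually uses, you can also query the JVM from inside the process via the standard MemoryMXBean. A minimal sketch (class name HeapSnapshot is my own choice):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapSnapshot {
    public static void main(String[] args) {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        // 'used' is what the application's objects currently occupy;
        // 'committed' is what the JVM has reserved from the OS, which is
        // roughly the figure an OS tool like top would show you.
        System.out.println("used      = " + heap.getUsed());
        System.out.println("committed = " + heap.getCommitted());
        System.out.println("max       = " + heap.getMax());
    }
}
```

On most JVMs, committed will noticeably exceed used, which is exactly why OS-level numbers are misleading.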
A definite warning signal is a sawtooth that is slowly ascending: the sharp drop caused by the GC ends a little higher every time. If the application runs for a long time (typical for a server application), this will most probably cause an OutOfMemoryError in the long run.
Another thing to watch for is a sawtooth whose 'teeth' are getting sharper and higher. This also indicates that the application needs more and more memory over time.
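The ascending-sawtooth case is almost always a reference that is never released. A contrived sketch of the pattern (class and field names are my own, sizes are arbitrary):

```java
import java.util.ArrayList;
import java.util.List;

public class LeakDemo {
    // A static collection that is never cleared keeps every object
    // reachable, so the GC can never reclaim them: the low points of
    // the sawtooth creep upward until an OutOfMemoryError.
    static final List<byte[]> RETAINED = new ArrayList<>();

    public static void main(String[] args) {
        for (int i = 0; i < 100; i++) {
            RETAINED.add(new byte[1024]);  // leak: reference kept forever
            byte[] temp = new byte[1024];  // short-lived: eligible for GC
        }
        System.out.println("retained objects: " + RETAINED.size());
    }
}
```

In jconsole or jvisualvm, the RETAINED allocations show up as the rising floor of the sawtooth, while the temp allocations are what the GC reclaims at each sharp drop.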
If you want to analyse the root cause of a perceived problem, you need to analyse the number of objects created and how long they live. This is not trivial.
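A reasonable first step is a class histogram from the JDK's own jmap tool, which shows instance counts and byte totals per class for a running JVM (replace <pid> with the target process id, which jps will list):

```
# instance counts and sizes per class; :live forces a full GC first,
# so only objects that survive collection are counted
jmap -histo:live <pid>
```

Taking two histograms some minutes apart and comparing the counts for your own classes is often enough to spot which type of object is accumulating.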