What is the optimal free-heap to total-heap ratio? At what values of this ratio should I consider increasing or decreasing the heap size?
The ideal momentary ratio is 1: ideally, your JVM would consume exactly the memory it requires, no more and no less. That's a very hard target to hit ;)
The problem (as TNilsson points out) is that your application's memory requirements change over time as it does work, so you want enough headroom that collection/compaction doesn't happen more often than you can tolerate, while consuming little enough space that you don't have to buy more RAM.
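If you want to see where you actually stand, you can sample the ratio yourself. Here is a minimal sketch using the standard Runtime API; the one-second interval and the class name are just placeholders for illustration:

    // Samples the free/total heap ratio once per second using java.lang.Runtime.
    public class HeapRatioMonitor {
        public static void main(String[] args) throws InterruptedException {
            Runtime rt = Runtime.getRuntime();
            while (true) {
                long total = rt.totalMemory();   // heap currently committed by the JVM
                long free  = rt.freeMemory();    // unused space within that committed heap
                long max   = rt.maxMemory();     // the -Xmx ceiling
                double ratio = (double) free / total;
                System.out.printf("free/total = %.2f  (total %d MB of max %d MB)%n",
                        ratio, total / (1024 * 1024), max / (1024 * 1024));
                Thread.sleep(1000);
            }
        }
    }

Watching how that number moves under real load tells you far more than any single target value.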
This probably depends on the rate at which you allocate new objects. Garbage collection involves a lot of work tracing references from live objects. I have just been dealing with a situation where there was plenty of free memory (say 500 MB used, 500 MB free) but so much array allocation was happening that the JVM would spend 95% of its time doing GC. So don't forget about the runtime memory behaviour.
All those performance tuning articles that say something like "object allocation is really fast in Java" without mentioning that some allocations cause 1 second of GC time make me laugh.
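If you suspect you are in that situation, measure the share of time spent in GC rather than guessing. Here is a rough sketch using the standard GarbageCollectorMXBean API; the churn loop and the 1 MB allocation size are made up purely to provoke collections:

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    // Allocates short-lived arrays in a tight loop and reports how much time
    // the collectors spent, using the standard GC MXBeans.
    public class GcTimeShare {
        public static void main(String[] args) {
            long start = System.nanoTime();
            long sink = 0;
            for (int i = 0; i < 100_000; i++) {
                byte[] garbage = new byte[1024 * 1024]; // 1 MB that becomes dead immediately
                sink += garbage[0];                     // touch it so it isn't optimised away
            }
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;

            long gcMs = 0;
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                gcMs += gc.getCollectionTime(); // cumulative ms this collector has run since JVM start
            }
            System.out.printf("GC: %d ms of %d ms elapsed (~%.0f%%), checksum %d%n",
                    gcMs, elapsedMs, 100.0 * gcMs / Math.max(1, elapsedMs), sink);
        }
    }

A high percentage here with plenty of free heap is exactly the "fast allocation, expensive GC" trap described above.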
There is no single easy answer; let me give you two examples:
Example 1 - Your program allocates 100M worth of memory at startup, and then does not allocate any memory whatsoever for the rest of its run.
In this case, you clearly want a heap size of 100M (well, perhaps 101M or so, but you get the point...) to avoid wasting space.
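For what it's worth, in a case like this you would typically just pin the heap with the standard -Xms/-Xmx flags when launching the JVM (MyApp is a placeholder class name):

    java -Xms100m -Xmx100m MyApp

Setting the initial and maximum sizes equal keeps the heap at 100M for the whole run.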
Example 2 - Your program allocates 10M of memory per second, and none of that data lives longer than 1 second (e.g. you are doing a calculation that requires a lot of temporary data and will return a single integer when you are done...).
Knowing the exact numbers is perhaps not so realistic, but it's an example, so bear with me... Since you have 10M of "live" data, you will need at least 10M of heap. Beyond that, you need to check how your garbage collector works. Simplified, the time a GC takes to complete is O(live set); the amount of "dead" data does not really enter into it. With a constant live set size, your GC time is roughly constant no matter how big the heap is, but a larger heap lets more garbage accumulate between collections, so you collect less often. This leads to larger heap -> better throughput.
(Now, to really mess things up, add stuff like compaction of the heap and the picture becomes even less clear...)
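If you want to see the throughput effect for yourself, here is a rough sketch of the Example 2 workload; the 10M/s rate and the one-second window are the example's numbers, everything else is made up. Run it twice with a small and a large -Xmx and compare the reported GC counts:

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;
    import java.util.ArrayDeque;

    // Example 2 as code: roughly 10 MB/s of allocations, with only the last
    // second's worth kept live at any moment.
    public class ChurnExample {
        public static void main(String[] args) throws InterruptedException {
            ArrayDeque<byte[]> lastSecond = new ArrayDeque<>();
            for (int second = 0; second < 30; second++) {
                for (int i = 0; i < 10; i++) {
                    lastSecond.addLast(new byte[1024 * 1024]); // 1 MB chunk
                    if (lastSecond.size() > 10) {
                        lastSecond.removeFirst(); // anything older than ~1 s becomes garbage
                    }
                    Thread.sleep(100);
                }
            }
            long gcCount = 0, gcMs = 0;
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                gcCount += gc.getCollectionCount();
                gcMs += gc.getCollectionTime();
            }
            System.out.printf("%d collections, %d ms total GC time%n", gcCount, gcMs);
        }
    }

With the bigger heap you should see fewer collections for roughly the same total work, which is the "larger heap -> better throughput" point above.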
Conclusion: It's a simplified version of the matter, but the short answer is - it depends.