As I understand it, the JVM has a limit of 64KB of compiled code per method. I have a tool which generates Java code to be run, and sometimes the generated code contains methods which are longer than this.

Does there exist an automated way of transforming a Java class file with overly long methods into one which produces the same results but which can be compiled?

In a simple example, the following code:

public void longMethod()
{
    doSomething1();
    doSomething2();
    /* snip */
    doSomething20000();
}

might be transformed into:

public void longMethod()
{
    longMethod_part1();
    longMethod_part2();
    /* snip */
    longMethod_part10();
}

public void longMethod_part1()
{
    doSomething1();
    /* snip */
    doSomething1000();
}

/* snip */

public void longMethod_part10()
{
    doSomething9001();
    /* snip */
    doSomething10000();
}
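For straight-line code like this, the splitting itself is mechanical, so it could plausibly be automated at the source level. A sketch of a chunking emitter (the class name and the split-by-statement-count heuristic are illustrative; a real tool would need to estimate bytecode size rather than count statements):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedEmitter {
    // Splits a flat list of statements into a dispatcher method plus
    // helper methods of at most chunkSize statements each.
    static List<String> emit(List<String> statements, int chunkSize) {
        List<String> methods = new ArrayList<>();
        int parts = (statements.size() + chunkSize - 1) / chunkSize;

        StringBuilder dispatcher = new StringBuilder("public void longMethod() {\n");
        for (int p = 1; p <= parts; p++) {
            dispatcher.append("    longMethod_part").append(p).append("();\n");
        }
        dispatcher.append("}");
        methods.add(dispatcher.toString());

        for (int p = 0; p < parts; p++) {
            StringBuilder helper = new StringBuilder(
                "public void longMethod_part" + (p + 1) + "() {\n");
            int end = Math.min((p + 1) * chunkSize, statements.size());
            for (String s : statements.subList(p * chunkSize, end)) {
                helper.append("    ").append(s).append("\n");
            }
            helper.append("}");
            methods.add(helper.toString());
        }
        return methods;
    }
}
```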

However, there are complications, e.g. the long method might be an extremely long if/else if chain. A best-efforts tool would be of interest even if the general case is too difficult.
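For the if/else-chain case, one best-efforts transformation I could imagine is partitioning on the tested value, so each extracted part keeps only the branches for its sub-range (the names and values here are illustrative, and this only applies cleanly when every branch tests the same variable):

```java
public class ChainSplit {
    // Original: one method with thousands of else-if branches on `code`.
    // Split: a small dispatcher partitioning the key range, plus
    // range-limited parts that each stay under the bytecode limit.
    static String dispatch(int code) {
        if (code < 2) {
            return dispatchLow(code);
        } else {
            return dispatchHigh(code);
        }
    }

    static String dispatchLow(int code) {
        if (code == 0) return "zero";
        else return "one";
    }

    static String dispatchHigh(int code) {
        if (code == 2) return "two";
        else return "many";
    }
}
```

Branches with arbitrary, unrelated conditions would be much harder to split this way.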

EDIT: Several kind and well-meaning people have suggested fixing the tool which generates these long methods. While this is an excellent idea, it is not one I can take advantage of. I would still welcome any ideas for the general problem I pose above.

+1  A: 

Safe tools for this exist in Eclipse, for example, but they are intended to be used interactively by a developer. I am thinking of the "Extract Method" refactoring, which has nice properties:

  • it automatically fixes up the calling code
  • it is guaranteed to be correct

I guess this does not correspond to your need, though; you would like something with no human action, wouldn't you?

Maybe some tools build on that capability?...


I also support Joachim's proposal of fixing the tool :-)

The code could be simplified using:

  • define methods for common sequences of calls (even manually), and have the tool recognize and call them; the result will be much shorter.
  • turn the code sequence into a data sequence: instead of emitting one method call after another, emit a loop that reads data and does the correct job for each item. The point is that the data can come from any structure (a file, a stream, a database); it doesn't have to be code.
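A minimal sketch of the second idea, assuming the generated steps can be represented as a list of actions (here `Runnable`s; they could equally be read from a file or database):

```java
import java.util.List;

public class DataDriven {
    static final StringBuilder log = new StringBuilder();

    // One short loop replaces thousands of generated call statements:
    // the method's bytecode size stays constant however many steps there are.
    static void runSteps(List<Runnable> steps) {
        for (Runnable step : steps) {
            step.run();
        }
    }

    public static void main(String[] args) {
        // The generator would populate this list instead of
        // generating one statement per step.
        runSteps(List.of(
            () -> log.append("step1;"),
            () -> log.append("step2;"),
            () -> log.append("step3;")
        ));
        System.out.println(log);
    }
}
```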


I like Steve's comment also. Maybe, if you can't fix the code, and you can't fix the tool, you have to fix the author.

Maybe it is possible to use the tool in a different way, so that its output is not so bad...

KLE
Fixing the tool is not really an option
Simon Nickerson
A: 

Clearly, if one had the choice, modifying the generating tool to break big methods apart would be the easiest option. If that can't be accomplished, you need another answer.

The 64KB limit comes from the class file format itself: bytecode offsets within a method are 16-bit, so a method's code array cannot exceed 65535 bytes. To solve the problem, you need a tool that can parse Java source code with the same level of accuracy as the Java compiler, estimate the amount of bytecode each method will produce, and transform overly large methods into smaller ones that don't violate the limit.

What you need is a Java source code analysis and transformation system. (A bytecode analyzer won't work, because you can't get the compiler to produce bytecode for methods that exceed the limit in the first place.)

The DMS Software Reengineering Toolkit could be configured to accomplish this task. It isn't an off-the-shelf solution (but your problem isn't standard either), and it is practical to use it to process code accurately in this way. Configuring DMS to do this isn't an afternoon's work, though.

Ira Baxter