views: 440
answers: 7

I've been told that there is some overhead in using the Java try-catch mechanism. So, while it is necessary to put calls to methods that throw checked exceptions inside a try block to handle the possible exceptions, it is good practice, performance-wise, to limit the size of the try block to only those operations that could throw exceptions.

I'm not so sure that this is a sensible conclusion.

Consider the two implementations below of a function that processes a specified text file.

Even if it is true that the first one incurs some unnecessary overhead, I find it much easier to follow. Just from reading the statements it is less clear exactly where the exceptions can come from, but the comments make clear which statements are responsible.

The second one is much longer and more complicated than the first. In particular, the nice line-reading idiom of the first has to be mangled to fit the readLine call into a try block.

What is the best practice for handling exceptions in a function whose body can throw multiple exceptions?

This one contains all the processing code within the try block:

void processFile(File f)
{
  try
  {
    // construction of FileReader can throw FileNotFoundException
    BufferedReader in = new BufferedReader(new FileReader(f));

    // call of readLine can throw IOException
    String line;
    while ((line = in.readLine()) != null)
    {
      process(line);
    }
  }
  catch (FileNotFoundException ex)
  {
    handle(ex);
  }
  catch (IOException ex)
  {
    handle(ex);
  }
}

This one wraps only the calls that can throw exceptions in try blocks:

void processFile(File f)
{
  FileReader reader;
  try
  {
    // construction of FileReader can throw FileNotFoundException
    reader = new FileReader(f);
  }
  catch (FileNotFoundException ex)
  {
    handle(ex);
    return;
  }

  BufferedReader in = new BufferedReader(reader);

  String line;
  while (true)
  {
    try
    {
      // call of readLine can throw IOException
      line = in.readLine();
    }
    catch (IOException ex)
    {
      handle(ex);
      break;
    }

    if (line == null)
    {
      break;
    }

    process(line);
  }
}
+1  A: 

There is very little benefit to the second method. After all, if you can successfully open a file but not read from it, then there is something very wrong with your computer, so knowing that the IOException came from the readLine() method is very rarely useful. Also, as you know, different exceptions are thrown for different problems anyway (FileNotFoundException, etc.).

As long as you scope the try block around a 'logical' unit of work, i.e. opening, reading, and closing a file in one go, I would go with the first method. It's much simpler to read and, especially when dealing with IO, the processor cycles used by any try-catch overhead would be minimal, if any.

oedo
A: 

The second method will generate a compiler error saying that reader may not have been initialized. You can get around that by initializing it to null, but that just means you could get an NPE instead, and there's no advantage to that.
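
For what it's worth, a minimal sketch of that work-around (the method name is made up for illustration), assuming the catch block falls through instead of returning:

void processFileNullInit(File f)
{
  // Initializing reader to null satisfies the definite-assignment check ...
  FileReader reader = null;
  try
  {
    reader = new FileReader(f);
  }
  catch (FileNotFoundException ex)
  {
    handle(ex);
    // ... but without a return here, execution continues with reader == null
  }

  // ... and if that happened, this line throws a NullPointerException
  BufferedReader in = new BufferedReader(reader);
}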

Matt McHenry
No such compiler error will be generated. The `return` statement at the end of the `catch` block makes sure that reader is either initialized or never read! Of course, I only added this statement after the compiler issued exactly the error you describe. :)
isme
Quite so -- I missed that. Thank goodness the compiler is here to figure this stuff out for us. :)
Matt McHenry
+5  A: 

I've been told that there is some overhead in using the Java try-catch mechanism.

Absolutely. And there's overhead to method calls, too. But you shouldn't put all your code in one method.

Not to toot the premature optimization horn, but the focus should be on ease of reading, organization, etc. Language constructs rarely impact performance as much as system organization and choice of algorithms.

To me, the first is easiest to read.

Jonathon
+1  A: 

Putting the try blocks around just the specific code that may throw an exception makes it, in my opinion, easier to read. You're likely to want to display a different message for each error and provide instructions to the user, which will differ depending on where the error occurs.

However, the performance issue that most people refer to is related to raising the exception, not to the try block itself.

In other words, as long as you never have an error raised, the try block won't noticeably affect performance. You shouldn't consider a try block just another flow control construct and raise an error to branch through your code. That's what you want to avoid.
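
To make that concrete, here is a minimal sketch of the kind of exception-driven branching to avoid (the method is hypothetical and assumes non-null array elements); the cost lies in the exception that actually gets thrown, not in the try block around it:

// Anti-pattern sketch: the loop relies on running off the end of the array
// and catching ArrayIndexOutOfBoundsException instead of checking the bounds.
int indexOfOrMinusOne(String[] items, String target)
{
  try
  {
    int i = 0;
    while (!items[i].equals(target))
    {
      i++;
    }
    return i;
  }
  catch (ArrayIndexOutOfBoundsException ex)
  {
    // Using the exception as an "element not found" branch: this is the
    // flow-control style to avoid. A plain i < items.length check is cheaper.
    return -1;
  }
}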

Marcus Adams
Just an additional note on your last paragraph. If you have a try/catch *inside* of a loop, you can catch an exception and continue looping. If your try/catch is *outside* of the loop, your looping will be aborted. Either one might be what you want and will influence where you put the try/catch.
Jonathon
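
To illustrate the difference described in the comment above, a minimal sketch (assuming process can throw an unchecked exception for a bad line and that handle accepts it): with the try/catch inside the loop one bad line is skipped and looping continues, while with it outside the first failure ends the loop.

// try/catch inside the loop: a failed line is handled and looping continues
void processLinesSkippingBadOnes(BufferedReader in) throws IOException
{
  String line;
  while ((line = in.readLine()) != null)
  {
    try
    {
      process(line);
    }
    catch (RuntimeException ex)
    {
      handle(ex);   // skip this line, keep reading the rest
    }
  }
}

// try/catch outside the loop: the first failure abandons the whole loop
void processLinesOrStopAtFirstError(BufferedReader in) throws IOException
{
  try
  {
    String line;
    while ((line = in.readLine()) != null)
    {
      process(line);
    }
  }
  catch (RuntimeException ex)
  {
    handle(ex);     // the loop has already been aborted at this point
  }
}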
+17  A: 
erickson
Exactly. It's also worth noting, for the record, that the overhead associated with try-catch only occurs if an exception is thrown or the block has a finally clause, as detailed in the VM spec http://java.sun.com/docs/books/jvms/second_edition/html/Compiling.doc.html#9934.
ig0774
@ig0774 So, if I add a finally block like `finally { in.close(); }`, does the associated overhead take effect? From what I understand of the spec, the finally clause simply adds an extra instruction to the compiled try block to call the finally block as a subroutine before returning normally.
isme
+1 for the reference to the spec, btw.
isme
@isme: that single instruction is all I meant by "overhead" for the finally block; it is different from the try {} block, which adds no instructions or special handling on its own.
ig0774
So the amount of overhead incurred by the try-catch-finally mechanism is proportional to the number of finally blocks. And there can be at most one finally block. So, in the worst case, the JVM executes one extra instruction which says "execute the finally block". In particular, the amount of overhead is in no way proportional to the size of the try block! Thanks, erickson and ig0774, for making this clear and simple to me!
isme
+2  A: 

No. The only thing that you should be considering is where you can reasonably handle the exception and what resources you need to reclaim (with finally).
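
For reference, a minimal sketch of the first implementation from the question with a finally clause that reclaims the reader (note that close itself can throw, hence the nested try):

void processFile(File f)
{
  BufferedReader in = null;
  try
  {
    // construction of FileReader can throw FileNotFoundException
    in = new BufferedReader(new FileReader(f));

    // call of readLine can throw IOException
    String line;
    while ((line = in.readLine()) != null)
    {
      process(line);
    }
  }
  catch (FileNotFoundException ex)
  {
    handle(ex);
  }
  catch (IOException ex)
  {
    handle(ex);
  }
  finally
  {
    if (in != null)
    {
      try
      {
        in.close();   // close can throw IOException as well
      }
      catch (IOException ex)
      {
        handle(ex);
      }
    }
  }
}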

CurtainDog
+1 for mentioning cleaning up resources. In the OP only the first implementation lends itself to adding a single `finally` clause that closes the file.
Kevin Brock
+1 for mentioning cleanup. Until now I'd never called `close` on any `Reader` objects I used. Shame on me!
isme
A: 

This is premature optimization at its worst. Don't do it.

"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil" - Knuth.

luis.espinal