views: 244
answers: 3

Is there a way to limit the maximum buffer size to be read from an ObjectInputStream in java?

I want to stop the deserialization if it becomes clear that the Object in question has been maliciously crafted to be huge.

Of course, there is ObjectInputStream.read(byte[] buf, int off, int len), but I do not want to suffer the performance penalty of allocating, say, a byte[1000000].

Am I missing something here?

+4  A: 

You write a FilterInputStream that throws an exception if it discovers it has read more than a certain amount of data from its underlying stream.
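A minimal sketch of this idea (the class name, the counting logic, and the exception type are illustrative, not part of the original answer):

    import java.io.FilterInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    class LimitedInputStream extends FilterInputStream {
        private final long limit;   // maximum number of bytes allowed through
        private long bytesRead;     // bytes read so far

        LimitedInputStream(InputStream in, long limit) {
            super(in);
            this.limit = limit;
        }

        @Override
        public int read() throws IOException {
            int b = super.read();
            if (b >= 0) {
                addCount(1);
            }
            return b;
        }

        @Override
        public int read(byte[] buf, int off, int len) throws IOException {
            int n = super.read(buf, off, len);
            if (n > 0) {
                addCount(n);
            }
            return n;
        }

        private void addCount(long n) throws IOException {
            bytesRead += n;
            if (bytesRead > limit) {
                throw new IOException("Read more than " + limit + " bytes");
            }
        }
    }

You would then deserialize with something like new ObjectInputStream(new LimitedInputStream(socket.getInputStream(), 1024 * 1024)), picking whatever limit you consider safe.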

Jon Skeet
Thank you very much! Very nice wrapper!
SAL9000
+2  A: 

I can see two ways:

1) Do your reads in a loop, grabbing chunks whose allocation size you're comfortable with, and stop when you hit your limit (sketched below); or
2) Allocate your max-size buffer once and re-use it for subsequent reads.
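A minimal sketch of option 1 (the class name, chunk size, and limit are illustrative):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;

    class LimitedReader {
        // Read the stream in small, fixed-size chunks and fail once maxBytes is exceeded.
        static byte[] readWithLimit(InputStream in, int maxBytes) throws IOException {
            byte[] chunk = new byte[8 * 1024];               // small, reusable chunk buffer
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            int n;
            while ((n = in.read(chunk)) != -1) {
                if (out.size() + n > maxBytes) {
                    throw new IOException("Input exceeded " + maxBytes + " bytes");
                }
                out.write(chunk, 0, n);
            }
            return out.toByteArray();
        }
    }

The bytes collected this way can then be handed to an ObjectInputStream via a ByteArrayInputStream.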

CPerkins
Thank you so much! I will use No. 1 and try to wrap it in Jon Skeet's FilterInputStream suggestion.
SAL9000
You're welcome. Jon Skeet's answer is superior, imo. They're equivalent from a "what happens" perspective, but FilterInputStream is much clearer and more standard. Less wheel-reinvention.
CPerkins
+1  A: 

Actually, there's a really easy way.

You can use NIO's ByteBuffer and its allocateDirect method. This allocates a direct (off-heap) buffer, so channel reads go into it without an extra copy on the Java heap, and you can cap its size up front.

Then, instead of getting the stream from the socket, get the Channel.

Code:

    Socket s = ...; // must have been created via SocketChannel.open(), or s.getChannel() returns null

    ByteBuffer buffer = ByteBuffer.allocateDirect(10 * 1024 * 1024); // cap the buffer at 10 MB
    s.getChannel().read(buffer);

Now, don't try to call the "array()" method on the byte buffer; it doesn't work on a directly-allocated buffer. However, you can wrap the buffer as an input stream and send it to the ObjectInputStream for further processing.
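The JDK has no built-in adapter for this, but a small one is easy to write. A minimal sketch (the class name is illustrative); remember to flip() the buffer after the channel read before handing it to the stream:

    import java.io.InputStream;
    import java.nio.ByteBuffer;

    class ByteBufferInputStream extends InputStream {
        private final ByteBuffer buffer;

        ByteBufferInputStream(ByteBuffer buffer) {
            this.buffer = buffer;
        }

        @Override
        public int read() {
            // Signal end-of-stream once the buffer is exhausted.
            return buffer.hasRemaining() ? (buffer.get() & 0xFF) : -1;
        }

        @Override
        public int read(byte[] dst, int off, int len) {
            if (!buffer.hasRemaining()) {
                return -1;
            }
            int n = Math.min(len, buffer.remaining());
            buffer.get(dst, off, n);
            return n;
        }
    }

It could then be used as new ObjectInputStream(new ByteBufferInputStream(buffer)).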

Aviad Ben Dov