I'm writing a tool that, given any URL, periodically fetches its content. The problem is that the response might not be a simple, lightweight HTML page (which I expect in most cases), but some heavy data stream (e.g. piped straight from /dev/urandom, a possible DoS attack).
I'm using java.net.URL + java.net.URLConnection, with the connect and read timeouts set to 30 seconds. Currently the input is read through a java.io.BufferedReader, using readLine().
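For reference, the current read loop looks roughly like this (a simplified sketch; the URL is just a placeholder):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class Fetcher {
    public static void main(String[] args) throws Exception {
        URLConnection conn = new URL("http://example.com/").openConnection(); // placeholder URL
        conn.setConnectTimeout(30_000); // 30 sec connect timeout
        conn.setReadTimeout(30_000);    // 30 sec read timeout

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // process the line...
            }
        }
    }
}
```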
Possible solutions:

- Use java.io.BufferedReader.read() byte by byte, counting the bytes and closing the connection once a limit is reached (see the sketch after this list). The problem is that an attacker could transmit one byte every 29 seconds, so the read/connect timeout would almost never fire (204800 B * 29 sec ≈ 68 days).
- Limit the thread's execution time to 1-5 minutes and keep using java.io.BufferedReader.readLine(). Are there any problems with that?
I feel like I'm trying to reinvent the wheel and that the solution is very straightforward; it just doesn't come to mind.
Thanks in advance.