I'm requesting a webpage with sockets like this:

Socket sock = null;
PrintWriter out = null;
BufferedReader in = null;

try {
    sock = new Socket( "www.overvprojects.nl", 80 );
    out = new PrintWriter( sock.getOutputStream(), true );
    in = new BufferedReader( new InputStreamReader( sock.getInputStream() ) );
} catch ( UnknownHostException e ) {
    e.printStackTrace();
} catch ( IOException e ) {
    e.printStackTrace();
}

out.println( "GET /ip.php HTTP/1.1" );
out.println( "Host: www.overvprojects.nl" );
out.println( "" );
out.flush();

String buf = "";

while ( buf != null )
{
    try {
        buf = in.readLine();
    } catch ( IOException e ) {
        e.printStackTrace();
    }

    if ( buf != null && Pattern.matches( "^(25[0-5]|2[0-4]\\d|[0-1]?\\d?\\d)(\\.(25[0-5]|2[0-4]\\d|[0-1]?\\d?\\d)){3}$", buf ) ) {
        ipText.setText( buf );

        break;
    }
}

try {
    in.close();
    out.close();
    sock.close();
} catch ( IOException e ) {
    e.printStackTrace();
}

However, the program seems to wait until the connection times out. I've debugged the loop and I'm absolutely sure break is called, yet the UI doesn't update until the connection is terminated, which takes over 10 seconds. After that the IP is visible, so break must have been reached at some point. I've also tried using the website www.whatismyip.org instead, and that finishes within two seconds.

+1  A: 

Repainting the screen happens to be done by the same thread that reads from the reader object. This is a classic case for multi-threading, so that screen rendering is not blocked by the process of reading the response.
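A minimal sketch of that idea, assuming a Swing UI (ipText.setText() suggests one); fetchIpOverSocket() is a hypothetical method that would contain the existing socket code and return the address:

// Sketch only: run the blocking socket I/O off the event dispatch thread,
// then hand the result back to the UI thread via javax.swing.SwingUtilities.
new Thread(new Runnable() {
    public void run() {
        final String ip = fetchIpOverSocket();   // hypothetical helper wrapping the socket code
        SwingUtilities.invokeLater(new Runnable() {
            public void run() {
                if (ip != null) {
                    ipText.setText(ip);
                }
            }
        });
    }
}).start();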

EDIT: Based on the comments, the behavior can be explained by the fact that both the client and the server have to perform cleanup when the socket is closed abruptly. In simple terms, the client is not reading all of the data from the input stream, so it takes far longer than usual for the JVM and the OS to perform the operations that actually release the resources, which shows up as an apparent lockup. The advice from others to use URL/URLConnection is therefore very valid in this case.
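For illustration, a rough sketch of the URL approach (it assumes ip.php returns just the address on the first line, and needs java.net.URL in addition to the stream classes already used in the question):

// Sketch: let URL/URLConnection handle the HTTP details.
try {
    URL url = new URL("http://www.overvprojects.nl/ip.php");
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(url.openStream()));
    String ip = reader.readLine();   // assumed: the body is a single line containing the IP
    reader.close();
    ipText.setText(ip);
} catch (IOException e) {
    e.printStackTrace();
}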

Vineet Reynolds
That is not the problem. The problem is that getting the webpage takes 20 seconds. I just looked at the application's network usage and there seems to be a 15-second delay between the request and the response.
Overv
Just in case I haven't been clear, the network latency is the reason why you must separate your screen rendering and event-dispatch activities from the logic that makes the network call and fetches the response.
Vineet Reynolds
By the way, I'm a bit flummoxed by your response. Are you trying to reduce the network latency, or are you trying to prevent the screen from locking up? Or is it something else?
Vineet Reynolds
If I use a URLConnection object to get the webpage, it takes 2 seconds. If I do it this way, it takes 20 seconds for the server to respond.
Overv
Ok, that nails it. Apparently, cleanup is not as easy as invoking close().
Vineet Reynolds
A: 

I might throw a break in the catch block after the read.
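Roughly like this, reusing the variable names from the question, so the loop bails out once a read has failed:

try {
    buf = in.readLine();
} catch (IOException e) {
    e.printStackTrace();
    break;   // stop looping once the read has failed
}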

Gabriel
+1  A: 

You need to check the Content-Length header returned by the server, read only that many bytes of the body, and then close the connection yourself.

Right now you are waiting until the server closes the connection, which is why it is taking 20 seconds.
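A rough sketch of that idea, reusing the in and sock variables from the question; it treats bytes and characters interchangeably (fine for a plain ASCII response like ip.php) and skips error handling such as a malformed Content-Length value:

// Sketch: read headers until the blank line, remember Content-Length,
// then read exactly that many characters of the body and close the socket.
int contentLength = -1;
String line;
while ((line = in.readLine()) != null && line.length() > 0) {
    if (line.toLowerCase().startsWith("content-length:")) {
        contentLength = Integer.parseInt(line.substring(15).trim());
    }
}

StringBuilder body = new StringBuilder();
if (contentLength > 0) {
    char[] chunk = new char[contentLength];
    int read, total = 0;
    while (total < contentLength
            && (read = in.read(chunk, total, contentLength - total)) != -1) {
        total += read;
    }
    body.append(chunk, 0, total);
}
sock.close();   // done; no need to wait for the server to close the connection
// the existing Pattern.matches(...) check can then be applied to body.toString().trim()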

Byron Whitlock
A: 

readLine is waiting for a newline, which the server never sends.
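If that is the case, reading into a plain character buffer returns as soon as data is available instead of blocking for a line terminator; a rough sketch reusing the in reader from the question:

char[] chunk = new char[1024];
try {
    // read() returns as soon as some data arrives, instead of blocking
    // until a line terminator shows up.
    int n = in.read(chunk);
    if (n > 0) {
        String data = new String(chunk, 0, n).trim();
        // the existing Pattern.matches(...) check can then be applied to data
    }
} catch (IOException e) {
    e.printStackTrace();
}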

irreputable