I would say it depends on exactly what interval you're trying to measure: the time from the last byte of the request you send until the first byte of the response you receive? Until the entire response is received? Or server-side processing time only?
If you're trying to measure server-side processing time only, you'll have a hard time factoring out the time your request and the response spend in network transit. Otherwise, since you're managing the request yourself through a Socket, you can measure the elapsed time between any two moments by checking the system clock and computing the difference. For example:
public void sendHttpRequest(byte[] requestData, Socket connection) throws IOException {
    // Timestamp just before the first request byte goes out
    long startTime = System.currentTimeMillis();
    writeYourRequestData(connection.getOutputStream(), requestData);
    byte[] responseData = readYourResponseData(connection.getInputStream());
    // Elapsed wall-clock time for the full request/response round trip
    long elapsedTime = System.currentTimeMillis() - startTime;
    System.out.println("Total elapsed HTTP request/response time in milliseconds: " + elapsedTime);
}
This measures the time from when you begin writing the request to when you finish receiving the response, and prints the result (assuming you've implemented the read/write methods yourself).
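If what you actually want is the time to first byte, the same approach works: take a second timestamp as soon as the first read() on the input stream returns. Here's a minimal sketch along those lines; the writeYourRequestData helper is hypothetical (standing in for however you send your raw request bytes), and it uses System.nanoTime(), which is better suited than System.currentTimeMillis() for elapsed-time measurement because it's monotonic:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

public class HttpTimingSketch {

    // Hypothetical helper: however you write your raw HTTP request bytes.
    static void writeYourRequestData(OutputStream out, byte[] requestData) throws IOException {
        out.write(requestData);
        out.flush();
    }

    public static void timeRequest(byte[] requestData, Socket connection) throws IOException {
        InputStream in = connection.getInputStream();
        long start = System.nanoTime(); // monotonic, unaffected by system clock adjustments

        writeYourRequestData(connection.getOutputStream(), requestData);

        int firstByte = in.read(); // blocks until the first response byte arrives
        if (firstByte == -1) {
            throw new IOException("Connection closed before any response data arrived");
        }
        long firstByteTime = System.nanoTime() - start;
        System.out.println("Time to first byte: " + firstByteTime / 1_000_000 + " ms");

        // Drain the remainder of the response to capture the total time.
        byte[] buffer = new byte[8192];
        while (in.read(buffer) != -1) {
            // discard; a real client would buffer or parse the response here
        }
        long totalTime = System.nanoTime() - start;
        System.out.println("Total request/response time: " + totalTime / 1_000_000 + " ms");
    }
}

Note that reading until read() returns -1 assumes the server closes the connection after the response; with keep-alive connections you'd stop after Content-Length bytes (or the final chunk) instead.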