I’m using a Java socket connected to a server. If I send a HEAD HTTP request, how can I measure the response time from the server? Must I use a provided Java timer, or is there an easier way?
I’m looking for a short answer; I don’t want to use other protocols, etc. Obviously, I also don’t want a solution that ties my application to a specific OS. Please people, IN-CODE solutions only.
Answer
I would say it depends on what exact interval you are trying to measure: the time from the last byte of the request you send until the first byte of the response you receive? Or until the entire response is received? Or are you trying to measure the server-side time only?
If you’re trying to measure the server side processing time only, you’re going to have a difficult time factoring out the amount of time spent in network transit for your request to arrive and the response to return. Otherwise, since you’re managing the request yourself through a Socket, you can measure the elapsed time between any two moments by checking the System timer and computing the difference. For example:
public void sendHttpRequest(byte[] requestData, Socket connection) throws IOException {
    long startTime = System.nanoTime();
    writeYourRequestData(connection.getOutputStream(), requestData);
    byte[] responseData = readYourResponseData(connection.getInputStream());
    long elapsedTime = System.nanoTime() - startTime;
    System.out.println("Total elapsed http request/response time in nanoseconds: " + elapsedTime);
}
This code measures the time from when you begin writing out your request to when you finish receiving the response, and prints the result (assuming you have the read/write methods implemented). Note that `System.nanoTime()` is preferable to `System.currentTimeMillis()` here because it is a monotonic clock intended for measuring elapsed time.
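To make this concrete, here is a self-contained sketch of the same idea: it times how long a HEAD request takes until the first response byte arrives. So the example runs anywhere without network access, it spins up a throwaway local server on a free port; in practice you would connect your `Socket` to the real host instead. The class name `HeadTiming` and the minimal server are illustrative assumptions, not part of the original answer.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class HeadTiming {
    public static void main(String[] args) throws Exception {
        // Minimal local server so the example is runnable as-is;
        // replace with your real host/port in practice.
        ServerSocket server = new ServerSocket(0); // 0 = pick any free port
        Thread serverThread = new Thread(() -> {
            try (Socket s = server.accept()) {
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(s.getInputStream(), StandardCharsets.US_ASCII));
                // Consume request headers up to the blank line.
                String line;
                while ((line = in.readLine()) != null && !line.isEmpty()) { }
                OutputStream out = s.getOutputStream();
                out.write("HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n"
                        .getBytes(StandardCharsets.US_ASCII));
                out.flush();
            } catch (IOException ignored) { }
        });
        serverThread.start();

        try (Socket socket = new Socket("localhost", server.getLocalPort())) {
            String request = "HEAD / HTTP/1.1\r\n"
                    + "Host: localhost\r\n"
                    + "Connection: close\r\n\r\n";
            long start = System.nanoTime(); // start the clock just before sending
            OutputStream out = socket.getOutputStream();
            out.write(request.getBytes(StandardCharsets.US_ASCII));
            out.flush();
            // read() blocks until the first response byte arrives,
            // so this captures time-to-first-byte.
            int firstByte = socket.getInputStream().read();
            long elapsedNanos = System.nanoTime() - start;
            System.out.println("First response byte after " + elapsedNanos + " ns");
        } finally {
            server.close();
            serverThread.join();
        }
    }
}
```

If you wanted the time for the full response instead, you would keep reading until the stream is exhausted (or until `Content-Length` bytes are consumed) before taking the second `System.nanoTime()` reading.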