Measuring latency using timestamps.

I want to measure the one-way latency of packets from client to server rather than the RTT. So I send some TCP packets to the server and run Wireshark on both the client and the server. My thinking was that I could easily measure the latency using the timestamps, but Wireshark on the server side also shows a timestamp of zero when the packet arrives at the server. I want it to be comparable to the client's view (on the client side a timestamp of zero makes sense, but on the server side it should be the time the packet took to travel from client to server). How can I solve this problem, and if that is not possible, what other ways are there to find the latency? Thank you.
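Note: by default Wireshark displays timestamps relative to the start of each capture, which is why both captures begin at zero. One way to compare the two sides is to use absolute timestamps instead (View → Time Display Format → Date and Time of Day, or the `frame.time_epoch` field in tshark) and make sure the client and server clocks are synchronized (e.g., via NTP or PTP). Below is a minimal sketch of that idea, not a definitive tool: it assumes synchronized clocks, tshark on the PATH, and placeholder capture file names `client.pcap` / `server.pcap` for the same TCP flow captured on both machines.

```python
#!/usr/bin/env python3
"""Sketch: estimate one-way latency from two captures of the same TCP flow.

Assumptions (not from the original question): both clocks are synchronized,
tshark is installed, and client.pcap / server.pcap are placeholder names.
"""
import subprocess


def epoch_times_by_seq(pcap_path):
    """Map each raw TCP sequence number to its absolute capture time (epoch seconds)."""
    out = subprocess.run(
        ["tshark", "-r", pcap_path, "-Y", "tcp",
         "-T", "fields", "-e", "tcp.seq_raw", "-e", "frame.time_epoch"],
        capture_output=True, text=True, check=True).stdout
    times = {}
    for line in out.splitlines():
        parts = line.split("\t")
        if len(parts) != 2 or not parts[0]:
            continue  # skip malformed lines or frames without a TCP sequence number
        seq, t = parts
        times.setdefault(seq, float(t))  # keep the first occurrence only (ignore retransmissions)
    return times


client = epoch_times_by_seq("client.pcap")
server = epoch_times_by_seq("server.pcap")

# One-way delay = server capture time - client capture time for the same segment.
for seq in client:
    if seq in server:
        print(f"seq {seq}: one-way latency ~ {(server[seq] - client[seq]) * 1000:.3f} ms")
```

Matching segments by raw sequence number is only a rough heuristic, and the result is only as accurate as the clock synchronization between the two hosts; for precise one-way measurements, tighter synchronization (PTP or hardware timestamping) would be needed.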
