Measuring latency using timestamps.
I want to measure the one-way latency of packets from client to server, rather than the RTT. So I send some TCP packets to the server while running Wireshark on both the client and the server. My thinking was that I could easily measure the latency using the timestamps, but Wireshark on the server side also shows a timestamp of zero when the packet arrives at the server. I want it to use the same time reference as the client (at the client side a timestamp of zero makes sense, but at the server side it should then be the time the packet takes to travel from client to server). How can I solve that problem, and if that is not possible, what would be other ways to find the latency? Thank you.
What do you mean when you say that the timestamp is zero?
Timestamps in packet capture files are usually absolute times; in both pcap and pcapng files, which are Wireshark's native formats, they are stored as the time elapsed since January 1, 1970, 00:00:00 UTC, so a numerically zero pcap or pcapng timestamp would mean the packet was sent or received back in 1970.
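What you are most likely seeing is Wireshark's default "Time" column, which shows seconds since the start of the capture, so the first packet reads 0.000000; switching the display (View > Time Display Format > UTC Date and Time of Day) or reading the file programmatically shows the stored absolute times. As a quick check, here is a minimal Python sketch using scapy; the file name client.pcap is a placeholder for your own capture:

    # Minimal sketch, assuming a saved capture named client.pcap (placeholder).
    from scapy.all import rdpcap

    packets = rdpcap("client.pcap")
    for i, pkt in enumerate(packets[:5], start=1):
        # pkt.time is the absolute capture timestamp in seconds since
        # 1970-01-01 00:00:00 UTC, exactly as stored in the pcap/pcapng file.
        print(f"packet {i}: epoch time {float(pkt.time):.6f}")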
Thanks for the response. The first SYN packet contains a timestamp of zero, as shown below, at both the client and the server.
Exactly. I want the timestamp values in UTC so that I can measure the latency of a packet by subtracting its timestamps on the client and server side (I am running Wireshark on both machines and have synchronized both with a common NTP server).
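If both captures are saved to files and the clocks are NTP-synchronized, that subtraction can also be done outside Wireshark. The sketch below is one possible approach using scapy, with hypothetical file names client.pcap and server.pcap, matching TCP segments by addresses, ports, and sequence number; keep in mind that NTP usually keeps clocks within only a few milliseconds of each other, which bounds the accuracy of any one-way measurement.

    # Rough sketch, not a finished tool: the capture file names are placeholders,
    # and retransmitted segments with identical sequence numbers will collide.
    from scapy.all import rdpcap, IP, TCP

    def seg_key(pkt):
        # Identify a TCP segment by addresses, ports, and sequence number.
        return (pkt[IP].src, pkt[IP].dst, pkt[TCP].sport, pkt[TCP].dport, pkt[TCP].seq)

    client = {seg_key(p): float(p.time) for p in rdpcap("client.pcap") if IP in p and TCP in p}
    server = {seg_key(p): float(p.time) for p in rdpcap("server.pcap") if IP in p and TCP in p}

    for k, t_client in client.items():
        if k in server:
            # One-way latency = server capture time minus client capture time.
            one_way_ms = (server[k] - t_client) * 1000.0
            print(f"seq {k[4]}: one-way latency {one_way_ms:.3f} ms")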