how to measure network and server latency
I'm capturing data on both the client and server ports. I want to calculate the network latency by deducting the client-side and server-side system processing time from the total round-trip time.
The easiest way is this: for every TCP session, the iRTT (initial round-trip time) is calculated, because Wireshark measures the time between the SYN and the SYN,ACK. So we can call this the RTT, or network latency, for that session.
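The iRTT is just the difference between two capture timestamps. A minimal sketch, using hypothetical timestamps (in seconds) from a client-side capture:

```python
# Hypothetical capture timestamps, in seconds, from the client side.
syn_time = 0.000000      # client sends SYN
synack_time = 0.024500   # client receives SYN,ACK

# iRTT = time between the SYN and the SYN,ACK
irtt = synack_time - syn_time
print(f"iRTT: {irtt * 1000:.1f} ms")
```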
Thanks for the explanation, Christian.
This RTT also includes the processing time taken by the far-end server that sends the acknowledgment. Let me put it this way: 1. I'm capturing on the source. 2. I'm also capturing packets on the destination.
What I want to know is how much network latency contributes to the total RTT.
Example: A sends a packet to B; B processes the packet and sends an ACK back to A. A now has an RTT that includes the data-processing time taken by B, so that RTT is not my actual network latency. Network latency = total RTT minus the data-processing and ACK time taken by B.
How would we get this actual network latency with Wireshark?
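The split described above can be sketched with the two captures: the client-side capture gives the total RTT, and the server-side capture gives B's processing time (time between B receiving the packet and B sending the ACK). All timestamps below are hypothetical; each capture only needs to be self-consistent, since only differences within one capture are used:

```python
# Client-side capture (A's clock), in seconds:
a_send = 10.000000        # A sends the packet
a_recv_ack = 10.030000    # A receives B's ACK
total_rtt = a_recv_ack - a_send

# Server-side capture (B's clock), in seconds:
b_recv = 5.012000         # B receives the packet
b_send_ack = 5.017000     # B sends the ACK
processing = b_send_ack - b_recv

# Network latency = total RTT minus B's processing time
network_latency = total_rtt - processing
print(f"total RTT:       {total_rtt * 1000:.1f} ms")
print(f"B processing:    {processing * 1000:.1f} ms")
print(f"network latency: {network_latency * 1000:.1f} ms")
```

Note that the two clocks never need to be synchronized: each difference is taken within a single capture.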
Hello, if you have captured the 3-way handshake and the iRTT has been calculated, you have more or less the network response time. Since we can assume that only the TCP stack is involved in generating the SYN,ACK (so the server-side processing is negligible), we can take the iRTT as more or less the network round-trip time, and the one-way network delay will be roughly half of that RTT. Measuring RTT from data/ACK pairs instead is often harder, because mechanisms such as Delayed ACK come into play.
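The answer's estimate reduces to a simple halving. A sketch with a hypothetical iRTT value, under the stated assumption that the SYN,ACK is generated by the stack with negligible delay:

```python
# Hypothetical handshake iRTT, in milliseconds.
irtt_ms = 24.5

# Assuming negligible stack delay for the SYN,ACK,
# the one-way network delay is roughly half the round trip.
one_way_delay_ms = irtt_ms / 2
print(f"estimated one-way network delay: {one_way_delay_ms:.2f} ms")
```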
Asked: 2017-11-15 05:09:40 +0000
Seen: 30,893 times
Last updated: Nov 15 '17