
RTP Stream Analysis

asked 2023-12-05 17:41:33 +0000

DanShev

I'm trying to understand how jitter and delta are determined when analyzing a single outbound stream, and more specifically how that math is worked out. The RTP header timestamp increments consistently by 160 with no variance, so I'm assuming the RTP header timestamp isn't being used in the calculations. I'm also having a hard time figuring out how jitter can even be seen from a Wireshark capture of an outbound stream alone. I'm probably looking at this completely wrong. Any guidance would be appreciated.


1 Answer


answered 2023-12-06 06:42:31 +0000

hugo.vanderkooij

Jitter and delta are determined from the timestamps of the received packets. Some deviation is expected on the network; that is why you have jitter buffers. If you put time and delta in the first two columns, it is easy to follow how Wireshark makes the calculation.
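In a bit more detail, and touching on the RTP-timestamp question: Wireshark's RTP stream analysis follows the interarrival jitter calculation from RFC 3550, which does use the RTP header timestamp. The RTP timestamp delta (converted to seconds via the payload clock rate) gives the spacing the sender intended, and the jitter tracks how much the actual arrival spacing deviates from it. Below is a minimal Python sketch of that calculation; the 8 kHz clock rate and the sample values are assumptions for illustration, not values taken from any particular capture.

    # Sketch of per-packet delta and RFC 3550 interarrival jitter.
    # Assumes an 8 kHz payload clock (e.g. G.711), so an RTP timestamp
    # step of 160 corresponds to a nominal 20 ms packet spacing.
    CLOCK_RATE = 8000  # assumed payload clock rate in Hz

    def analyze(arrival_times, rtp_timestamps):
        """arrival_times: capture times in seconds,
        rtp_timestamps: raw RTP header timestamps (sample units)."""
        jitter = 0.0
        results = []
        for i in range(1, len(arrival_times)):
            # Delta: spacing between consecutive packets as captured.
            delta = arrival_times[i] - arrival_times[i - 1]
            # Expected spacing implied by the sender's RTP timestamps.
            expected = (rtp_timestamps[i] - rtp_timestamps[i - 1]) / CLOCK_RATE
            # D(i-1, i): deviation of arrival spacing from expected spacing.
            d = delta - expected
            # RFC 3550 running jitter estimate with 1/16 smoothing.
            jitter += (abs(d) - jitter) / 16
            results.append((delta, jitter))
        return results

    # Example: packets stamped exactly 20 ms apart, one arrives 3 ms late.
    arrivals = [0.000, 0.020, 0.043, 0.060]
    stamps = [0, 160, 320, 480]
    for delta, jit in analyze(arrivals, stamps):
        print(f"delta={delta * 1000:.1f} ms  jitter={jit * 1000:.3f} ms")

This is also why a constant RTP timestamp increment does not mean the timestamp is ignored: it supplies the expected spacing, and the jitter comes from the arrival times varying around it.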

Take the sample PCAP file from https://wiki.wireshark.org/uploads/__... and set Column 1 to Number, Column 2 to Time (I prefer time of day as the default), Column 3 to frame.time_delta, and Column 4 to rtp.timestamp.

Then it pretty much explains itself.
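If you prefer the command line, the same fields can be exported with tshark and fed into the sketch above. This is only a sketch: it assumes tshark is on the path, that the capture contains a single RTP stream already dissected as RTP, and "rtp.pcap" is just a placeholder filename.

    # Sketch: export arrival times and RTP timestamps with tshark.
    import subprocess

    out = subprocess.run(
        ["tshark", "-r", "rtp.pcap", "-Y", "rtp",
         "-T", "fields", "-e", "frame.time_epoch", "-e", "rtp.timestamp"],
        capture_output=True, text=True, check=True,
    ).stdout

    arrivals, stamps = [], []
    for line in out.splitlines():
        t, ts = line.split("\t")  # tshark separates fields with tabs by default
        arrivals.append(float(t))
        stamps.append(int(ts))
    # arrivals and stamps can now be passed to the analyze() sketch above.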


