
RTP Stream Analysis

I'm trying to understand how Jitter and Delta are determined when analyzing a single outbound stream, and more specifically how that math is worked out. The RTP header timestamp increments consistently by 160 with no variance, so I'm assuming the RTP header timestamp isn't being used in the calculations. I'm also having a hard time figuring out how jitter can even be measured from a Wireshark capture of an outbound stream alone. I'm probably looking at it completely wrong. Any guidance would be appreciated.
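
For illustration, here is a minimal sketch of how I assume Delta and jitter could be derived from just the capture timestamps and the RTP timestamps, following the RFC 3550 interarrival jitter formula (which, as far as I understand, is what Wireshark's RTP analysis is based on). The variable names and the 8 kHz clock rate are my assumptions, not anything taken from Wireshark itself:

    # Sketch of Delta and RFC 3550 interarrival jitter computed from
    # one-sided capture data. Input names (capture_times, rtp_timestamps)
    # are hypothetical; clock_rate=8000 assumes a narrowband codec where
    # the RTP timestamp advances 160 ticks per 20 ms packet.

    def rtp_stream_stats(capture_times, rtp_timestamps, clock_rate=8000):
        """Return per-packet (delta_ms, jitter_ms) lists for one RTP stream.

        capture_times  -- packet arrival times from the capture, in seconds
        rtp_timestamps -- RTP header timestamps, in clock ticks
        """
        deltas, jitters = [0.0], [0.0]
        jitter = 0.0
        for i in range(1, len(capture_times)):
            # Delta: gap between consecutive capture (arrival) timestamps.
            delta = capture_times[i] - capture_times[i - 1]
            deltas.append(delta * 1000.0)

            # D(i-1, i): change in relative transit time. The RTP timestamp
            # is converted to seconds, so a perfectly paced sender cancels
            # out and only arrival-time variation remains.
            d = (capture_times[i] - capture_times[i - 1]) - \
                (rtp_timestamps[i] - rtp_timestamps[i - 1]) / clock_rate

            # RFC 3550, section 6.4.1: running jitter estimate, gain 1/16.
            jitter += (abs(d) - jitter) / 16.0
            jitters.append(jitter * 1000.0)
        return deltas, jitters


    # Example: packets captured roughly every 20 ms with slight variation,
    # RTP timestamp advancing exactly 160 ticks per packet.
    caps = [0.0000, 0.0201, 0.0405, 0.0598, 0.0802]
    ts = [0, 160, 320, 480, 640]
    print(rtp_stream_stats(caps, ts))

If that reading is right, then for an outbound stream the "arrival" time is really just the capture timestamp at the sniffing point, and all the jitter Wireshark reports comes from variation in those capture times rather than from the RTP header timestamps, since the latter increment in perfectly even steps.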