Jitter is a form of distortion in digital signals that appears as variation in the timing (phase) of the signal as it crosses a transmission medium. Jitter can be thought of as the standard deviation of latency. If latency is constant, no buffering beyond the initial startup is needed for voice or video streams. If jitter is present, buffers must be sized to accommodate the greatest delay, and the initial startup delay must be long enough to cover the worst-case latency. See latency.
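
The relationship between latency samples, jitter, and buffer depth can be illustrated with a minimal sketch. This Python example is illustrative, not a standard implementation: it assumes one-way latency samples in milliseconds, treats jitter as the sample standard deviation of latency, and sizes the playout buffer to absorb the spread between the fastest and slowest observed packets. The function name jitter_stats and the sample values are hypothetical.

    import statistics

    def jitter_stats(latencies_ms):
        """Summarize one-way latency samples (in milliseconds)."""
        mean_ms = statistics.mean(latencies_ms)
        # Jitter taken as the standard deviation of latency,
        # per the definition above.
        jitter_ms = statistics.stdev(latencies_ms)
        # A playout buffer deep enough to hold early packets
        # until the slowest one arrives: max delay minus min delay.
        buffer_ms = max(latencies_ms) - min(latencies_ms)
        return {"mean_ms": mean_ms,
                "jitter_ms": jitter_ms,
                "buffer_ms": buffer_ms}

    # Hypothetical latency samples from a voice stream.
    samples = [40.0, 42.5, 39.8, 55.2, 41.1, 47.6]
    print(jitter_stats(samples))

With constant latency the buffer_ms figure collapses to zero, matching the observation above that a jitter-free stream needs no buffering beyond startup.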