
Is Jitter Good or Bad for Your Device Testing?

2019-11-12  |  7 min read 


High-speed data throughput is essential in digital communications, and high-bandwidth electronics require high-frequency clock signals. For a system to function correctly, signal timing must be accurate: each successive rising and falling edge of the clock must occur at the proper time in each cycle. Errors in this timing are known as “jitter.” Jitter usually refers to the difference between the measured position of a clock’s rising or falling edge and its ideal position in time at a given frequency.

Jitter is an undesired distortion of a signal. It occurs in the digital realm: rather than being an unwanted variation in the amplitude of an analog signal, it pertains to the timing of digital pulses.

The real world is far from ideal. Designers must overcome a variety of factors that degrade transmitted signal quality and must identify the sources of jitter. Wherever jitter is present, the data stream arriving at the receiver may carry inaccurate information or bit errors. Severe jitter can cause outright system failure.

Jitter is the deviation from the true periodicity of a periodic signal. It is specified in the time domain as period jitter and in the frequency domain as phase noise.

  • Time domain as period jitter – Period jitter is the deviation in cycle time of a clock signal with respect to the ideal period over a number of randomly selected cycles (see Figure 1).

   Figure 1. Jitter in the time domain

  • Frequency domain as phase noise – In the frequency domain, phase noise measurements examine the spectrum of sideband noise frequencies in a clock signal. To keep things simple, assume that the ideal clock is a perfect sine wave with a frequency Fc. Such a pure clock will have all its power concentrated at Fc (see Figure 2). Phase noise spreads the power into sidebands, causing slight variations in frequency, so that instead of always producing a pure clock signal at Fc, the signal is sometimes a bit faster or slower. These small changes in clock speed translate into jitter in the time domain.

  Figure 2. Jitter in the frequency domain — phase noise
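The time-domain definition above can be made concrete with a short sketch. The numbers here are hypothetical (an assumed 100 MHz clock with 2 ps of RMS edge noise, not a measurement from any instrument); the point is only how period jitter is computed from measured edge timestamps:

```python
import numpy as np

rng = np.random.default_rng(0)

f_ideal = 100e6            # assumed 100 MHz clock
t_ideal = 1.0 / f_ideal    # ideal period: 10 ns

# Simulate measured rising-edge timestamps with Gaussian timing noise
n_cycles = 10_000
sigma = 2e-12              # assumed 2 ps RMS edge noise
edges = np.arange(n_cycles + 1) * t_ideal + rng.normal(0.0, sigma, n_cycles + 1)

# Period jitter: deviation of each measured cycle time from the ideal period
periods = np.diff(edges)
period_jitter = periods - t_ideal

rms_jitter = np.sqrt(np.mean(period_jitter ** 2))
pp_jitter = period_jitter.max() - period_jitter.min()

print(f"RMS period jitter: {rms_jitter * 1e12:.2f} ps")
print(f"P-P period jitter: {pp_jitter * 1e12:.2f} ps")
```

Note that because each period is the difference of two noisy edges, the RMS period jitter comes out roughly √2 times the per-edge noise.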

Low Jitter / Ideal Source for Testing

Designers need to test the limits of their designs to ensure that their products achieve maximum performance. This requires generating a high-quality signal that is known to be clean, stable, and reliable. The lower the jitter, the more stable the signal.

A low-jitter source lets engineers generate exactly the signals they need. Better jitter performance lets them place edges more accurately, reducing timing errors in their circuit designs. Low jitter yields more accurate results, improves signal quality, and allows better characterization of a product.

A low-jitter source provides accurate results for a device under test (DUT). Measurement accuracy is critical to both technical and business success: accurate testing confirms that your product meets its true quality and performance goals, minimizes risk, and maximizes business returns. That risk can mean a lost market window, product recalls, a damaged reputation, and missed earnings. Test equipment with low jitter helps ensure that your product meets its design specifications.

In networking, jitter is the variation in packet inter-arrival time, and it directly affects voice quality. Protocols in the TCP/IP stack and application-level buffering must compensate for its impact on communication. Too much jitter is a problem for voice traffic in a voice-over-IP (VoIP) network. When VoIP packets are sent at a regular interval, they can get delayed somewhere in the network and miss their expected arrival time. Less commonly, packets may take different routes or be load-balanced across two similar paths, one of which is momentarily congested. That is the jitter phenomenon: an anomaly in tempo, where a packet you expect takes longer than usual to arrive. Heavy jitter in a network increases the number of communication delays.
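Packet inter-arrival jitter is commonly estimated with the exponentially smoothed filter described in RFC 3550 (the RTP specification). The sketch below uses made-up timing numbers (20 ms packet spacing, 30 ms base delay with up to 4 ms of random queuing delay) purely to illustrate the calculation:

```python
import random

def interarrival_jitter(send_times, recv_times):
    """Smoothed inter-arrival jitter estimate in seconds (RFC 3550 style)."""
    j = 0.0
    prev_transit = None
    for s, r in zip(send_times, recv_times):
        transit = r - s                 # one-way transit time of this packet
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            j += (d - j) / 16.0         # smoothing gain of 1/16 per RFC 3550
        prev_transit = transit
    return j

# Hypothetical VoIP stream: packets every 20 ms, 30 ms delay + random queuing
random.seed(42)
sends = [i * 0.020 for i in range(500)]
recvs = [t + 0.030 + random.uniform(0.0, 0.004) for t in sends]

j = interarrival_jitter(sends, recvs)
print(f"inter-arrival jitter: {j * 1e3:.2f} ms")
```

A jitter buffer at the receiver would typically be sized as some multiple of this estimate.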

High Jitter / Non-ideal Source for Testing

When you need to understand the limits of your design, the goal is not to simulate an ideal signal but to simulate a signal with quantified, non-ideal characteristics.

To test devices under non-ideal conditions, designers should deliberately add jitter, that is, add real-world, non-ideal characteristics to the signal. You may also need to add noise to your test signals to evaluate the robustness or immunity of your DUT.
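Injecting controlled jitter can be sketched numerically as well. The example below (all values hypothetical) perturbs the edge times of an ideal 10 MHz clock with two common impairment types, random Gaussian jitter and a deterministic sinusoidal component, the kind of stress signal you might emulate with a generator's modulation features:

```python
import numpy as np

rng = np.random.default_rng(1)

f_clk = 10e6                    # assumed 10 MHz clock under test
t_period = 1.0 / f_clk
n_edges = 1000

ideal_edges = np.arange(n_edges) * t_period

# Inject controlled jitter: random (Gaussian) plus deterministic sinusoidal
rj_rms = 5e-12                  # 5 ps RMS random jitter (assumed)
sj_amp = 20e-12                 # 20 ps peak sinusoidal jitter (assumed)
sj_freq = 100e3                 # 100 kHz jitter modulation frequency
jitter = (rng.normal(0.0, rj_rms, n_edges)
          + sj_amp * np.sin(2 * np.pi * sj_freq * ideal_edges))
stressed_edges = ideal_edges + jitter

total_rms = np.std(stressed_edges - ideal_edges)
print(f"total injected jitter: {total_rms * 1e12:.1f} ps RMS")
```

Because the two components are independent, the total RMS is close to the root-sum-square of the random term and the sinusoid's RMS (its peak divided by √2).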


A function generator that produces very low-jitter signals and can also generate controlled, high-jitter signals provides the best testing solution for your DUT. You need a source that produces a clean, low-distortion, stable, and reliable signal to characterize your product at its best performance, and you also need to inject high-jitter signals to test the product's robustness.

Keysight's function generators with exclusive Trueform technology give you the confidence to produce the waveforms you need with best-in-class signal fidelity. The Keysight Trueform 33600A Series waveform generators offer jitter of less than 1 ps. Such a stable signal source provides tremendous benefits when characterizing your product.
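Picosecond-level jitter figures like this are often derived by integrating a phase-noise plot, tying together the two jitter views described earlier. The sketch below uses an invented single-sideband phase-noise profile (not the 33600A's published specification) to show the standard conversion, RMS jitter = √(2·∫10^(L(f)/10) df) / (2π·fc):

```python
import numpy as np

fc = 100e6  # assumed 100 MHz carrier (clock) frequency

# Hypothetical SSB phase-noise points: (offset frequency in Hz, L(f) in dBc/Hz)
offsets = np.array([1e3, 10e3, 100e3, 1e6, 10e6])
l_dbc = np.array([-110.0, -120.0, -130.0, -140.0, -150.0])

# Convert dBc/Hz to linear power and integrate over the offset band
# (trapezoidal rule on the linear values)
power = 10.0 ** (l_dbc / 10.0)
integrated = float(np.sum((power[1:] + power[:-1]) / 2.0 * np.diff(offsets)))

# Integrated phase noise (rad^2) -> RMS jitter in seconds
jitter_rms = np.sqrt(2.0 * integrated) / (2.0 * np.pi * fc)
print(f"RMS jitter: {jitter_rms * 1e15:.0f} fs")
```

For this made-up profile the result lands around 1 ps; a real calculation would use the instrument's measured phase-noise curve and the integration band relevant to the application.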

To learn more about Keysight’s 33600A Series waveform generators, please go to

To learn more about jitter, please see “IQ Signal Generation Made Easy”: