The Effects Of Dispersion On Transmission Line Performance

Transmission line dispersion effects can be significant for telecom systems, military radar receivers with wide instantaneous bandwidth, ultra-wideband wireless systems and millimeter-wave communication systems. Dispersion degrades system-level performance: it can cause loss of signal integrity (poor eye performance), reduced bandwidth and degraded pulse response.

The effects of dispersion can be seen by considering the line impedance of a microstrip line. As frequency increases, the line impedance increases. Figure 1 shows that for a 25 mil thick alumina substrate, the line impedance of a 25 mil wide line increases from 48 ohms to 73 ohms at 60 GHz.


Figure 1. Line impedance as a function of frequency with Er=9.8, h=0.025″, W=0.025″
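The frequency dependence behind Figure 1 can be sketched with a closed-form microstrip dispersion model. The article does not name a particular model, so as an illustrative assumption the sketch below uses Getsinger's dispersion formula together with Hammerstad's quasi-static effective dielectric constant; the geometry matches Figure 1 (25 mil alumina, 25 mil line).

```python
import math

def eps_eff_static(eps_r: float, h: float, w: float) -> float:
    """Hammerstad quasi-static effective dielectric constant (valid for W/h >= 1)."""
    return (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 * h / w)

def eps_eff(f_hz: float, eps_r: float, h_m: float, w_m: float, z0: float) -> float:
    """Getsinger dispersion model: eps_eff climbs from its static value
    toward the bulk eps_r as frequency increases (i.e. the line disperses)."""
    mu0 = 4e-7 * math.pi
    fp = z0 / (2 * mu0 * h_m)       # Getsinger's characteristic frequency
    g = 0.6 + 0.009 * z0            # empirical fitting factor
    e0 = eps_eff_static(eps_r, h_m, w_m)
    return eps_r - (eps_r - e0) / (1 + g * (f_hz / fp) ** 2)

# 25 mil (0.635 mm) alumina, 25 mil line width, as in Figure 1
h = w = 0.635e-3
for f_ghz in (1, 20, 40, 60):
    print(f"{f_ghz:>2} GHz: eps_eff = {eps_eff(f_ghz * 1e9, 9.8, h, w, 50.0):.2f}")
```

The rising effective dielectric constant is the root cause of the impedance and phase variation shown in Figure 1.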

The effects of dispersion can also be demonstrated by considering signal integrity, which is important in the design of almost any high-speed system. This is especially true for telecom and datacom systems, which may be operating at many gigabits per second. Figure 2 below shows the effects of dispersion on the eye quality of a 10 Gb/s signal by comparing a 12.7mm (0.500 inch) length of coaxial line and microstrip line. It can be seen that the coaxial line has essentially no effect on eye quality. The microstrip line shows overshoot and the eye closes early. In addition, the rise and fall times have been degraded.


Figure 2. The effects of dispersion on eye quality can be seen by comparing 12.7mm (0.5″) length of (a) coaxial transmission line and (b) 0.75mm thick alumina.

Now consider the case of a 0.25mm thick alumina substrate. It has significantly less dispersion and, as can be seen in Figure 3, the eye performance is commensurately improved. The improvement is due to the use of a thinner substrate. A thinner substrate has less dispersion, which is the same as saying that its phase is more linear (flatter group delay).


Figure 3. The effect of dispersion on eye quality for a 10 Gb/s signal on a 0.25mm thick alumina substrate.

It has been shown that transmission line dispersion can have a negative effect on eye quality. A few general design considerations for reducing dispersion in microstrip are given below as a guide to the designer.

Substrate Thickness Effects: The substrate thickness, h, affects dispersion. A thicker substrate causes more dispersion. Intuitively, this follows from the fact that the inhomogeneity of the transmission line is increased with more dielectric. A more rigorous approach is to compare the substrate thickness to the wavelength in the material. As the ratio of substrate thickness to wavelength in the material is reduced, the dispersion is also reduced. Therefore, as frequency is increased, the substrate thickness should be decreased to maintain a required level of linear phase (flat group delay response).

Substrate Dielectric Constant Effects: The substrate dielectric constant also affects dispersion and eye quality. For a given substrate thickness, a lower dielectric constant has less effect on eye performance. Again, this makes sense from the consideration of the inhomogeneous transmission line: the lower the dielectric constant, the closer it is to air, which is the homogeneous (no dispersion) case. Also, a lower dielectric constant reduces the ratio of substrate thickness to wavelength.

Transmission Line Length: The effect of a dispersive transmission line can be reduced if the lengths of dispersive lines are minimized. This may seem obvious. However, the effect of long, dispersive transmission lines is often missed by circuit and board designers.
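The length dependence can be made concrete with a quick delay-skew estimate: the difference in propagation delay between two band-edge frequencies grows linearly with line length. The effective dielectric constants used below (6.6 at the low band edge, 7.4 at the high) are assumed illustrative values, not measured data from the article.

```python
import math

C0 = 299_792_458.0  # free-space velocity, m/s

def delay_skew_ps(length_m: float, eps_lo: float, eps_hi: float) -> float:
    """Difference in propagation delay (in ps) between the band-edge
    frequencies of a dispersive line, given the effective dielectric
    constants at the two band edges. Scales linearly with length."""
    return (math.sqrt(eps_hi) - math.sqrt(eps_lo)) * length_m / C0 * 1e12

# Doubling the line length doubles the accumulated skew
for length_mm in (12.7, 25.4, 50.8):
    print(f"{length_mm} mm: {delay_skew_ps(length_mm * 1e-3, 6.6, 7.4):.1f} ps")
```

For a 10 Gb/s signal (100 ps bit period), a skew of several picoseconds on even a half-inch line is already a visible fraction of the eye, which is why keeping dispersive lines short matters.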

A fairly conservative design goal is to have the substrate thickness less than 5% of a wavelength in the material. The equation below gives the relationship.

h / λ < 0.05

This equation leads to the relationship for choosing the substrate thickness.

h < 0.05 · v0 / (f · √εr)

Where:
λ = wavelength in the dielectric material
v0 = velocity (free space)
f = frequency
εr = relative dielectric constant of the substrate
h = substrate height

For example, consider a 10 Gb/s signal. As a general rule, the transmission lines should perform well to a frequency that is at least twice the fundamental frequency of an RZ (Return to Zero) signal. For 10 Gb/s, this translates into a transmission line bandwidth of not less than 20 GHz. For high purity alumina (εr = 9.9), the substrate thickness using the above equation is 0.25mm.
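The worked example above can be checked directly. This sketch simply evaluates the 5%-of-a-wavelength rule, h < 0.05 · v0 / (f · √εr), for the stated 20 GHz bandwidth and alumina dielectric constant.

```python
import math

def max_substrate_thickness_mm(f_ghz: float, eps_r: float,
                               ratio: float = 0.05) -> float:
    """Maximum substrate thickness (mm) from the conservative rule
    h < ratio * (wavelength in the dielectric) = ratio * v0 / (f * sqrt(eps_r))."""
    v0 = 299_792_458.0  # free-space velocity, m/s
    return ratio * v0 / (f_ghz * 1e9 * math.sqrt(eps_r)) * 1e3

# 10 Gb/s -> at least 20 GHz of bandwidth; high-purity alumina, eps_r = 9.9
h = max_substrate_thickness_mm(20, 9.9)
print(f"max substrate thickness = {h:.2f} mm")  # ~0.24 mm, consistent with the 0.25 mm figure
```

Relaxing the ratio toward 0.15 or 0.2, as the next paragraph discusses, can be explored by changing the `ratio` argument.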

The equation is fairly conservative, and it is possible to achieve successful products with a ratio of h/λ as high as 0.15 or 0.2. However, the challenge will then be to keep the dispersive line lengths short.

Dispersion has been examined. Its effects on transmission lines and signal integrity have been shown. In addition, some general guidelines have been given for the proper design of transmission lines to minimize dispersion.
