RE: [SI-LIST] : Zo Choices & Why

From: Ingraham, Andrew ([email protected])
Date: Tue Jan 02 2001 - 13:20:35 PST


> My questions revolve around why, for instance:
>
> (1) 50-ohms is the choice for instruments?
>
> (2) 75-ohms is the choice for CATV/Satellite?
>
> (3) 100-ohms is the choice for Ethernet?
>
> (4) 28-ohms is the choice for Rambus?
>
> (5) Any other "standards" I've left out? (300-ohms for VHF, UHF?)
>
>
> I've read various things in the past explaining this, but I have lost the
> link and cannot find the original posts describing this.
 
In the distant past, my link was to the Trompeter hardcopy catalog, which
included a few pages in the front with basic technical info. They now have
it on-line as an application note.

A lot of coax cable development work went on in the years before WWII. The
original reasons for 50, 75, and 93 ohms are rooted in coax development and
radio antennas. Since we continue to use some of these standards today, you
will find various other, more modern considerations that have now become
part of the folklore, even though they may have had nothing to do with those
original choices.

> I recall it being related to "What is the goal of the particular signal I
> wish to transmit?" For CATV, it is picture quality over distance and I
> think 75-ohms had the least amount of attenuation for the given
> cable. 50-ohms for instruments was deemed "quietest" or something like
> that.
 
According to the Trompeter note, 75 ohms was adopted as the most efficient
impedance, when considering the voltages, currents, and powers to be
transmitted. That probably means a compromise between high voltage
(dielectric breakdown) and high current (I*R losses) using common materials.
The telco and television industries standardized on 75 ohms.
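
For reference (this is just the standard formula, not anything from the
Trompeter note), the impedance of a coax follows directly from its geometry
and dielectric; a quick Python sketch with illustrative, assumed dimensions:

    # Characteristic impedance of a coaxial line:
    #   Z0 = (59.96 / sqrt(er)) * ln(D/d)
    # D = inner diameter of the shield, d = diameter of the center
    # conductor, er = relative permittivity of the dielectric.
    import math

    def coax_z0(D, d, er=1.0):
        return (59.96 / math.sqrt(er)) * math.log(D / d)

    # Illustrative solid-polyethylene cable (dimensions in mm, assumed):
    print(round(coax_z0(D=3.7, d=0.58, er=2.3), 1))   # ~73 ohms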

50 ohms was adopted because it is the feed point impedance of a quarter-wave
vertical antenna over a ground plane, and much of the early electronics work
was in radio. The U.S. military standardized on 50 (or 52) ohms.

93 ohm cable was chosen for early instrumentation purposes because of its
lower capacitance.
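
That lower capacitance goes hand in hand with the higher impedance: for a
matched line, capacitance per unit length is 1/(velocity * Z0). A rough check
in Python (the dielectric constants are assumed, not taken from a data sheet):

    # Capacitance per unit length of a transmission line: C = 1 / (v * Z0),
    # where v = c / sqrt(er_eff) is the propagation velocity.
    C_LIGHT = 3.0e8                         # m/s

    def pf_per_foot(z0, er_eff):
        v = C_LIGHT / er_eff ** 0.5
        return 1.0 / (v * z0) * 0.3048 * 1e12

    print(round(pf_per_foot(50, 2.3), 1))   # solid-PE 50-ohm coax:  ~31 pF/ft
    print(round(pf_per_foot(93, 1.5), 1))   # semi-air 93-ohm coax:  ~13 pF/ft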

I have also heard it said that 75 (or 73) ohms is the impedance at the
center of a thin half-wave dipole in free space. But I think this would not
have been important until the radio industry pushed toward UHF and
microwaves, well after 50 and 75 ohms had already been adopted as standards.

And I have seen arguments saying that 75 ohms is the optimum cable low-loss
point, or 50 ohms is optimal for maximum power handling, or vice-versa. If
you look around, you can probably find half a dozen conflicting reasons.
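
For what it's worth, the classic air-dielectric coax numbers behind those
arguments are easy to reproduce; here is a brute-force Python sketch (air
dielectric and a fixed shield diameter assumed):

    # Classic air-line coax optima, found numerically over the diameter
    # ratio x = D/d with the shield diameter D held fixed.
    import math

    def z0(x):            # air dielectric: Z0 = 60 * ln(D/d)
        return 60.0 * math.log(x)

    def loss(x):          # conductor loss ~ (1/d + 1/D) / Z0 ~ (x + 1) / ln(x)
        return (x + 1.0) / math.log(x)

    def power(x):         # breakdown-limited power ~ V^2 / Z0 ~ ln(x) / x^2
        return math.log(x) / x ** 2

    def voltage(x):       # breakdown-limited voltage ~ d * ln(D/d) ~ ln(x) / x
        return math.log(x) / x

    xs = [1.01 + 0.001 * i for i in range(10000)]
    print(round(z0(min(xs, key=loss)), 1))      # ~76.7 ohms: minimum attenuation
    print(round(z0(max(xs, key=power)), 1))     # ~30.0 ohms: maximum power handling
    print(round(z0(max(xs, key=voltage)), 1))   # ~60.0 ohms: maximum voltage withstand

One common story is that 50 ohms was a round-number compromise between the
~77-ohm loss optimum and the ~30-ohm power optimum, but that is just one more
entry in the folklore.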

A folded dipole antenna has a feed point impedance of 4 * 75 = 300 ohms,
which may be why TV twin-lead uses 300 ohms. Also, 300 ohm twin-lead is
cheap and low-loss (until it gets wet).

A 50 or 75 ohm twisted pair is somewhat impractical, so 100 ohms works out
better, which may be why Ethernet uses it for UTP. Also, you can terminate
a 100 ohm balanced pair into two 50 ohm loads.
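
A quick way to see the "impractical" part is the parallel-wire formula,
Z0 = (120 / sqrt(er)) * acosh(s/d); here is a Python sketch with an assumed
effective dielectric constant of about 2 for insulated pairs:

    # Spacing-to-diameter ratio s/d needed for a given differential impedance,
    # from Z0 = (120 / sqrt(er_eff)) * acosh(s / d) for a parallel-wire pair.
    import math

    def spacing_ratio(z0, er_eff=2.0):       # er_eff ~ 2 is an assumption
        return math.cosh(z0 * math.sqrt(er_eff) / 120.0)

    for z in (50, 75, 100):
        print(z, round(spacing_ratio(z), 2))
    # 50  -> s/d ~ 1.18  (conductors nearly touching; no room for insulation)
    # 75  -> s/d ~ 1.42
    # 100 -> s/d ~ 1.78  (easy with ordinary insulated wire)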

For short, heavily loaded buses, like Rambus, the lower the trace impedance,
the better: it reduces the impedance discontinuity caused by each device
hung on the bus. Whatever impedance they chose may have been a compromise
between achieving acceptable switching rates, practical drive currents, and
routing density (trace widths).
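
The loaded-line formula, Z_loaded = Z0 / sqrt(1 + Cload/Ctrace), makes the
point; a rough Python sketch with made-up (not Rambus) numbers:

    # Effective impedance of a bus trace loaded with device capacitance:
    #   Z_loaded = Z0 / sqrt(1 + Cload / Ctrace)
    # Lower-impedance traces have more intrinsic capacitance per inch
    # (C = tpd / Z0), so the same device loading drags them down less.
    import math

    TPD = 180e-12                     # ~180 ps/inch, FR-4 stripline (assumed)

    def loaded_z(z0, length_in, n_devices, c_dev):
        c_trace = (TPD / z0) * length_in
        c_load = n_devices * c_dev
        return z0 / math.sqrt(1.0 + c_load / c_trace)

    for z0 in (28, 50, 75):           # illustrative bus: 4 inches, 16 loads of 2 pF
        print(z0, "->", round(loaded_z(z0, 4.0, 16, 2e-12), 1))
    # 28 -> ~18.7 ohms  (about a 33% drop)
    # 50 -> ~27.9 ohms  (about a 44% drop)
    # 75 -> ~36.0 ohms  (about a 52% drop)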


