[SI-LIST] : Output Driver Impedance Variation?

Bob Perlman ([email protected])
Tue, 21 Apr 1998 17:08:57 -0700

Hi -

Here's a question for all you CMOS process and output driver gurus. In
your experience, how much variation is there between best-case and
worst-case output driver impedance in a given state over process,
voltage, and temperature? If, for example, we have a totem-pole output
driver in a 0.35um CMOS process, and if the worst-case driver's output
impedance in the LOW state is 50 ohms, what's the best-case driver's
impedance? (By output impedance, I'm referring to the deltaV/deltaI
ratio in the region where the output transistor looks resistive, and not
where it's more like a current source.)

I've seen claims of worst-case/best-case ratios of anywhere from 1.5
(e.g., 50 ohms/33 ohms) to 3.5 (e.g., 50 ohms/14 ohms). Is there a
generic answer to this question that's at all useful, or is the answer
too heavily dependent on the process and implementation?
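Just to spell out what those claimed ratios imply, using the 50-ohm
worst-case figure from my example above:

    # Implied best-case impedance for each claimed worst/best ratio.
    z_worst = 50.0                 # ohms, worst-case driver, LOW state
    for ratio in (1.5, 3.5):       # claimed worst-case/best-case ratios
        print("ratio %.1f: best-case ~%.0f ohms" % (ratio, z_worst / ratio))
    # ratio 1.5: best-case ~33 ohms
    # ratio 3.5: best-case ~14 ohms

That's a big spread, which is exactly why I'm asking.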

Thanks,
Bob Perlman
Acuson Corporation