> How much drive strength variation can one expect between best-case and
> worst-case? Since no-one else said anything, I'll take a stab at it.
> I have heard that semi. vendors prefer a spec with no less than about a
> 2.5:1 ratio.
Since I have been designing I/O drivers for automotive products over the past
several years, I may be able to help here. When designing for lower EMI I
try to maximize the rise and fall times while still meeting the functional
spec at max frequency. This implies that one must design for WCS and then
check the BCS results. Now, WCS can differ from design to design depending
on the application. In automotive the environments are very harsh, so our
temperature ranges from -40 deg C to +150 deg C. The power supply varies by
~10%; however, some applications require a 5 volt output and others require
a 3.3 volt output. Unless the frequency at which the output driver must
operate is below about 10 MHz, the output driver strength is usually set by
the rise and fall time requirements. If I look at the rise and fall time
differences over temperature, power supply, and process in the automotive
environment, my WCS to BCS ratio could range from 2.5:1 to 3.5:1. On the
other hand, the WCS to BCS ratio for Ioh and Iol over temperature, power
supply, and process could range from 3:1 to 4.5:1.
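The spread described above is just the ratio of corner simulation results. A
minimal sketch of how such a ratio falls out of corner data; all the numbers
below are hypothetical, chosen only to land inside the ranges quoted above,
not taken from any real design:

```python
# Hypothetical corner results for an automotive output driver.
# BCS corner: fast process, -40 deg C, supply +10%
# WCS corner: slow process, +150 deg C, supply -10%

# Rise times (ns) at each corner -- illustrative values only.
rise_time_bcs_ns = 2.0
rise_time_wcs_ns = 6.0

# Drive current magnitudes (Ioh, mA) at each corner -- also illustrative.
ioh_bcs_ma = 18.0
ioh_wcs_ma = 4.5

# Edge-rate spread: slowest rise time over fastest rise time.
edge_ratio = rise_time_wcs_ns / rise_time_bcs_ns

# Drive-strength spread: strongest drive current over weakest.
drive_ratio = ioh_bcs_ma / ioh_wcs_ma

print(f"rise/fall spread (WCS:BCS): {edge_ratio:.1f}:1")
print(f"Ioh spread (BCS:WCS): {drive_ratio:.1f}:1")
```

Note the two ratios are inverted relative to each other: the worst case has
the slowest edges but the weakest drive, which is why both spreads are
quoted as "big number first."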
It is true that process changes during the lifetime of the product can
affect the strength of the output drivers. These changes in process usually
shift the WCS and BCS results to meet a higher frequency of operation.
Depending on the type of process change, the WCS:BCS ratio itself may or may
not be affected; but if you tracked that ratio over the lifetime of the
product, you would most likely see it change.