I am currently gearing up for a 100+ MHz digital design that will operate at
3.3 V. The 3.3 V part is new to me. Does going from 5 V to 3.3 V significantly
change a high-speed design approach? I'm thinking along the lines of
increased risk, schedule impact, cost, etc., which would be directly affected
by the technical differences between a 3.3 V design and a 5 V design. Also,
can anyone point me to some good reading material or web sites?
As usual, your responses are both valued and appreciated.
Honeywell, Space Systems Division, M/S 934-5
13350 US 19 N., Clearwater, FL, 34624