"A Single Event Upset (SEU) is a random failure mechanism
characteristic of bombardment of single transistor memory
elements by natural radiation." Think of it as radiation randomly
causing a bit to flip. While this is not usually a problem for
ground-based systems, it can be a concern for airborne equipment,
since electronics at altitude have less protection from ionizing
particles, partly because the atmosphere above them is thinner.
Devices of most concern are memory elements, such as
static RAMs, FIFOs, or dual-port memories. Other components
of concern are programmable logic devices, such
as SRAM-based FPGAs.
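To make the "bit flip" picture concrete, here is a small illustrative sketch (my own, not from the original discussion): it simulates an SEU by XORing a stored word with a one-bit mask, and shows that a simple even-parity check, recorded at write time, catches any single-bit upset. The function names and the 8-bit word width are assumptions for the example.

```python
import random

WIDTH = 8  # assumed word width for the example

def flip_random_bit(word):
    """Simulate an SEU: XOR the word with a mask that has one random bit set."""
    bit = random.randrange(WIDTH)
    return word ^ (1 << bit)

def parity(word):
    """Even parity over the low WIDTH bits of the word."""
    return bin(word & ((1 << WIDTH) - 1)).count("1") % 2

stored = 0b1011_0010
p = parity(stored)            # parity recorded when the word was written

upset = flip_random_bit(stored)
# A single-bit flip always changes even parity, so the upset is detected.
assert parity(upset) != p
```

Parity only detects an odd number of flipped bits; real SEU-hardened memories typically use SECDED (single-error-correct, double-error-detect) codes instead, but the detection principle is the same.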
I realize this discussion is not relevant to Signal Integrity, but
since the question was asked as to what concerns there are in using
3.3V vs 5.0V, I thought I'd bring up this issue.
In answer to the question I asked, a colleague contributed this response:
"I don't think the interface logic has anything to do with SEU. First
of all, the geometry (which plays a huge factor in the susceptibility of
SEU) has no bearing on the interface logic (5V vs. 3.3V). For a given
geometry, the on-chip voltage rail is fixed. Shrinking the geometry can
therefore be a double whammy: once you shrink past the threshold where
the core rail voltage also drops, both the smaller geometry and the
lower rail (which reduces the charge needed to upset a node) increase
susceptibility.
Geometry, doping, architecture (i.e., layout), and
gate oxides are probably the most important factors. The voltage
rails on the interface logic have, as far as I can tell, no bearing
on SEU susceptibility."