Yes, I recall hearing various things about radiation effects on semiconductors over the years, but the one I mostly think about these days is simple "bit flips": either cosmic rays, or natural radioactive isotopes inside the materials of the computer itself, occasionally produce a decay energetic enough to flip a bit.
I recall hearing that the ceramic packages formerly common on many chips (memories, CPUs, ...) were considerably more problematic in that respect (alpha emission from trace radioactive impurities in the ceramic) than the modern epoxy/plastic encapsulants are.
But anyway, unless the radiation actually reprograms a non-volatile / OTP chip, the result is a soft error that a reboot or ECC can fix.
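That kind of single-bit soft error is exactly what ECC is built to absorb. As a minimal sketch of the idea, here's a toy Hamming(7,4) code in Python; real ECC DIMMs use wider SECDED codes over 64-bit words, but the syndrome mechanism is the same:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword (positions 1..7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute parity; a nonzero syndrome is the 1-based position of a single flipped bit."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4
    if syndrome:
        c[syndrome - 1] ^= 1   # flip the offending bit back
    return c

cw = hamming74_encode([1, 0, 1, 1])
cw[4] ^= 1                     # simulate a radiation-induced bit flip
assert hamming74_correct(cw) == hamming74_encode([1, 0, 1, 1])
```

Note the limit this illustrates: a single flip is silently corrected, but two flips in one word defeat a single-error-correcting code, which is why scrubbing (periodically rewriting corrected data) matters in long-running ECC systems.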
Old UV-EPROMs were erased by a dose of UV light shining through a quartz window in the package so they could be reprogrammed;
sunlight would eventually do the same thing after a few weeks, and X-rays would as well. Those cells were basically
floating-gate MOS transistors: a "high" programming voltage trapped charge on the gate by avalanche injection, and that stored
charge would normally stay put for many years unless the part was optically erased.
So any NV/OTP chip of that nature (flash, EPROM, OTP PROM) is genuinely vulnerable to "catastrophic" bit flipping, whether from radiation or just slow charge leakage over years, accelerated by temperature.
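The temperature dependence is commonly modeled with Arrhenius scaling, which is how reliability standards extrapolate retention ratings. A rough sketch; the rating, activation energy, and temperatures below are placeholder assumptions, not datasheet values for any real part:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def retention_scale(t_ref_years, ea_ev, temp_ref_c, temp_c):
    """Scale a retention life rated at temp_ref_c to temp_c, assuming a
    single-mechanism Arrhenius model with activation energy ea_ev."""
    t_ref = temp_ref_c + 273.15
    t = temp_c + 273.15
    return t_ref_years * math.exp(ea_ev / K_B * (1.0 / t - 1.0 / t_ref))

# A hypothetical part rated for 20 years at 55 C, assuming Ea = 0.6 eV:
print(round(retention_scale(20, 0.6, 55, 85), 1))  # roughly 3.4 years at 85 C
```

The point of the sketch is just the shape of the curve: a modest rise in operating temperature eats retention life exponentially, which fits the pattern of old gear dying faster in hot enclosures.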
That is actually something that has often caused the "death" of old electronics on decade time scales:
the EPROM / OTP / flash that holds critical program code, calibration, or configuration data is fine for a long time, until
the bits flip and corrupt it. At that point there's no fix unless one has a backup of the correct data and the means to
reprogram or replace the chip with a correctly programmed one, and that's often impractical, since such chips have frequently been discontinued and unobtainable for 10+ years before the problem even arises.
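Which is an argument for dumping such ROMs while the hardware is still healthy and recording a checksum alongside the image. A minimal sketch of that verification step; the function name and the stand-in image below are illustrative, not from any real tool:

```python
import zlib

def verify_firmware(image: bytes, expected_crc: int) -> bool:
    """Check a dumped ROM image against a CRC-32 recorded when the part
    was known-good. CRC-32 is plenty to detect the odd flipped bit."""
    return (zlib.crc32(image) & 0xFFFFFFFF) == expected_crc

good = bytes(range(256)) * 16          # stand-in for a 4 KiB ROM dump
crc = zlib.crc32(good) & 0xFFFFFFFF    # record this while the chip is healthy

rotted = bytearray(good)
rotted[100] ^= 0x08                    # one flipped bit, decades later
assert verify_firmware(good, crc)
assert not verify_firmware(bytes(rotted), crc)
```

With the good image archived, a corrupted part can at least be reprogrammed or a replacement burned, assuming a compatible chip can still be sourced.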
Re: machines humming along for decades -- yes, that's actually one thing I'm nostalgic and upset about. We live in such a
"throw-away" society, and planned obsolescence is engineered deeply into the "quality" and architecture of modern
equipment. A lot of the old electronics I've worked with, from say the 1940s-1980s, was incredibly well built and, with just
occasional maintenance, could be kept alive for decades. Even computing equipment from the 1980s-1990s was quite often
relatively robust, beyond keeping the capacitors, corrosion, and dust handled.
Now one is unlikely to find warranties exceeding 1-3 years, QC is so bad that "bought new and it doesn't work" failures are common, and very marginal designs are likely to fail after a few years rather than decades.
It is tragic environmentally, and also because we now literally have personal supercomputers by any standard that prevailed in the 1990s, i.e.
"A 1480-processor Cray T3E-1200 was the first supercomputer to achieve a performance of more than 1 teraflops running a computational science application, in 1998" -- a single GPU I bought in ~2008 had 1 TFLOPS of performance and retailed for under $300.
So now we've got all this one-, two-, three-, four-generation-old computing equipment going into landfills, most of it unbroken (though a tragic amount of it
dies in its first handful of years in service) and still potentially very useful; if it were architected for better expansion and reuse, it would
be a lot more so, instead of just ending up trashed in 3-10 years.