In hard sci-fi settings like The Expanse, would we expect all of the electronics in space to be a few generations behind the planetside state of the art to account for radiation hardening? Would the electronics be more likely to be installed inside ships, where presumably there is already radiation shielding for people (thinking of sensor packages and hard points)?
I expect it'll be an ecosystem answer. Given that even personal devices might be expected to go to space or work in a dock or an asteroid base, I'd expect most chips to be de facto rad hard, probably through a combination of redundancy and lots of hardware-accelerated software checks.
If the market demands space-ready processors, and the military is primarily focused on space applications, I'd expect most of the best chips to support that radiation-tolerant ecosystem.
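Purely as an illustration of what "redundancy plus software checks" could look like at the application level, here's a minimal, made-up Python sketch (the `checked` helper is hypothetical, not any real API):

```python
# Hypothetical sketch (names made up) of cheap software-level redundancy:
# evaluate a computation twice and only trust a result the two runs agree on.
def checked(compute, *args, retries=3):
    for _ in range(retries):
        a = compute(*args)   # first redundant evaluation
        b = compute(*args)   # second redundant evaluation
        if a == b:           # agreement -> accept the result
            return a
        # disagreement suggests a transient upset; try again
    raise RuntimeError("persistent mismatch, possible hardware fault")

# Usage: wrap any deterministic function whose result you want cross-checked.
total = checked(sum, [1, 2, 3, 4])
```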
If an environment is safe for humans, then it's safe for machines. Unfortunately this is a pretty high bar to clear, and shows like The Expanse greatly undersell just how hard it will be to thrive in non-Earth locations in the solar system.
Machines can be a lot more fragile than humans though, mainly for cost reasons. NAND flash memory chips now routinely store three or four bits in each cell, which means that even a very slight charge disturbance can change the stored value. Without error correction (as is the case for most consumer hardware and software), that can cause real issues.
Not sure if I understood your comment correctly, but there is certainly not ECC in every CPU and main memory. The majority of RAM sticks you can buy do not support ECC. The NXP i.MX 8 application processors and Raspberry Pis that I was using do not support ECC memory. Also, all Cortex-M MCUs older than the Cortex-M7 have no ECC in the core, and even on the Cortex-M7 it is an optional extension, used by the STM32H7 but not by the NXP i.MX RT1160, 1060, 1050, etc.
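For anyone wondering what ECC actually buys you, here's a toy Hamming(7,4) sketch in Python; real DRAM ECC uses wider SECDED codes implemented in hardware, so this is purely illustrative:

```python
# Toy Hamming(7,4) single-error-correcting code, to illustrate what ECC memory
# does in hardware (real DRAM ECC uses wider SECDED codes).

def hamming74_encode(d1, d2, d3, d4):
    # Compute three parity bits over overlapping subsets of the data bits.
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword, positions 1..7

def hamming74_decode(c):
    # Recompute the parity checks; the syndrome points at the flipped bit.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3       # 0 = clean, else 1-based error position
    if syndrome:
        c[syndrome - 1] ^= 1              # correct the single upset bit
    return c[2], c[4], c[5], c[6]         # recovered data bits

word = hamming74_encode(1, 0, 1, 1)
word[4] ^= 1                              # simulate a single-event upset
assert hamming74_decode(word) == (1, 0, 1, 1)
```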
I don't know what you've done but that information is not correct. Standard iPads and ThinkPad laptops are used on ISS. The problem on ISS is not ionising radiation but that in zero gravity cooling by convection does not work anywhere near as well as on Earth. Without forced air circulation they just sit in a bubble of hot air.
I ran a bunch of code on standard servers lifted onto the ISS. Eventually we couldn't, because the machines broke. I never said anything about radiation (but I was probably thinking that), just that this case refutes the GP's claim that anywhere a human is safe, a computer is too.
I was wrong about ThinkPads, and I'm fine being wrong about that since ThinkPads are surprisingly robust.
Aren't humans already reasonably radiation hard? Cells are full of error correction. I would have thought they'd have greater tolerance than an off-the-shelf chip.
I'm being rather vague with my definitions, admittedly.
Yes, we are "reasonably radiation hard", optimised for the conditions on Earth and the quite low radiation we get here.
But even so, cancer is a real threat, so increase the radiation dose and you will increase cancer rates, along with long-term DNA damage and other perks.
There was the scene in the final season where Holden had to extract information from one of the bad-guy Belters' satellites, and he had to open the box that held a computer. I've been wondering what computer chips 250 years from now would look like... surely they hit a wall as far as miniaturization goes at least a century back. Anyways, radiation hardening even near-cutting-edge silicon probably comes easy, maybe because the materials they make the chips out of are radiation hardened themselves?
A dozen years ago that seemed like something absolutely crazy, but in 2023 it's looking as if, say, >50% market share in everything from microcontrollers to smartphones to automotive to servers to supercomputers could be reality within the next dozen years.
Mac/Windows personal PCs could be the only significant holdouts by even 2030 let alone 2035.
It might not happen, but it no longer seems insane to suggest it.
Is this enough to help with decommissioning damaged nuclear reactors? Early attempts to get some kind of remotely operated vehicle inside the Fukushima reactors failed. Radiation damaged the electronics too much within hours.[1] Recently there have been pictures from the inside from a new radiation-hardened remotely operated submersible.[2] (It's not looking good. The new pictures are the first ones from down near the base of the reactor, which had control rod machinery below it. Much of the concrete support is missing. All this is underwater, with seawater corrosion.)
TID (total ionizing dose) was never the issue; there are COTS processors that cost 0.01x the price of rad-hard components and can withstand megarads easily. Embedded TMR is where it's at.
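For context, the voter at the heart of TMR is tiny. A bitwise majority vote over three redundant copies looks roughly like this (a toy Python sketch of what would normally be done in hardware or FPGA fabric):

```python
# Toy bitwise majority voter, the core of triple modular redundancy (TMR).
# In practice this lives in hardware or FPGA fabric; Python here is only illustrative.
def tmr_vote(a: int, b: int, c: int) -> int:
    # Each output bit takes whatever value at least two of the three copies
    # agree on, so a single upset register copy is masked.
    return (a & b) | (b & c) | (a & c)

# Copy b has taken a bit flip, but the vote still recovers the original value.
assert tmr_vote(0b1010, 0b1110, 0b1010) == 0b1010
```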
For reference, the Hubble is running a "25 MHz Intel-based 80486" which is 34 years old.
"The DF-224 and its 386 co-processor were replaced by a 25 MHz Intel-based 80486 processor system during Servicing Mission 3A in 1999. The new computer is 20 times faster, with six times more memory, than the DF-224 it replaced. It increases throughput by moving some computing tasks from the ground to the spacecraft and saves money by allowing the use of modern programming languages."
Nobody is shipping those any more (okay, maybe not nobody, but very few if any). Gaisler's chips (GR740 = SPARC, NOEL-V = RISC-V) are pretty popular; the satellite manufacturer I work for uses them fairly ubiquitously.
The Mars helicopter has a common mobile CPU, hardened a bit. The fact that the aircraft has a lot of common consumer sensors and is still going strong is amazing (Bosch IMU found in smartphones, OV camera, etc.!)
It has 7 CPUs; the COTS SoC is just one of them. It's used for the vision tasks (hazard avoidance, odometry, etc.), and it's optional, as the helicopter is able to be controlled without it. It was specifically a test for COTS components on Mars, and the vision system actually had multiple issues with frame skipping (I'm not aware of the direct cause though; it might have been the camera or the software, for example).
If you want TMR and don't want to commission an ASIC, MicroBlaze looks like a good option. For the quantities needed for experiments it would make much more financial sense too. High-energy physics likes to play with technological toys, though.
White Rabbit as an extension of IEEE 1588 is a classic example. If they've negotiated a situation where they can continue to do that: more power to them.
Microsemi (now part of Microchip) makes rad-hardened FPGAs, more so than Xilinx I think, and these days some of them (PolarFire SoC) include a penta-core 600-666 MHz 64-bit RISC-V processor: 4 Linux-capable cores, plus one core without FPU, MMU, or caches, and able to disable branch prediction, for real-time control purposes.
Nope. At least, it wouldn't help in the nasty parts. The first couple of robots they sent in saw up to 650 Sieverts/hr [1] (about 650 Grays/hr, or 650 Joules/kg/hr) [2]. It started to kill the video and other controls, so they pulled them after a couple of hours. It's not just sensitive processors; all the electronics are affected. A 2018 IOP paper [3] shows the tolerance of recent robots. The best could handle 1000 Grays total.
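Taking those figures at face value, the back-of-envelope survival time is short:

```python
# Rough arithmetic from the figures above: peak dose rate seen by the early
# robots vs. the best total-dose tolerance reported in the 2018 survey.
dose_rate_gy_per_hr = 650    # ~650 Gy/hr near the worst spots
tolerance_gy = 1000          # best-case total dose a recent robot could handle
print(tolerance_gy / dose_rate_gy_per_hr)  # about 1.5 hours of useful life
```

Which lines up with them having to pull the robots after a couple of hours.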
Depends on what kind of radiation and where. If you get outside the Earth's magnetic field, Gerard K. O'Neill said you needed six feet of soil equivalent to stop protons from the Sun for a space colony with a long-term safe environment.
It's somewhat more complicated than that, because there are very high energy cosmic rays that explode atoms when they hit and make showers of radioactive particles, which are still going to affect you to some extent even with the 6-foot shield. The effects might be worse if you added more shielding, because the more shielding you have, the more chances one of those heavy cosmic rays will blow up an atom in it.
I had always wondered how you'd get this to work at planet scale, but this idea of shielding from L1 is interesting, as you wouldn't need a planet-sized radius. How you'd power something at that level forever would be the $64k question. I'm guessing they'd park it similar to JWST, where it's orbiting the Mars L1 point. Man, these scientist types are smart ;-)
https://abopen.com/news/microchip-shows-off-rad-hardened-ris...
NASA also has a contract with Microchip and SiFive to develop a RISC-V processor for space missions.
https://www.eejournal.com/article/nasa-recruits-microchip-si...