> LLNL’s experiment surpassed the fusion threshold by delivering 2.05 megajoules (MJ) of energy to the target, resulting in 3.15 MJ of fusion energy output, demonstrating for the first time a most fundamental science basis for inertial fusion energy (IFE)
Yesterday, everyone was complaining about the 2.2:2.0 ratio, but now we're working with 3.15:2.05.
With modern lasers, that'd be a total Q of 0.375 assuming 100% efficiency through direct-energy-capture.
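For anyone who wants to poke at that arithmetic, here is a rough sketch of where a figure like 0.375 can come from. The laser wall-plug efficiency is my own assumption (roughly 25% for a modern diode-pumped laser); the comment above doesn't state the exact figure it used.

```python
# Rough sketch of the "total Q" arithmetic (assumed ~25% laser wall-plug efficiency).
laser_to_target_mj = 2.05   # energy delivered to the target
fusion_out_mj      = 3.15   # fusion energy released
laser_efficiency   = 0.25   # ASSUMED wall-plug efficiency of a modern diode-pumped laser

wall_plug_mj = laser_to_target_mj / laser_efficiency
q_target = fusion_out_mj / laser_to_target_mj   # ~1.54, the "scientific" gain
q_total  = fusion_out_mj / wall_plug_mj         # ~0.38, in the ballpark of the 0.375 above
print(q_target, q_total)
```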
The jumps to get here included
- 40% with the new targets
- 60% with magnetic confinement
- 35% with cryocooling of the target
The recent NIF experiments have jumped up in power. The first shot that started this new chain of research was about 1.7 MJ of energy delivered. Now, 2.15 MJ. However, the output has jumped non-linearly, demonstrating the scaling laws at work.
> I’ve helped to secure the highest ever authorization of over $624 million this year in the National Defense Authorization Act for the ICF program to build on this amazing breakthrough.”
It's nice to see this milestone recognized, even if the funding is still rather small.
I guess we should take this as a lesson in communications. The "breakeven" thing is a red herring that should have been left out of the message, or at least only mentioned as a footnote. The critical ELI5 message that should have been presented is that they used a laser to create some tiny amount of fusion. But we have been able to do that for a while now. The important thing is that they were then able to use the heat and pressure of the laser-generated fusion to create even more fusion. A tiny amount of fusion creates even more fusion, a positive feedback loop. The secondary fusion is still small, but it is more than the tiny amount of laser-generated fusion. The gain is greater than one. That's the important message.

And for the future, the important takeaway is that the next step is to take the tiny amount of laser fusion to create a small amount of fusion, and that small amount of fusion to create a medium amount of fusion. And eventually scale it up enough that you have a large amount of fusion, but controlled, and not a gigantic amount of fusion that you have in thermonuclear weapons, or the ginormous fusion of the sun.
To give some more detail: hydrogen to helium fusion (even with intermediate steps) is extremely unlikely to happen. That's part of why the sun will last for billions of years. And that's also why first human attempts at fusion are not trying to use straight up hydrogen as the fuel.
Good old Wikipedia has this gem:
> The large power output of the Sun is mainly due to the huge size and density of its core (compared to Earth and objects on Earth), with only a fairly small amount of power being generated per cubic metre. Theoretical models of the Sun's interior indicate a maximum power density, or energy production, of approximately 276.5 watts per cubic metre at the center of the core,[63] which is about the same power density inside a compost pile.
Another fun fact: there's a decades-old design for a gadget that fits on top of your desk and does nuclear fusion. You could build one yourself, if you are sufficiently dedicated. Unfortunately, no one has ever worked out how to run one of them as a power plant, i.e. how to get more useful energy out than you have to put in.
If it produced a quarter of the heat of the human body per volume, its temperature would be lower as well (less than 37 degrees Celsius).[1] This is obviously not the case.
[1] Obviously heat and temperature are not the same, I know that. But when something’s temperature is higher than another thing’s, then heat is exchanged along that gradient. Meaning if the sun produced less volumetric heat than the human body, a human body placed within the sun would warm the sun and cool the human.
For both the sun and a human on earth there are two processes going on:
1. Heat production per unit volume.
2. Heat loss per unit surface area.
The volume to surface area ratio for the sun is much larger than for the human, for a minor reason (the sun is a sphere) and a major reason (the sun's linear size is much bigger). So the equilibrium temperature of the sun in the same ambient outside environment is higher than the human's.
Your thought experiment about placing a human inside the sun would in fact work as you say, if a human body continued to produce heat once it had achieved thermal equilibrium with the surrounding plasma.
Yeah, this is a morbid analogy, but if you enclose a bunch of people in a small area, the people in the middle will get so hot they will suffer heatstroke, even if it's freezing outside. See the recent Korean crowd-crush disaster.
Don't give them ideas, harnessing people power would solve all the major problems. Overpopulation, global warming, energy crisis,... Reminds me of that 'Mitchell and Webb - Kill all the poor' sketch.
Well, plant the idea that there is a fusion reaction going on in mitochondria at a very low scale. Throw in terms like "proton gradient", "electron transport chain" and create a science conspiracy.
Can you say more about the ways in which fusion reactors need to surpass stars, and why people believe it's feasible that we can sufficiently get to that point?
(Also - thanks for sharing one of the most interesting comments I've read on the internet in quite a while.)
That, but also mimicking the pressure of the sun here is just not possible (yet? :D), which is why we need to play with higher temperatures, with different problematic consequences.
> The highest instantaneous pressures we can obtain here on Earth are in the Fusion reactor at the National Ignition Facility and in Thermonuclear weapon detonations. These achieve pressures of 5 x 10^12 and 6.5 x 10^15 Pascal respectively. For comparison, the pressure inside our Sun’s core is 2.5 x 10^16 Pascal.
Total power radiated by a black body per unit surface area scales as T^4 (in Kelvin).
So for black bodies with identical shape and linear dimensions R1 and R2, with identical power production per unit volume, both in thermal equilibrium with whatever is outside them, you would expect:
R1/R2 = (T1/T2)^4
(because setting power produced equal to power radiated gives R proportional to T^4).
Pretending humans are spheres with radius 1m and the sun is a sphere with radius 7*10^8m, you would expect the sun to have ~160 times the temperature of a human at equilibrium in vacuum. It's going to be lower because not all of the sun is power-producing, of course. But higher because a human is not 1m in radius. And again higher because humans are not spheres and lose heat more than a sphere would for the same volume (more surface area).
The sun is about 6000K on the surface. That would give us ~40K for the equilibrium temperature of a human in vacuum, which at least seems truthy.
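A quick numerical check of the scaling above, using the rough radii assumed in this comment (just a sketch of the same back-of-the-envelope argument, not a serious model of the sun):

```python
# Black bodies with equal volumetric power production, in equilibrium:
# power produced ~ R^3, power radiated ~ R^2 * T^4  =>  R ~ T^4  =>  T ~ R^(1/4)
R_sun   = 7e8   # m, rough solar radius
R_human = 1.0   # m, the (generous) "spherical human" radius used above

temp_ratio = (R_sun / R_human) ** 0.25
print(f"T_sun / T_human ~ {temp_ratio:.0f}")        # ~163, i.e. the ~160 quoted

T_sun_surface = 6000.0  # K
print(f"human equilibrium temperature ~ {T_sun_surface / temp_ratio:.0f} K")  # ~37 K, roughly the ~40 K above
```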
TL;DR: the sun is big, with a small surface area compared to its volume, because it's big.
and not a gigantic amount of fusion that you have in thermonuclear weapons,
Ironically, this experiment was designed primarily to simulate the fusion you have in thermonuclear weapons. That's the NIF's purpose and the purpose of this experiment. From Nature https://www.nature.com/articles/d41586-022-04440-7
"Herrmann acknowledges as much, saying that there are many steps on the path to laser fusion energy. “NIF was not designed to be efficient,” he says. “It was designed to be the biggest laser we could possibly build to give us the data we need for the [nuclear] stockpile research programme.”
How would it help build better weapons? I thought the existing thermonuclear bombs do the job perfectly fine. Sure, the military might want to make them a bit smaller or a bit cheaper, but is it really such a big deal as to warrant a major announcement?
"Nuclear stockpile research" is only making weapons "better" in the sense that they are reliable despite not having been tested in decades. There's probably a component of "unemployed nuclear weapons designers is a bad thing" also.
The fusion for power experiments are using the same laser equipment but different targets and sensors.
I guess they also don't want to blow up random civilians by accident again.
When Castle Bravo was tested, we didn't know that lithium-7 would also react under neutron bombardment, breeding extra tritium and adding to the yield. The bomb had a lot of lithium-7 because it was cheaper than lithium-6. Castle Bravo then proceeded to explode with far more power than intended: it vaporized the measurement instruments, ruined the test site, damaged civilian property, and caused a horrible amount of fallout that screwed an enormous number of people from more than one country.
Even during war, I suppose you want your explosions to behave in the way you expect... so you need to figure out all the physics related to them.
Given the limitations on nuclear research for weapons purposes any information that can be gleaned from these experiments that is 'dual use' is more than welcome with the parties that are currently stymied by various arms control agreements. This is also why you will see a lot of supercomputer capacity near such research, it allows simulation of experiments with high fidelity rather than the experiments themselves. These are all exploitation of loopholes. The biggest value is probably in being able to confirm that the various computer models accurately predict the experimental outcomes. This when confirmed at scale will allow for the computer models to be used for different applications (ie: weapons research) with a higher level of confidence.
Presumably these computer models are mostly useful for creating new designs (since the old designs were proven by real tests). Would such new designs be convincing enough to the adversary to fulfill its role as a strategic deterrence?
When (in XX years?) almost all US nukes are only simulated on computers and not actually tested, the Russians may start wondering if the US arsenal actually works, no? That would be a horrible outcome, since it means the Russians would be taking somewhat greater risks in their decision-making. Wouldn't that far outweigh any operational or financial benefits the newer designs offer?
I suppose one could argue that if the loss of confidence in strategic weapons matched the actual loss in reliability, it might be a "no op" (although even this is arguable). But if the Russians think the US simulations suck, while the US is actually building really good simulations, the loss of confidence would be greater than the actual loss in reliability. In the extreme case, the nukes work great, but everyone thinks they are scrap metal.
Of course, the same happens in reverse: if the Russians are upgrading their weapons to untested designs, the US may start underestimating the risk.
> the Russians may start wondering if the US arsenal actually works, no?
If anything, the last year or so has probably made the reverse happen, and the US and its adversaries likely both have very high confidence that the US arsenal actually works.
Or the US finds that after X years bombs degrade in unexpected ways, and while we were able to figure this out and fix it, we then speculate that the Russians probably haven't fixed theirs in the same way and their bombs aren't good anymore. Which means the risk of a nuclear war just jumps up, since MAD is compromised.
To learn about their degradation modes and how to maintain them (since full-scale nuclear tests are now verboten, and subcritical experiments only deal with the fission part of the entire assembly -- we now also need some experimental setup to test the fusion part: radiation pressure, X-ray reflection, ablation modes, etc. NIF is this setup).
Also, there is a constant need to improve the fusion-to-fission ratio in the total energy output, and perhaps eventually design pure fusion weapons, though this is still probably out of reach.
The USA no longer has the technical capability to manufacture the necessary parts to maintain the current stockpile. The whole strategic arms reduction treaty regime is basically a fig leaf to cover for the fact that we have to cannibalize some legacy weapons to maintain the rest. If nothing changes the USA probably won't have an effective arsenal within a century. Given the likely state of the USA by that time, that's probably for the best.
> > USA probably won't have an effective arsenal within a century.
> If other countries joined, it would be a great outcome.
Why the optimism? Without MAD, it's nearly certain that we'd have a world war at some point in time. Sooner or later, it will surely happen. If you think it won't happen, or won't cost millions of lives, or won't employ re-developed nukes eventually, please tell me why you think so. (No sarcasm.)
I don’t think MAD is all that effective. At this moment in time we’re seeing:
* Several concurrent arms races in the Middle East, Asia, and Europe
* A high intensity conflict in Ukraine
* China threatening a land invasion into Taiwan
MAD might be preventing a country like Poland from jumping into the Ukraine conflict, but more likely it’s because of its involvement in NATO.
I think collective security organisations are a far more potent force for peace than nuclear weapons. If countries had abided by their security agreements in WW2, then we'd have nipped the entire thing in the bud.
> I don’t think MAD is all that effective. At this moment in time we’re seeing:
> * Several concurrent arms races in the Middle East, Asia, and Europe
> * A high intensity conflict in Ukraine
> * China threatening a land invasion into Taiwan
I mean... Only one of those is an actual fight. And there MAD doesn't apply because the defender doesn't have the Assured Destruction capability needed.
> Several concurrent arms races in the Middle East, Asia, and Europe
Which is to say, possible future proxy wars between the great powers where MAD will supposedly restrict conflict intensity. See below.
> A high intensity conflict in Ukraine
What's going on in Ukraine is a bog standard cold war style proxy war. The NATO plan is basically to turn it into another Afghanistan for the Russians. It's the exact thing that MAD is meant to keep from spilling over into a world war between the principals.
> China threatening a land invasion into Taiwan
This is more interesting. US conventional forces almost certainly have no hope of beating China that close to home. Therefore, any effective US response would require nuking China and China is presumably deterring that with their nukes. There is an argument to be made here that a non-nuclear Chinese military would be in Taiwan's best interests. However, I see no scenario where either a nuclear or non-nuclear China and a non-nuclear USA is in Taiwan's best interests. So while the MAD case isn't the best case for Taiwan here, it's also not the worst.
There are a lot more armed conflicts throughout the world that are proxies or partial proxies between the various powers in the world (US, Russia, and more). See Yemen, Syria, and many more throughout Africa. World powers tend to get on one side or another as they see their interests align and oftentimes this prolongs the conflict rather than bring it to any resolution.
I'm not really optimistic about this option, just a bit of wishful thinking about peace upon the world and such. OTOH, even with MAD we still have wars and probably much more than 100K people on average get killed in wars every year. Even with MAD, NATO is pushing conflict with Russia well beyond a proxy war at this point. MAD doesn't work if the world goes mad...
> 1. How does this research help address this problem?
I see it the other way around, this problem makes me doubt that this research will ever actually lead anywhere, supposing it's even as good a result as it first appears.
> 2. What are the sources for your opinion?
It's a fact. And my source is dead tree media. I don't recall all the details, but there are some very finicky parts that go into a state of the art warhead and we have lost the capability to manufacture them. Is this really so surprising? We can't even build new F-22s anymore!
The most famous key component, whether speculated about or leaked, was the Styrofoam that holds the primary and surrounds the secondary: we supposedly lost both its makeup and the reasoning behind it.
This research successfully initiated fusion using a capsule of hydrogen made of some material, surrounded by something, with an outer layer. This outer layer is turned into X-rays by the laser, which then ablate the hydrogen capsule's casing, causing the inward pressure. You could speculate that they just found the makeup for something that would replace the Styrofoam, or that we just improved upon it.
From what I can tell this is mostly a useful cover story as a fallback for the moon-shot type energy potential. But are there serious downsides to having this be the case? Does it significantly distract from the pursuit of energy? Or somehow fundamentally constrain the project's scope?
Exactly the opposite: their task, assigned by Congress, is weapons work. Period.
But pretending to work on "carbon-free energy" is good for funding, just now. Four years ago, being all about weapons opened the tap.
Make no mistake, there is no story here. There will be no "unlimited free energy" from this, or any other fusion project. The fusion startups are spending down investors' money with zero possibility of payback (Helion conceivably excepted), because if they did get Q>1, they have no workable way to harness it. ITER will not even try to produce one watt-second of electricity. Its follow-on demo reactor won't start building until 2050, if not further delayed.
We know the way to get unlimited free energy: solar. Build more, get more. It doesn't have bomb scientists making inflated claims; it just works, and better every year.
> The fusion startups are spending down investors' money with zero possibility of payback (Helion conceivably excepted), because if they did get Q>1, they have no workable way to harness it. ITER will not even try to produce one watt-second of electricity. Its follow-on demo reactor won't start building until 2050, if not further delayed.
You seem to be awfully certain of that. I'm not an expert in this area, but my understanding is that the MIT Arc reactor is planned to use FLiBe as a liquid coolant that absorbs heat/neutrons/etc from the fusion reaction and is pumped into heat exchangers to boil water to run turbines. I mean, maybe there's some details not worked out and maybe I'm misunderstanding how it works, but it seems like a plan to generate electricity to me.
There's no plan to hook ITER up to a thermal plant because it's a research reactor not a power plant, but there's no conceptual reason they couldn't do it. (Not that ITER is a great example; the design is already antiquated before it's even finished.)
Driving steam turbines, even with other costs at zero, leaves you uncompetitive with renewables. But other costs would be very, very far from zero. Extracting the grams of tritium at PPB concentration dissolved in 1000 tons of FLiBe every day so you have fuel for tomorrow is an expensive job all by itself.
Making a whole new reactor every year or two because it destroyed itself with neutron bombardment is another.
You seem to forget that storage also has no operating expense.
The cost of operating a steam turbine far exceeds the cost of the coal or uranium driving it. But the steam turbine would not be the only operating expense for fusion. We don't know exactly what it would cost to sieve a thousand tons of molten FLiBe every day to get out the tritium produced that day, because no one even knows any way to achieve it at all. But it would certainly be a huge daily expense, if achieved.
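To put some very rough numbers on the "grams in a thousand tons" point (the per-day tritium figure here is my own assumption for a plant-scale device, not a number from the comment above):

```python
# Back-of-the-envelope version of the "grams per day in 1000 tons of FLiBe" point.
tritium_bred_g = 5.0               # ASSUMED: grams of tritium bred per day
flibe_mass_g   = 1000 * 1_000_000  # 1000 metric tons, in grams (1 t = 1e6 g)

ppb = tritium_bred_g / flibe_mass_g * 1e9
print(f"~{ppb:.0f} ppb of tritium by mass to be extracted each day")   # ~5 ppb
```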
Combined-cycle gas turbines are fine for backing up renewables and storage. They are used for that today. As the amount of renewables and then storage built out increases, the amount of time the gas turbines must run, and thus total operating expense, declines.
It should be clear that to build out storage when there is not surplus renewable generating capacity to "charge" it from would be foolish. The immediate exception is to time-shift renewable energy generated at midday peak for evening delivery, as is being done successfully today.
Steam turbines, by contrast, are expensive to operate, and slow to start up and shut down.
Capital expense of renewables is very low already, and still falling. Even substantial overbuild to charge storage from does not change this. Cost of various forms of storage is falling even faster. By the time much storage is needed, it will be very cheap.
> Combined-cycle gas turbines are fine for backing up renewables and storage.
So are plain steam turbines, if you have cheap steam.
> Steam turbines, by contrast, are expensive to operate, and slow to start up and shut down.
Huh? Combined cycle setups use steam turbines as part of the system. Steam turbines can ramp up and down plenty fast. It's traditional heat sources that don't ramp well.
> By the time much storage is needed, it will be very cheap.
That would be nice but I'm not depending on it, and I'm definitely not going to assume that long term storage will ever be cheaper than steam turbines.
> There's no plan to hook ITER up to a thermal plant because it's a research reactor not a power plant
Interestingly, that's not really true: IIRC the Japanese team working on the WCCB breeder module (which uses supercritical water as coolant) plans on connecting the water loop to a small turbine. If they succeed it would be the first electrical power ever produced from fusion.
At least ITER has a plausible, in a sci-fi story sense, way of creating a sustainable fusion reaction to which more fuel could be given. The fact the neutrons produced, if it did reach the high Q ratios, would irradiate and break down everything involved, is probably 2070's scientists and engineers problems to solve.
Why won't fusion still be a worthwhile goal even if we do have abundant solar/wind?
Abundant and cheap are relative terms. Solar and wind could be abundant in a "powers everything we have now and for the foreseeable future of population growth" sense, but maybe not in a "gigantic power-hungry megaprojects which aren't remotely possible today" sense?
Most likely, spacecraft will rely on power delivered via laser.
If anybody succeeds in working out D-3He fusion, that could work in a spacecraft. (D-T, no.) We could probably scare up enough 3He to use for that, if there weren't too many.
If anybody in the universe is doing interstellar travel, I think they would have developed D-D fusion, which is somewhat more difficult than D-³He or D-T fusion but probably possible with a scaled-up version of the same machine.
Outside the frost line there is a lot of water and a higher percentage of D relative to H, so it seems possible to "live off the land" between the stars without being dependent on starshine. A D-D reactor would produce ³He and T; a lot of those products would burn up in the reactor because the reaction rates are high, but it would probably be possible to separate some of them out and use it as a breeder reactor that makes fuel for D-³He and D-T reactors elsewhere. I could picture the big D-D reactor running on a large comet or dwarf planet like Pluto, producing D-³He for smaller reactors on spacecraft. (D-T not only produces a lot of neutrons, but the T has a half-life of 12 or so years and won't last for long journeys.)
My guess is that interstellar travelers would develop a lifestyle that works around the frost line, where generic bodies above a certain size have liquid water inside. If they were grabby they might consume Ceres or Pluto but might not really care about dry, idiosyncratic worlds like the Earth and Mars.
Anybody doing interstellar travel should hang their collective head-analog in shame if they haven't mastered aneutronic p-11B fusion yet. (They will need to have figured out how to reflect X-rays.)
Having got used to spending interminable ages out in the infinite chill void, they probably have come to prefer being there, so have no desire to roast deep in a stellar gravity well. Their equipment might not even work if warmed too much.
I think parent was talking about space applications.
Anyway unlike fusion, seasonal thermal storage is viable and available now, and will be scaled up in immediate future. Also, with electrical vehicles inducing massive investment into the grid, there will be both pressure and resources to solve the rest.
Since we're talking in present tense that's the remaining 1%.
Moon base can be fine with power beamed from a satellite or plain mirrors in orbit, no atmosphere in the way. Might end up being still cheaper than hauling nuclear reactor there plus all the infra to reliably dump waste heat from it.
It works super-great, collected in the tropics and shipped in chemical form. Before you object to depending on imported liquid fuel, consider that most of the world does already.
The main difference is that literally anybody can make it, not just "oil exporting countries" and "fuel refiners". And, will. And export excess production when local tankage is full.
I do software for a living so that gives you my level of ignorance of the real world.
I do have two question about solar
- is it “drivable” / “pilotable” ?
meaning reacting to surge in the grid? My understanding is that this feature is highly desirable for a grid.
- can we actually build enough solar panel, physically ?
Don’t we need some rare earth thingy that is not in sufficient quantity on our planet as far as we know ? ( follow up : if there is enough, will there be enough in 200 years ? )
The "rare-earths thingy" is a common, transparent lie told frequently about both solar and wind. No rare-earths are used in any solar panel. Some wind turbine generators have used them, but not the big ones. And, "rare-earths" are not in fact rare. So, a double lie.
Solar panels provide cheap power generation on a schedule. For dispatchability, you rely on storage. There are many different kinds of practical, efficient storage; which are used where will depend on local conditions. Which will be cheapest isn't clear, but probably not batteries. The batteries used won't need lithium, or rare earths, either.
The lie most frequently repeated is that storage needs some sort of "breakthrough". Second is that the small amount built out means more than that there is not enough renewable power yet to charge it from; when there is will be time to build it. In the meantime, we fill in with NG burning. The third is that "pumped hydro", the most common used just now, needs "special geography". Hills are very common.
The lie most frequently repeated about solar is that there is any shortage of places to put it. It is most efficiently floated on water reservoirs, where it cuts evaporation and biofouling, although efficiency is only one consideration. It shares nicely with pasture and even crop land, cutting water demand and heat stress without reducing yield.
There will never be any shortage of wind or solar: need more, build more; materials needed are all abundant. Likewise storage. Costs are still falling as fast as ever, but are already lowest of any energy source ever known.
They are in ores, but are mixed with other lanthanides that they are expensive to separate from. Two of them, yttrium and scandium, are not lanthanides and are relatively easy to separate out.
A new powerfully magnetic iron-nickel alloy may eliminate much of the market for several of them.
In regards to your first question, the word you're looking for in google-able energy industry jargon is "dispatchable". And yes, dispatchability of intermittent generation is achieved in a couple of ways in contemporary electricity networks:
1. Deliberately backing off wind or solar generation from full capacity to provide reserves for demand spikes, transmission/generator outages, etc. This means other generation that may otherwise not have generated at all over that period, is brought online to cover the shortfall.
2. Co-locating grid-scale batteries at intermittent generation sites ("hybrid generation facilities" in energy industry jargon) to cover short-term contingency events.
Thank you, not my industry and not my language so “dispatchable” is a valuable keyword for me. ( it would be “pilotable” in French; if you ever have to discuss that abroad with my snotty kind )
Anyway. What I read is : having something else on the side can make solar dispatchable.
Realistically, what would be that other things ?
Nuclear doesn't like to be turned on/off.
Wind has the same issue… are we saying the good ol’ coal burning kettle ?
If they can get a continuous fusion reaction going, converting the heat energy from that to electricity won't be a problem. Getting a contained fusion reaction that gives out more energy than input is the problem; how to convert that into electricity is not going to be a problem.
> If they can get a continuous fusion reaction going, converting the heat energy from that to electricity won't be a problem.
Even accepting the qualification that's not just a mere matter of engineering, capturing that heat from a source that hot is not without trouble. A bit like how there is plenty of energy in a single lightning strike and yet we can't easily catch it even though in principle 'just build a large enough capacitor and connect it to a lightning rod' is a workable recipe.
> Getting a contained fusion reaction that gives out more energy than input is the problem
Not in the least because the container itself is a very hard problem to solve.
> how to convert that into electricity is not going to be a problem.
It is also a problem, albeit a lesser one.
The better way to look at all of these fusion projects is a way to do an end run around arms control limitations with as a very unlikely by-product the possible future generation of energy. But I would not hold my breath for that. Meanwhile, I'm all for capturing more of the energy output by that other fusion reactor that we all have access to, and learning how to store it over longer periods. Preferably to start with a couple of days with something that doesn't degrade (think very high density super capacitor rather than a battery), but I'll take advanced battery technology if it can be done cheap enough per storage cycle. We're getting there.
It doesn't make heat. It makes fast neutrons. Turning them into usable heat is a project of its own.
Turning dumb heat into electric power is expensive. Nothing that depends on doing that can ever compete with wind and solar, anymore.
Tritium doesn't grow on trees. Making it by blasting those hot neutrons into a thousand tons of FLiBe is easy enough. Getting your few grams a day, at PPB concentration, out of that thousand tons of stuff is... nobody has any idea how. But you need to, to have fuel for tomorrow.
No, there won't be any of that. It would be fantastically more expensive than fission. Fission is not competitive, and gets less so by the day. Fusion is nothing but a money pit (with the just barely-possible exception of D-3He).
Saying that there are unknown engineering challenges is kind of a "duh"; otherwise we wouldn't be researching, we would be implementing. As you also mentioned, there are alternatives to tritium which we could consider.
> Fission is not competitive, and gets less so by the day.
> Fusion is nothing but a money pit (with the just barely-possible exception of D-3He).
We genuinely don't know if fusion is a money pit or not, because we don't have any idea how much a successful form will cost. Tritium blankets may be easy or not. Maybe Helion's D-3He will have a breakthrough. Maybe it's ICF.
I've not seen suggestions by anyone that wind and solar build-outs stop, or get diminished. Indeed at this point because the cost are low, industry will continue to invest in them regardless.
However, we will need a lot more energy production than folks think. We need to decarbonize the atmosphere. And that's going to require a lot of power.
All that aside, solar and wind are not getting you to mars in a timely fashion. We have reasons to research fusion that escape large commercial power generation.
We certainly need a lot more energy production, but to decarbonize the atmosphere there has to be a target level with an evidential basis on which to proceed. If it's considered (as by many) that we have a climate emergency (though this is not reported at all in the IPCC report), then any decarbonization at all will obviously serve for starters. If this is not the case, then there are other considerations, such as the fact that as CO2 levels have risen so has global food production: about 30% over the last 30 years.
Simulations with multiple global ecosystem models suggest that CO2 fertilization effects explain 70% of the observed greening trend, followed by nitrogen deposition (9%), climate change (8%) and land cover change (LCC) (4%). CO2 fertilization effects explain most of the greening trends in the tropics, whereas climate change resulted in greening of the high latitudes and the Tibetan Plateau..
https://sites.bu.edu/cliveg/files/2016/04/zhu-greening-earth...
This is not a surprise given that carbon is needed for plant growth, a fact well understood by commercial growers who pipe CO2 into their greenhouses. So one issue might be, if decarbonization is successful then what might be the acceptable level of reduction in global food supply?
Another issue relates to temperature. From an analysis of 974 million deaths in 384 locations across 13 countries it's been concluded that twenty times more people die from the cold as from the heat. https://composite-indicators.jrc.ec.europa.eu/sites/default/... A recent paper (Dec 12 2022) regarding heart attacks states "extreme temperatures accounted for 2.2 additional deaths per 1,000 on hot days and 9.1 additional deaths per 1,000 on cold days." Circulation. doi.org/10.1161/CIRCULATIONAHA.122.061832.
Do any of the reports present an ethical problem? No, they do not given an extreme interpretation of climate models.
They put 2.05 MJ in and got 3.15 MJ out. The 300 MJ comes from equipment inefficiency; remember this is a research facility to prove plasma physics and ignition itself, mainly designed for weapons research. The fact that they got definite fusion at 1.53x input energy is not just breakeven, but net gain. This is the first time ever we've gotten net gain in any fusion device. A lot of people in government leadership thought fusion was impossible; the fact that they can say they did it, despite 1980s-era technology and a shoestring budget for decades, really says something.
ICF is just one fusion technology, and probably not the best one, but fusion in general will probably see a massive budget increase worldwide now that it's been proven it can be done.
I fully expect a working fusion plant of some kind by 2030, assuming funding increases. Once we get them commercialized, coal, wind, solar and other power production will be obsolete for the most part. We can also use fusion heat to separate waste into its base elements (you can recycle anything!), and help make any process needing a lot of thermal or electrical energy more efficient.
Regular fission plants take 5+ years to construct. So in the next 2-3 years you're expecting this technology to become mature enough to roll out and hook up to the power grid?
By 2030? Lol. Lmao, even. 2050 if we're lucky--we're still not even close to developing a reactor that can produce meaningful amounts of power for a sustained period. It'll take decades more to get to that point, and once all the technical limitations have been crossed and we figure out how to get enough tritium, the actual hard work begins of getting it online and integrated with our current system. Fusion is distant future technology.
To have a hope of supplying grid power, they need to scale up the energy gain by four orders of magnitude and reactor run time by 12 orders of magnitude.
Those are just two of the engineering problems. It'll be a while, and I doubt it will ever compete with solar, wind, and storage.
Remember that the human genome project was started in 1990 and completed in 2003. In 2003 it took a year to sequence 80% of the human genome; now it takes a day to do 98%. In 1990 they had the understanding of how to sequence DNA, but scaling that across the whole human genome seemed like a monumental task.
I'm guessing in a decade we'll have a viable early stage industrial process, and in 2 decades we have commissioned fusion reactors.
If we don't kill the planet with nuclear war or the climate crisis.
> Maybe not on earth, but there are applications in deep space.
Depends on how long the interstellar craft is supposed to travel. If it's under 100 years, fission should be able to do the trick of keeping the craft warm and the lights on for the sealed ecosystem to function during the decades of coasting between stars.
Fusion rockets would be more convenient than fission ones because you can store the hydrogen you need in the form of water and water also acts as a great radiation shield while in deep space. Then, to brake, you use your radiation shield as reaction mass for fission or fusion rockets.
If we are talking about much more than that, fusion is probably a better answer as fission fuel will half-life itself into paperweights over a grand transgalactic tour.
You'll need a constant power supply for the entirety of the trip. After doing the math, it turns out, fission seems quite viable - all usual fuels, in storage, have half-lives of more than 10,000 years.
So, if you have enough fissiles for keeping the closed ecosystem happy for the duration of the flight, you can go quite far.
The ship/colony will need to enter orbit around a star and drop by a rocky planet at some point, to gather more fissiles and reaction mass (and other materials needed for fixes and upgrades), so it wouldn't be able to stay indefinitely in deep space. If it's fusion-driven, a gas giant may be a good option for both fuel and reaction mass, and icy moons may work well for replacing water.
I'm not talking about travel. I am talking about permanently living between the stars, and essentially living off hydrogen harvested from the interstellar medium.
You are operating under the rather naive assumption that the people and resources used in fusion research are fungible with the people and resources used in other things.
When you have a bunch of people who know how to build nuclear bombs sitting around with nothing to do, you damn well keep them busy before another country finds them a job.
Having worked in the field for 6 years, the estimated cost per shot at NIF is roughly $1MM. The estimated cost for a day of shots at OMEGA is $250k-$300k.
The cost per target varies a lot due to the precise manufacturing tolerances and the methods to get them. For example, the sphere with the fuel in it is made by dropping liquid glass from a drop tower. And then metrology is done on hundreds and hundreds of glass spheres.
So though the electricity might cost that, we are talking about a building in which just the lasers and their optical paths take up three football fields of advanced warehouse space. And the target chamber, which is 10 meters in diameter, is at ultra-high vacuum. There are also countless diagnostics, computers, and other electronics, the lights for the whole facility, and the number of people required to run it so this delicate experiment goes off without a hitch.
Honestly, it's almost not worth talking about as a power source anytime soon. Even if Q > 2 on NIF there are countless engineering problems that would have to be overcome (and haven't really been thought too hard on in the ICF field) to get a power reactor out of this tech.
My two cents: look towards MIT and CFS for news on their SPARC tokamak and plans for the ARC tokamak. Based on some data I have seen, SPARC should hit Q>1 pretty easily, with some estimates reaching Q of 3 to 9. And before you scoff at it, this reactor design is using magnet tech that has proven it can withstand and produce a 20T magnetic field! In MCF, field strength and heating are the two key metrics. To put this into perspective, the massive tokamak being built in Europe has a MAX possible field strength of 13T, assuming it's run to the edge of its theoretical design limitations. The SPARC one hasn't even been run to its design limitations, most likely due to the mechanical stresses a 20T field produces in a 3-4 meter D coil.
Give it a read. Inertial confinement fusion with a Q > 1 may very well point the way towards a realistic power plant. Fusion is at a point where it deserves investment. The NIF cost about the same as a single B-2 bomber.
Total annual US public research into fusion that is not ITER or this project (which does not come from the fusion pot; as others have said, this project was built for weapons research) is about $300 million per year [1]. We, as a country, do not give a rat's ass about having fusion as a power source. VC funding recently (for the first time) supported many fusion startup ideas at or above the $300 million level. Too bad ZIRP is over and private funding is now likely to dry up. The recent private funding in fusion was really amazing. Many new ideas and methods are being tried. This is what fusion research needed, not spending billions on the magnets and cement in France that is ITER. Hopefully some of them have a long enough runway to get promising work done and get follow-up funding.
We may be certain, anyway, that all the startups will spend all the money they get. There will be no power generated. The investors will not get any of their money back.
That would be fine, except some of the investors are pension funds.
As I understand it that's largely because some of their equipment is outdated or not optimised. Building a new facility from scratch would result in a much lower power consumption from the grid.
given how long it takes to build these places, I feel like the number of iteration cycles is an important metric, especially when other options are not standing still.
I do understand that with more focus things can happen faster but you can really only pour so much concrete per day. Hopeful that we can figure out "leaner" ways to get this done.
I would love to see progress on this stuff, just don't like the idea of betting on successive megaprojects in the age of "a website is hard".
If it takes 300 MJ one time in the future to start a long-term reaction, that's a price I think we would all pay. The fact that we proved the idea now makes this just an engineering problem.
I'm not trying to undermine this, but more just to act as a reminder to some folks: there is no such thing as "just an engineering problem". These things do not exist in a vacuum. It is part of a tripod along with economics and politics. If any one of these three buckles, the whole thing doesn't happen. Concorde was an engineering problem we solved, but the economics and politics around it made it a white elephant. Here's hoping this doesn't happen here.
Now add in the 50% efficiency of conversion of heat to electricity (and that's very optimistic), and it's only 1.5 megajoules, or 0.5%, right where I calculated it based on initial information.
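A minimal sketch of that end-to-end chain, using the ~300 MJ wall-plug figure quoted elsewhere in the thread and the 50% thermal-to-electric assumption above:

```python
# End-to-end arithmetic: wall plug -> laser -> fusion -> heat -> electricity.
wall_plug_mj  = 300.0   # electricity drawn to fire the laser (figure quoted in the thread)
fusion_out_mj = 3.15    # fusion energy released
heat_to_elec  = 0.5     # assumed (optimistic) thermal-to-electric conversion efficiency

electric_out_mj = fusion_out_mj * heat_to_elec
print(f"electric output ~ {electric_out_mj:.2f} MJ")                # ~1.6 MJ
print(f"end-to-end gain ~ {electric_out_mj / wall_plug_mj:.2%}")    # ~0.5%
```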
IANAP, but I see no path forward to sufficient Q-total using plasma fusion to put this to any practical use. Unless the reaction can somehow be self-sustaining, I do not believe this will ever work.
don't you think it's nuts that your arrogance would have you believe that the people on the ground doing the experiment know less about the power consumption and output than you do? Because I do.
There's a lot of supporting stuff as well as energy to drive that stuff that goes into leading edge tech development like this, that does not matter in terms of the reaction itself.
If they say they achieved more output than input, then I will believe them over a random HN comment snob any day of any week.
I'm sorry but you sound much more arrogant here with your strange assumptions. AFAIK all of those things he said are based on the data from the experiment. And I've read the same (i.e. that it's very far from being energy efficient).
> will believe
Wouldn't it be better if you were able to verify things to some degree yourself instead of blindly trusting every expert? (Not at all implying that the people who did this experiment are untrustworthy, but you're bound to run into some bad apples with this attitude eventually.)
The paper was very clear about what was involved, and that information was poorly communicated. The experiment used X energy from a laser pulse to get Y thermal energy. The issue w0mbat referred to is that to be self-sustaining you need to use the output energy to drive the process. Aka the important number is after all relevant losses, including electricity > lasers > fusion > heat > electricity, with each of those stages having losses. The reason they don't communicate end-to-end efficiency is that from a scientific standpoint it's meaningless, as they aren't converting any thermal energy into electricity.
For example p + p Fusion releases neutrinos which then escape any practical device without depositing their energy as heat. This isn’t a concern with DT fusion but again the point is we don’t really care about the actual mass to energy conversion but rather the amount of useful energy obtained.
none of what anyone said is about a self-sustaining reaction.
this is about more power leaving the reaction chamber than what entered it. that's all the announcement is about.
this is NOT about how much energy it takes to ready the lasers. this is NOT about the electricity consumed by lighting, computers, cooling, or measurement or anything else-- none of that counts when you are measuring the efficiency of the reaction itself.
this is about more energy leaving the reaction chamber than went in.
understanding that is key to understanding the significance of the announcement, and this is significant.
and I maintain that the poster I originally called arrogant is arrogant, because they indicated in their comment that they knew how to calculate reaction efficiency better than the physicists doing the work. I called it arrogant because it is --objectively-- an arrogant position to take.
if that makes me arrogant, then so be it. my arrogance is independent of theirs and has no bearing on comments made before my own, and my comment did not influence theirs. (they were being arrogant before I pointed it out.)
> none of what anyone said is about a self-sustaining reaction.
Actual power plants are self sustaining in the sense that they use the electricity they produce to operate; it's mandatory, though not sufficient, for any commercial fusion power plant.
So, this isn’t about a different way to “calculate reaction efficiency better than the physicists doing the work” he was directly quoting their numbers from the paper. It’s only a question of communicating the meaning of efficiency.
> this is about more energy leaving the reaction chamber than went in.
The exact same energy was there before and after fusion only it’s form changed. It might seem pedantic to point that out, but if you don’t make it clear people will misunderstand.
Also, the applied laser energy also leaves the reaction chamber so any fusion would be net positive thermal energy by that yardstick.
> Actual power plants are self sustaining
Actual power plants are not necessarily self-sustaining and practically none of them can start if the grid power is not available. See: black start power plants and https://youtu.be/uOSnQM1Zu4w
If you look at the mass before of their fuel, 1 deuterium atom + 1 tritium atom:
2.01410177811 u + 3.01604928 u = 5.03015105811 u
vs the mass of the fusion products of 1 helium atom and 1 neutron:
4.002602 u + 1.00866492 u = 5.01126692 u
You'll notice that even though we started with 2 protons and 3 neutrons and ended up with the same number of nucleons, some mass has gone missing because the products are more tightly bound. That missing mass is the energy released by the fusion reaction, via E = mc^2. Here the mass difference is:
5.03015106 u - 5.01126692 u = 0.01888414 u
Converting that to energy, you find that it is 17.6 MeV. As you go up the periodic table fusing nuclei, you get less and less marginal energy until you reach iron, at which point fusion becomes net negative and fission takes over: breaking nuclei apart releases energy, marginally more as you go up the periodic table. That's why you want to fuse very light nuclei and fission very heavy nuclei. It is also why there is so much iron, as it is kind of the base state of both of these reactions.
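For anyone who wants to re-run the arithmetic, here is the same mass-defect calculation in a few lines (mass values as above; the conversion factor is roughly 931.494 MeV per atomic mass unit):

```python
# D + T -> He-4 + n: energy released from the mass defect (E = mc^2).
m_D   = 2.014101778    # u, deuterium (atomic masses; the electrons cancel on both sides)
m_T   = 3.016049281    # u, tritium
m_He4 = 4.002602       # u, helium-4
m_n   = 1.008664916    # u, free neutron
U_TO_MEV = 931.494     # MeV of energy per u of mass defect

mass_defect = (m_D + m_T) - (m_He4 + m_n)
print(f"mass defect: {mass_defect:.6f} u")                    # ~0.018884 u
print(f"energy released: {mass_defect * U_TO_MEV:.1f} MeV")   # ~17.6 MeV
```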
Tangentially related, but I think this is an interesting fact: essentially all the atoms in our universe/galaxy/solar system heavier than helium, up to the mass of iron, were formed in the cores of stars by stellar fusion. Hydrogen fuses into helium, and as a star nears the end of its lifetime you get heavier elements like carbon, oxygen, and so on. Under normal stellar fusion no elements heavier than iron are produced, and iron is only element number 26. If you just looked at nucleosynthesis through the lens of stellar fusion, it wouldn't be obvious that there should be any heavier-than-iron atoms at all in the universe.
These heavier-than-iron elements are created in a very interesting and exotic process. When a large enough star dies it explodes in a supernova, and a huge amount of energy and neutrons is released in a very short period of time. The supernova provides enough energy and neutron flux that small amounts of heavier elements like gold, platinum, etc. are created through exotic nuclear reactions (rapid neutron capture), even though forming these heavy elements absorbs energy.
It's interesting to think when you're wearing jewelry made from gold or platinum, all of those atoms in your jewelry were created during the death of a star.
“The nitrogen in our DNA,
the calcium in our teeth,
the iron in our blood,
the carbon in our apple pies
were made in the interiors of collapsing stars.
We are made of star stuff”.
– Carl Sagan
"We have calcium in our bones,
iron in our veins,
carbon in our souls,
and nitrogen in our brains.
93 percent stardust,
with souls made of flames,
we are all just stars
that have people names"
Nikita Gill
I understand this is a poem that is focused on artistic expression and not scientific accuracy, but I find the line about “carbon in our souls” to be out of place. I guess the rest of the poem is incidentally correct (when not abstract)
I think it might be an allusion to alchemy. Basically, the alchemists believed that ash (what was left after burning something) was the soul of all things...And-- this is where my complete lack of understanding about science shows-- I'm pretty sure Ash has lots of carbon? It's, you know, poetic. Many have claimed that poems are the "language of paradox" so it's okay for it to be a little non-literal. My interpretation of it, though, is that the soul is something impure that you must burn away, or maybe that the soul is polluted by our own words and behavior. It's definitely not meant to be scientifically accurate.
Sure, the word "soul" comes from the Proto-Germanic "saiwiz" (for sea or ocean).
But not because “you are like a drop in the ocean,” but because “you are like an ocean in a drop.”
The idea of soul can be objectionable when it is based on an immortal being or on a vitalist life-force (like “anima” of the Latin). But it seems fine when it is based on the psyche (like the “Psuche” of the Greek).
I embrace taboo words like soul because they 1. are common 2. are useful for referring to things that seem pretty important (like avoiding soulless companies or products or buildings) and 3. are challenging to my normal (scientific) understanding of the world.
Still, I’d be more comfortable if the poem referred to the “carbon of our souls” rather than “carbon in our souls.” Hmm…
You could define soul as the fuel engine for life, which is basically burning carbon. As long as that furnace is functioning you're alive == you have a soul.
You and I are complicated but we're made of elements
Like a box of paints that are mixed to make every shade
They either combine to make a chemical compound or stand alone as they are
A quadrillion now, but costs will come down, and once something is turned to gold, it stays gold. In the far future when all gold has been mined, this will be the only process left to get more. Or explode stars and capture gold from them.
I joke with my son all the time about turning mercury into gold (he works in the nuclear industry).
Once you get past the enormous energy costs to do this, you have a secondary problem: all the gold produced this way is radioactive, and it beta decays to... mercury.
Actually, current modeling has supernovae as only a small contributor to the measured abundance of heavy nuclei. These guys tend to come from a more exotic source still: material thrown off as a neutron star is tidally disrupted during a merger event with another neutron star or black hole.
I'm not sure but this has some interesting info such as-
>Some whole galaxies have average metallicities only 1/10 of the Sun's. Some new stars in our galaxy have more metals in them than the original solar nebula that birthed the Sun and the planets did. So the amount of "metals" like oxygen and carbon can vary by a few orders of magnitude from star to star, depending upon it's age and history.
Phosphorus is supposed to be rare in the wider world. Considering how central it is to everything important, life might find it very difficult to start in our temperature range without it.
I'm always fascinated by the sheer and unfathomable amounts of energy that is thrown around in these events. Just thinking about the fact that a single spoonful of neutron star matter contains more mass than Mount Everest fills me with wonder about the world we live in.
What happens when a tablespoon of neutron soup gets thrown out of the well of a neutron star? Does it suddenly expand to the size of Everest? Where do the electrons come from?
In terms of "where do the electrons come from", an ordinary neutron in free space has a half-life of about 10 minutes, decaying through beta decay which produces a proton, an electron and a neutrino.
That sounds fairly energetic... So after 10 minutes you'd have some odd mix of heavy elements, probably approaching a decent fraction of the volume of Everest: half the volume of Everest x (density of granite / density of lead).
That kind of expansion rate has to rival any explosion imaginable.
It's essentially a giant atomic nucleus, so absent a star's worth of gravity holding it together it's going to decay rapidly into stable isotopes. So essentially it would act more or less the same as a huge fission bomb of the same mass.
I'd imagine some of the energy and degenerate matter consisting of neutrons would convert to protons and electrons, and nucleosynthesis would take place to form elements.
I have no idea, though, but I'm pretty sure I watched a video about this.
So that means that for life to form, we probably need a star to die so that the heavier atoms used in complicated life-forming chemical reactions exist (correct me if I am wrong here, as what I'm about to say depends on it). Hence it could be the case that, if the universe is 13.5 billion years old, then we humans are appearing in the universe at the earliest possible time.
13.5 billion years seems like the time required to create a star, have the star die and blow up, have all that material settle and create a new star, then have the planets form, then have enough time pass on one of those planets for life to form, then complicated life.
Not necessarily. First-generation stars were, theoretically, enormous, both due to the low metallicity of the collapsed medium and a higher average density of that medium. These stars' lifespans were extremely short, shorter than the blue giants we see today. So supernovae due to the deaths of these stars happened fairly early in the lifespan of the universe (we're talking a few million years after the big bang).
Therefore, life could have developed in a few tens to few hundreds of millions of years after the big bang. That's still true even if we assume that heavier elements are created mainly when neutron stars collide and not by super/hypernovas as we theorized before LIGO/Virgo observatories.
Consequently, we likely are not a "progenitor" civilization in the universe if we only consider planets formation. We might not see anyone out there either because there's a great filter for intelligent life to emerge (so the bottleneck is in our past) or because few/no civilizations get to have an impact on their host stars (the filter is in our future) that would allow us to see them.
Basic life (single-celled?) requiring the elements above lead might have a chance at that time, but complex life like us wouldn't do so well if there were still supernovas going off left and right. There's a theory with decent evidence that at least one of the mass extinctions was caused by a supernova: https://www.space.com/supernova-caused-earth-mass-extinction...
That being said, I wasn't aware of how LIGO changed the understanding of how heavier elements are usually formed, guessing it changed the expected neutron star prevalence? Do you have any additional reading on that?
You are right about supernovae hampering the evolution of life, but it's unclear how long the fireworks lasted. In my comment I argued that it is possible for the conditions for life to have emerged much earlier than 13.5 billion years in. Not that it necessarily happened.
Regarding the second point, have a look at https://www.ligo.org/science/Publication-GW170817Kilonova/in... . That isn't my field of specialization, so I am not sure about recent publications. At the time, though, this was a big deal, as kilonovas seem to be the primary source of heavy nuclei in the universe. That particular event created between 1/100th and 1/1000th of a solar mass worth of heavy (heavier than iron) nuclei. This is a greater rate than supernova estimates.
> how LIGO changed the understanding of how heavier elements are usually formed, guessing it changed the expected neutron star prevalence?
It's not about the prevalence, but about the light curves observed during the event AT 2017gfo. They indicate significant heavy-metal ejection and, interestingly, also production.
> mergers of neutron stars contribute to rapid neutron capture (r-process) nucleosynthesis
I'm just a layman but I believe by the time our sun has formed, we've gone through multiple star cycles. The early stars were very pure - made basically purely of hydrogen (maybe some helium?). They were huge, burned very bright and died comparatively quickly. Each time stars died, more heavy elements (and heavier elements than before) were produced. Over time the heavy element content (called metallicity) has increased in all stars. I believe there are also theories of white dwarf mergers undergoing runaway fusion and a lot of heavy elements being generated during the explosion.
You raise an interesting question though: what is the earliest point in time where the heavy elements were abundant enough for life (as we know it) to form? Just because we started existing at +13.5 billion years, it doesn't mean carbon-based life couldn't have formed much earlier.
Very much a layman also, however, funnily enough I was listening to a BBC programme called In Our Time a couple of nights ago, where a similar topic was discussed. One comment was that life is carbon based, and for carbon to exist a star has to die, so yes, we are therefore in the early stages. Will try to find the episode…
I have zero ability to answer your question, but I would love to know more about this. If life (like we know it) requires the explosion of aged stars, what is the earliest it could have appeared? What is the minimum time needed to form, grow and explode a single star? Has there been time for this to occur tens or hundreds of times since the Big Bang? (Obviously they can happen in parallel, but I'm thinking about how many in series.)
> 13.5 billion years seems like the time required to create a star, have the star die and blow up, have all that material settle and create a new star, then the planets are formed, then enough time on one of those planets needs to pass for life to form, then complicated life.
Maybe for a main sequence star, but there are other processes that involve nucleosynthesis.
Iron is always spoken of as the dividing line, but I'd like to know whether iron is exactly on the line, on one side (which?), or it depends. IOW, does fusion of iron atoms release energy (hydrogen side of the line), absorb energy (uranium side of the line), neither, or either (depending on conditions)?
That's because the periodic table is essentially about chemistry (i.e. about electron orbitals), not about nuclear physics (i.e. the atom's nucleus). For example it doesn't talk much about isotopes, aside from usually reporting the average atomic mass.
It will happily fuse further. It just won't support the outside of a star against gravity, while doing it. So the star collapses, fuses lots more stuff even heavier than iron, and then explodes. Most of the iron and heavier stuff fuses into the core of a neutron star, the ultimate in energy-consuming fusion.
Iron will not happily fuse further because this NEEDS energy and where would that energy come from?
"heavier than iron" elements are produced when a star explodes because that collapse produces enormous amounts of energy.
During the collapse, the outer edge of the star is accelerated to something like 20% of the speed of light, that is an ENORMOUS amount of energy slamming down on the core.
Lastly, neutron stars don't produce energy; they are the incompressible remnants of a dead star.
You answer your own question: the energy for further fusion, all the way to neutron degeneracy, is provided by gravitational collapse. The outer layers fusing provide energy for the explosion.
If I understand this right, I think some of the lighter than iron elements are also created in those exotic processes. But yes, unlike for the heavier-than-iron elements, the lighter ones are _also_ created in normal stellar fusion.
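To put rough numbers on the question above about which side of the line iron sits on: using measured total binding energies for the light nuclei, and an assumed ~8.5 MeV per nucleon for a hypothetical A=112 fusion product (the values and reaction choices below are mine, for illustration), helium burning is comfortably exothermic while fusing iron with iron costs tens of MeV. Capturing light particles (an alpha or a neutron) on iron can still be mildly exothermic; it's pushing iron-group nuclei together, and the burning chain as a whole, that stops paying.

```python
# Q-values from total binding energies: positive Q = energy released.
BE = {             # total nuclear binding energy, MeV (rounded tabulated values)
    "He-4": 28.3,
    "C-12": 92.2,
    "Fe-56": 492.3,
}

# Helium burning: 3 He-4 -> C-12 (the triple-alpha process).
q_triple_alpha = BE["C-12"] - 3 * BE["He-4"]
print(f"3 He-4 -> C-12:         Q = {q_triple_alpha:+.1f} MeV")   # ~ +7 MeV, exothermic

# Hypothetical Fe-56 + Fe-56 -> A=112 nucleus, generously assumed to be
# bound at ~8.5 MeV/nucleon (about as good as stable A=112 nuclei get).
q_iron_on_iron = 8.5 * 112 - 2 * BE["Fe-56"]
print(f"Fe-56 + Fe-56 -> A=112: Q = {q_iron_on_iron:+.1f} MeV")   # tens of MeV *absorbed*
```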
And on a total tangent, this fact played a part in worldbuilding done by the author L. E. Modesitt, Jr.
> When I initially decided to write The Magic of Recluce in the late 1980s, I'd been writing science fiction exclusively... I conveyed a certain dismay about the lack of concern about economic, political, and technological infrastructures in various fantasies then being written and published in the field...
> I faced the very real problem of creating a magic system that was logical... Most fantasy epics have magic systems. Unfortunately, many of them, particularly those designed by beginning authors, aren't well thought out, or they're lifted whole from either traditional folklore or gaming systems and may not exactly apply to what the author has in mind.
> I began by thinking about some of the features and tropes of traditional fantasy. One aspect of both legend and folklore that stuck out was the use of "cold iron" to break faerie magic, even to burn the creatures of faerie, or to stand against sorcery. Why iron? Why not gold or silver or copper? Not surprisingly, I didn't find any answers in traditional folklore or even contemporary fantasy. Oh, there were more than a few examples, but no real explanations except the traditional ones along the lines of "that's just the way it works."
> For some reason, my mind went back to astronomy and astrophysics and the role that nuclear fusion has in creating a nova... Each of these fusion reactions creates a heavier element and releases energy... The proton-proton reaction that produces iron, however, is different, because it is an endothermic reaction...
> At the same time, the fact that metals such as copper or silver conducted heat and electrical energy suggested that they were certainly less than ideal for containing electrical energy. Gold and lead, while far heavier than iron, do not have iron's strength, and other metals are too rare and too hard to work, particularly in a low-tech society.
> At this point, I had a starting point for my magic system. I couldn't say exactly what spurred this revelation, but to me it certainly made sense. Iron can absorb a great amount of heat. If you don't think so, stand on an iron plate barefoot in the blazing sun or in the chill of winter. Heat is a form of energy. In fantasy, magic is a form of energy. Therefore, iron can absorb magic and, by doing so, bind it.
Do we know how much tritium is needed for a city's energy generation? What about a state, etc.? Reason I ask is that the only uses I have seen for tritium are on old watch dials made pre-'90s. Curious how much of this resource is out there.
Tritium has a half-life of 12 years, so there is no deep pool of tritium upon which to draw. The primary source of tritium on Earth is cosmic-ray interactions in the upper atmosphere, which produce about 7.5 kg of tritium a year worldwide.[1]
Isn't that going to cause a serious problem if it requires 323 kg/yr of tritium just to power New York City?
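For scale on the consumption side, a back-of-envelope (my assumptions: D-T fuel, the standard 17.6 MeV per reaction, one triton burned per reaction, 40% thermal-to-electric conversion):

```python
# Tritium burned per year to sustain a given fusion power.
MEV_TO_J = 1.602e-13
U_TO_KG = 1.6605e-27
SECONDS_PER_YEAR = 3.156e7

E_PER_REACTION = 17.6 * MEV_TO_J     # joules per D-T fusion
TRITON_MASS = 3.016 * U_TO_KG        # kg per triton burned

def tritium_kg_per_year(thermal_power_w: float) -> float:
    reactions_per_s = thermal_power_w / E_PER_REACTION
    return reactions_per_s * SECONDS_PER_YEAR * TRITON_MASS

print(f"{tritium_kg_per_year(1e9):.0f} kg/yr per GW of fusion (thermal)")          # ~56 kg
print(f"{tritium_kg_per_year(1e9 / 0.4):.0f} kg/yr per GW of electricity at 40%")  # ~140 kg
```

So a city drawing a few gigawatts of electricity would burn on the order of hundreds of kilograms a year, which is why the breeding discussed below is treated as essential.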
Apparently no, because it can be made by the fusion process itself, via contact with lithium, and there's enough proven reserves of the latter to supply us for 100s of years.
Nobody knows how to get tritium at PPB concentration from the thousand tons of radioactive molten FLiBe it is dissolved in. You have to process all of it every day because you need that tritium for fuel tomorrow.
> According to a 1996 report from Institute for Energy and Environmental Research on the US Department of Energy, only 225 kg (496 lb) of tritium had been produced in the United States from 1955 to 1996.[a] Since it continually decays into helium-3, the total amount remaining was about 75 kg (165 lb) at the time of the report.
The concentration of deuterium in the ocean is about 150-160 parts per million of hydrogen atoms, and with roughly 1,234 quintillion litres of seawater covering the Earth that works out to something like 5e16 kg of it to extract, so we've got a bit to work through!
Tritium, however, is far more rare, with only trace amounts of it available in nature and barely more than a kilogram produced per year. Producing the hundreds of kilograms required per year still seems to be an unsolved problem, although my quick searching shows there are a couple of viable solutions for it.
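The deuterium figure above, spelled out (assuming ~1.33e21 litres of seawater, water being ~11% hydrogen by mass, and deuterium making up ~156 ppm of hydrogen atoms):

```python
# Deuterium inventory of the oceans, back-of-envelope.
OCEAN_VOLUME_L = 1.33e21        # litres of seawater (rough)
SEAWATER_DENSITY = 1.025        # kg per litre
HYDROGEN_MASS_FRACTION = 0.11   # water is ~11% hydrogen by mass
D_ATOM_FRACTION = 156e-6        # deuterium atoms per hydrogen atom
D_MASS_FRACTION_OF_H = D_ATOM_FRACTION * 2.014 / 1.008   # ~3.1e-4 by mass

ocean_mass = OCEAN_VOLUME_L * SEAWATER_DENSITY
deuterium_mass = ocean_mass * HYDROGEN_MASS_FRACTION * D_MASS_FRACTION_OF_H

print(f"ocean mass:     {ocean_mass:.1e} kg")
print(f"deuterium mass: {deuterium_mass:.1e} kg")   # ~5e16 kg, i.e. tens of trillions of tonnes
```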
The solution is that fusion power plants can breed tritium and become net producers of it...
Though in practice enough will be lost that probably they'll still be somewhat net consumers-- just not nearly to the extent predicted by a simple thermodynamic model.
Still, even if fusion becomes a net producer of tritium, the whole tritium-is-hard-to-get problem will likely be a constraint that we'll be fighting as we ramp up use of fusion power in the future.
That such a small amount of matter could generate so much power is pretty remarkable. I might be way off base but from what I can find online it seems you'd need over 2000 times as much uranium?
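Per kilogram of fuel actually burned, the gap is smaller than that; the much larger multipliers usually compare against mined natural uranium in a once-through cycle, where well under 1% of the uranium ever fissions. A rough comparison, assuming 17.6 MeV per D-T reaction and ~200 MeV per U-235 fission:

```python
# Energy density of D-T fusion fuel vs U-235 fission, per kg of fuel consumed.
MEV_TO_J = 1.602e-13
U_TO_KG = 1.6605e-27

def j_per_kg(mev_released: float, fuel_mass_u: float) -> float:
    """Energy released per kilogram of fuel actually consumed in the reaction."""
    return (mev_released * MEV_TO_J) / (fuel_mass_u * U_TO_KG)

dt = j_per_kg(17.6, 2.014 + 3.016)   # deuterium + tritium
u235 = j_per_kg(200.0, 235.0)

print(f"D-T fuel : {dt:.2e} J/kg")        # ~3.4e14 J/kg
print(f"U-235    : {u235:.2e} J/kg")      # ~8.2e13 J/kg
print(f"ratio    : {dt / u235:.1f}x")     # roughly 4x per kg of fuel burned
```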
Tritium is very popular on gun sights as well- as it's a glow in the dark sight that doesn't need to be charged. I'm now questioning the practice of appendix carrying with tritium sights.
Tritium decays by beta-decay (an electron). The electron can not travel very far in air (1/4 inch), and is stopped by even the thinnest piece of metal. It's even stopped by the dead outer layer of your skin.
Not quite: beta decay will penetrate the skin enough to damage living tissue - beta burns are what caused the fatalities of the Chernobyl first responder fire fighters.
They spent a few hours covered in dust on their coats, and did a bunch of subsurface skin damage which manifested as third degree burns. Sepsis, not radiation poisoning, generally killed them.
Well, it lasts for several years, but considerably less than even a human lifetime: tritium's half-life is only about 12 years, so gun sights, dark-proof glow-in-the-dark signage (usually reserved for critical industrial plants, ships and offshore platforms due to expense), etc., will become seriously degraded in just a few years. (The glow is directly proportional to the remaining low-level beta radioactivity, which can barely penetrate the glass envelope in the first place - you'd get more radiation (from radium) living in a brick house than carrying one 24/7.)
FWIW, tritium and a phosphor granule encapsulated in glass microspheres have been developed for self-illuminating runway paint, but again, no one really uses it because tritium is stupid expensive, and again, it loses half its brightness in only a decade.
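The fade is easy to put numbers on, since the brightness just tracks the remaining activity (half-life ~12.3 years):

```python
# Tritium glow fade over time.
HALF_LIFE_YEARS = 12.3

def fraction_remaining(years: float) -> float:
    return 0.5 ** (years / HALF_LIFE_YEARS)

for years in (5, 10, 15, 20, 25):
    print(f"after {years:>2} years: {fraction_remaining(years):.0%} of original brightness")
# after 5 years ~75%, after 10 ~57%, after 25 ~24%
```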
On the other hand, I've been told that Trijicon will replace their tritium gun sights for the lifetime of the original owner. I plan to live long enough to cost them money...
And the (otherwise excellent) channel is supposed to soon post an adv... I mean informer... I mean "exclusive documentary" about "repeat after me we're totally not a scam - we just play one on YouTube" fusion startup Helion.
Which I'm going to watch, because even though everything I hear about this company gives me insane Theranos vibes... Well, if they pull it off... They might light a bulb with fusion in my lifetime.
If I understand correctly, tritium is the result of bombarding lithium with neutrons.
I'm not sure how tokamaks are expected to work; do you just add lithium and expect the tritium to get where it needs to go to keep the reaction going, or do you actively remove gases from the vessel, filter out the tritium, and re-use it as fuel?
Either way, I don't imagine it'd be too hard to recapture the stuff.
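On the breeding step itself, the reaction energetics can be checked from tabulated atomic masses (the blanket engineering is of course much harder than this sketch suggests):

```python
# Tritium-breeding reaction energetics from tabulated atomic masses (u).
U_TO_MEV = 931.494
MASS = {
    "n":    1.008665,
    "T":    3.016049,
    "He-4": 4.002602,
    "Li-6": 6.015123,
    "Li-7": 7.016003,
}

def q_value(reactants, products):
    """Q > 0 means the reaction releases energy."""
    return (sum(MASS[x] for x in reactants) - sum(MASS[x] for x in products)) * U_TO_MEV

# Li-6 + n -> T + He-4: exothermic, works with slowed-down neutrons.
print(f"Li-6(n,t)He-4  : Q = {q_value(['Li-6', 'n'], ['T', 'He-4']):+.2f} MeV")
# Li-7 + n -> T + He-4 + n: endothermic, needs a fast neutron but gives the neutron back.
print(f"Li-7(n,n't)He-4: Q = {q_value(['Li-7', 'n'], ['T', 'He-4', 'n']):+.2f} MeV")
```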
> It is also why there is so much iron as it is kind of the base state of both of these reactions.
I always thought this is the interesting takeaway, both fusion and fission funnel their constituent matter towards iron, as the final stable state of matter. Far future civilizations will have a lot of iron on their hands.
It's not "creating" energy, it's releasing energy.
The original atoms (exactly which atoms depends on the reactor, but let's assume it's deuterium and tritium) have a certain starting energy. When you fuse them together, the resulting atom (helium-4, if you start with deuterium and tritium) is in a lower energy state.
Since the fused atom has lower energy than the input atoms, the fusion reaction releases the difference in energy, which you can then capture.
You're using X energy to release Y energy from the fuel (the pellet, containing deuterium and tritium). It was there already, just not in a usable form.
You have a piece of wood. You ignite it with a match. This causes a self sustaining reaction in the form of fire that releases far more energy than the match could ever create.
This is effectively what is happening with any energy generator.
Mass and energy are equivalent. You're using X energy to reduce the mass of your fuel and converting that mass into Y energy. When X < Y you have useful energy production at the cost of the mass of your fuel. Energy is conserved.
The reason nuclear fusion is such a desirable goal is because it only takes a relatively small amount of mass to convert into a relatively large amount of useful energy, and the mass (the fuel) is relatively easy to obtain.
Like all energy generation, it's converting one type of energy into another, more convenient type, to do useful work. Like a hydroelectric dam converting the potential energy of water into more useful electrical energy. Energy is conserved when water spins a turbine, it's just that electrical energy is more useful for work than the potential energy of the water. Of course you can still use the potential (or kinetic) energy of the water directly, such as with a water mill. But the energy to work ratio is worse in that form (especially if the work to be done is far away from the watermill).
Whenever you build a fire you need to input some amount of energy to begin the chemical reaction that releases energy. In this instance we get not electrical energy, but energy in the form of infrared and visible light, to heat our home and light our way. Yet the total energy released by the fire far surpasses the energy you used to start the reaction, but because the wood's mass is consumed, energy is ultimately conserved. You have converted wood (not useful for heating your home) into infrared light (useful for heating your home).
Mass is energy at rest, hence the equivalence, with the exception of massless particles like photons, which have zero mass and non-zero energy. Also, photons travel at the speed of light in vacuum and cannot be found at rest in any frame of reference. Modern physics is fun, isn't it?
P.S. Neutrinos were long thought to have zero mass as well, but neutrino oscillation experiments have since shown they have a tiny nonzero mass (something the original Standard Model didn't account for).
A daily life parallel: You can use a lighter to put a small amount of energy into a bunch of wood to extract more energy than you put it. In the case of fusion, this energy is coming from fusing hydrogen atoms, rather than a chemical reaction.
Energy is released when two atomic nuclei combine to form a larger atomic nucleus.
There's a threshold of energy required to attain this fusion reaction (otherwise there would be no light nuclei in the universe), and once the nuclei combine there's energy that is released, similar to how some chemical reactions can be exothermic in nature.
The Y comes from the mass of matter converted into energy. The mass of the material before the reaction is greater than the mass of its products.
The law you're referring to might be the conservation of energy, but that on its own only covers non-nuclear reactions; the more accurate statement is the law of conservation of mass-energy. In this case the energy in plus the mass-energy at the start is still equal to the energy out plus the mass-energy at the end. For the energy to increase, the mass must decrease to produce the Y in your equation.
The energy is "released" from the binding energy of the nuclei. It's similar to how throwing a bottle of nitroglycerine can generate a huge explosion, even though you use a tiny amount of energy to throw it.
The input energy X is used to create the conditions of high temperature and pressure that are needed for fusion to take place.
When fusion happens, two hydrogen isotopes fuse together into a single heavier nucleus, losing a bit of mass in the process. The mass difference is converted into energy Y (via E = mc²).
In this case, Y was greater than X, so there was a net gain in useful energy.
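For the D-T case specifically, the arithmetic is short (tabulated atomic masses, with E = mc² expressed as 931.494 MeV per atomic mass unit):

```python
# D-T fusion energy from the mass defect.
U_TO_MEV = 931.494
m_D, m_T, m_He4, m_n = 2.014102, 3.016049, 4.002602, 1.008665   # atomic masses, u

mass_in = m_D + m_T
mass_out = m_He4 + m_n
defect = mass_in - mass_out

print(f"mass converted : {defect:.6f} u ({defect / mass_in:.2%} of the fuel mass)")
print(f"energy released: {defect * U_TO_MEV:.1f} MeV per reaction")   # ~17.6 MeV
```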
Except… chemical bonds don’t ‘store’ energy. Molecules are a low energy configuration. It takes energy to rip them apart!
But, O2 molecules, with their double bond, don’t take much energy to break apart. If they do, and then pair up with say a bunch of Hydrogen and Carbon atoms that were nearby in some long chain or something, they form bonds that are stronger - that take more energy to break - and you end up with some leftover energy. Water and CO2 molecules are an even lower energy configuration.
but the extra energy you get wasn’t exactly ‘in’ the oxygen bond though - any more than when you have a ball at the top of a hill it has potential energy ‘in’ it.
sometimes, when two small particles fuse, they become a single larger particle, but the larger particle's mass is slightly less than the sum of the masses of the two smaller particles. The slight difference becomes energy released, and the amount of energy released, roughly speaking, is E = mc^2.
I think it's more like a release of potential energy; kind of like how you can lightly nudge a large object teetering on the edge of a cliff and it'll make a huge splash at the bottom. It took a lot of energy to create the big splash, but you didn't need much to trigger it.
Y is the energy that was holding some hydrogen atoms together which you have liberated (while destroying the atoms in question but that's OK cause it's abundant).
Equally confusing is the BBC article, which mentions that the quoted energy used for the reaction does not include the energy needed to power the lasers, which renders the whole thing a net loss.
> "had to put 500 megajoules of energy into the lasers to then send 1.8 megajoules to the target - so even though they got 2.5 megajoules out, that's still far less than the energy they originally needed for the lasers," says Tony Roulstone of the University of Cambridge.
But it's good to finally see progress. Very few technologies can transform the world the way a practical fusion reactor could.
It was discussed a lot in the threads about this yesterday, but apparently the lab had relatively inefficient lasers. Newer ones are an order of magnitude more efficient
This is what I captured from the press conference:
300 megajoules were used to power the lasers (this is also captured in [1]). They also mentioned that newer lasers have 20% wall-plug efficiency. If so, the target gain needs to reach about 5 (roughly a threefold improvement in yield) to break even relative to wall-plug energy consumption.
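The wall-plug arithmetic spelled out, assuming a hypothetical modern laser at 20% efficiency delivering the same 2.05 MJ to the target (thermal-to-electric conversion and the rest of the plant are ignored here):

```python
# Wall-plug breakeven for this shot, with an assumed 20% efficient laser.
E_target = 2.05          # MJ of laser energy delivered to the target
E_fusion = 3.15          # MJ of fusion yield
wall_plug_eff = 0.20     # assumed efficiency of a modern laser system

E_wall = E_target / wall_plug_eff      # electricity needed to make the same shot
q_target = E_fusion / E_target         # the headline gain, ~1.5
q_needed = E_wall / E_target           # gain needed just to repay the plug, = 1/efficiency

print(f"wall-plug energy per shot:           {E_wall:.2f} MJ")            # 10.25 MJ
print(f"current target gain:                 {q_target:.2f}")             # ~1.54
print(f"target gain for wall-plug breakeven: {q_needed:.1f}")             # 5.0
print(f"yield improvement still needed:      {E_wall / E_fusion:.1f}x")   # ~3.3x
```

And that still ignores converting the heat back to electricity and the rest of the plant, as pointed out below.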
I don't know why but this caused me to picture Alec from Technology Connections in a few years' time, showing off his fusion laser plugged into a Kill A Watt, while he explains carefully, through the magic of buying two of them, why you can get more power out than you put in, and why these old inertial confinement fusors were pretty neat actually.
That's just the lasers, the rest of the plant needs power too. Big water pumps are big power hogs, as is the rest of the supporting equipment that any power plant requires to operate. Far over "wall plug" break even is required for commercial viability.
Just bootstrap a second fusion power plant with the first, then continue on, similar to how compilers for a language can be written in the language itself.
Bootstrapping the power plant isn't the problem. The economics of the power plant are the problem, namely producing a worthwhile surplus of power after accounting for all the power needed by the plant itself.
So if we had the 20% wall plug efficiency, then we need that 3.2:2 ratio to become 10:2 to hit break-even. That looks like a huge gap but maybe there are tricks here that resolve this once you get over the initial gap.
That’s only because they’re using old school flash pumped lasers, not the new solid state lasers you’d use today if you wanted to make a power plant demo.
I'm not really seeing any convincing numbers there. Mercury lasers seem to only be 10% efficient. I get that this is better than the lasers that were just used at NIF, but that still seems pretty far from useful.
Because they are researching inertial confinement fusion, not trying to build a working power plant. The efficiency of the lasers doesn't matter, since it doesn't affect their research.
a typical power price at trading hubs is US$40 per megawatt hour, though this varies considerably depending on many factors and is sometimes actually negative
a typical retail price is US$120 per megawatt hour
I think you're out by some orders of magnitude. With the current energy issues in the UK it'd be under £100. Other things suggest in California it's more like 20 cents per kWh so were you thinking ~$20?
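For reference on the units being compared here (1 $/MWh is 0.1 ¢/kWh, so $120/MWh is 12 ¢/kWh):

```python
# $/MWh and cents/kWh differ by exactly a factor of 10.
def usd_per_mwh_to_cents_per_kwh(usd_per_mwh: float) -> float:
    return usd_per_mwh / 10.0

for price in (40, 120, 200):
    print(f"${price}/MWh = {usd_per_mwh_to_cents_per_kwh(price):.0f} cents/kWh")
# $40/MWh = 4 c/kWh, $120/MWh = 12 c/kWh, and 20 c/kWh works out to $200/MWh
```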
It's a very old lab, and replacing them isn't cheap/easy.
You don't need to use efficient lasers to get the scientific results they're after - other people have already very accurately measured the properties of modern lasers, so we can predict how they would perform without having to actually use them.
That's not the purpose of the research, though. They are solely focusing on the energy transfer between the lasers themselves, and the output from the reaction. It's not clear that higher energies or bigger targets will teach us anything new.
Upgrading the lasers would slow the project down as new hardware is installed and issues are worked out. Not to mention I doubt the new hardware is cheap, and may be more expensive than burning excess energy using old laser tech in the meantime.
Other research groups work on laser efficiency, and the "final product" using this method (if it ever proves viable) would put together all the best pieces to get the best efficiencies.
The electricity bill is like $2000. It'd waste more money for a manager to think about how to replace equipment, at California pay rates. This whole thread is making me lose faith in HN.
Don't take it too personally, but you, and many others here, need to rethink their approach. You see a short tweet without context about a topic you clearly know nothing about (which is totally, fully okay, it's a complex topic), and think you are now able to criticize milestones in this impossibly complex topic.
Not even ask questions, not something like "hey, I saw this tweet, I know it's just a tweet, but can someone help me understand context?", no, you actually go ahead and criticize work that you know nothing about, and when confronted, you double down.
On some level, you must know yourself that it might be better to ask as many unloaded questions as you want, but otherwise sit this one out in terms of assessment.
There is no “context” to understand. Yes, it’s an impressive feat. Yes, other laser designs might fix the huge ignition costs. But that hasn’t happened yet, and until it does, it’s completely fair to point that out.
Will it win me any friends? Probably not. It’s like showing up to a party and saying the reason for the party is mistaken. Very few people care.
But scientists should, and I am one. Doubly so for incorrect reporting to laymen. We have a responsibility to convey what was actually achieved, not what we wish was achieved.
People do seem to be getting emotional about fusion, and pointing that out is hardly edgy.
Once fusion achieves more output than input, I’ll be celebrating right there with you. But until then, ignoring the Doberman in the room is a worse look, from a scientific standpoint.
I even cited a source from someone with a phd in mathematical physics, who is likely far more qualified to be talking about this than most of us here. So in terms of dismissing criticism, the stack seems to be in the other direction.
Scientific reporting matters. Reporting something false is generally a bad idea. Saying “we got more energy out than we put in” is false. Which link in this chain of reasoning is invalid?
> It just seems a little strange to take credit for a milestone when the milestone everyone cares about is yet to be reached. (More energy out than in.)
That comment/criticism is a little strange in and of itself. I would say it's the oddness or seeming petulance of the above comment that brought on boc's comment.
A silly, but illustrative analogy:
Kid: Dad, look! I scored a home-run!
Father: Who cares? Have you won the game yet? Stop celebrating until you do something that everyone cares about!
I agree.
To explain further, my post and the included (noted silly) analogy were made with respect to explaining the aspect of sillysaurusx's post(s) that boc seemed to be criticizing.
Your "honest feedback" is nothing more than naked insults.
Sillysaurusx is right. The "impossibly complex" matter is actually quite simple: Q=1 is little more than a psychological milestone, not some sort of technical tipping point where further progress becomes easier. And they haven't even gotten to Q=1 unless you buy into the justifications they give for their dodgy accounting of the energy they put into it. Commercial fusion is equally simple to state: it needs to put out a lot more energy than you put into it, after you fully account for all the energy you put in. They aren't even close to this.
When a baseball batter hits a ball at a record 120mph, you calculate the impulse of force (∆p) they put into the swing to cause that result, not the total calories the player consumed during the past year in order to build their muscles.
You're arguing that the process of charging some inefficient lasers (aka eating food throughout the year) is invalidating this entire result. That was never part of the experiment, nor is it relevant to this test.
I understand exactly what you wrote above, and I'm telling you that it's not relevant to this discovery. You're arguing a non-sequitur in the classic definition.
Just as a note, since I made the same mistake initially, the person you're replying to didn't make the post from which you are quoting "impossibly complex".
It seems, to me, that boc was criticizing the unnecessarily dour tone of sillysaurusx's previous comment and not the technical aspect of the achievement.
The whole thing seems to come down to whether one interprets the announcement as an attempt to deceive the public at large or simply a celebration of a milestone that many in the fusion research community have been trying to achieve for a long time. I can understand it being interpreted both ways, but I think the more charitable interpretation is that science reporting, in general, doesn't usually properly explain the levels of nuance of various achievements and, as such, something that is genuinely exciting for those in the community is not necessarily as exciting for those outside of it - which comes across as deceptive.
You are entirely correct. I'll just add that it's not only about the energy put in, but ultimately about the cost.
Net positive energy output is the absolute basic requirement. We're not there, we're not close, and even if we were, the hurdle would be to make it economically viable.
I think the confusion here is at least partially due to most articles obscuring the primary purpose of the NIF. It's not supposed to support commercial energy development; it's supposed to support nuclear weapons development under the Nuclear Test Ban Treaty, where testing bombs by setting them off is banned.
So the NIF is supposed to give a testbed to study implosion-created fusion reactions that produce enough energy to "ignite", that is, propagate the reaction to the rest of a hypothetical bomb. In that case, the amount of energy needed for the infrastructure to produce the initial implosion doesn't matter; what matters is that the energy coming out is more than the actual energy that triggered the reaction, so that the hypothetical bomb would blow up and not fizzle.
Exactly! It's so strange this is somehow made to be about fusion as an energy source.
I'm happy to announce that Q>>1 has already been achieved on multiple occasions. In the cores of the thermonuclear devices tested by the US and USSR in the 50s.
(Unfortunately that technology proved unviable as a path to civilian fusion power.)
It is a milestone, and I do think the researchers deserve credit for that. Getting more energy out of the reaction than was delivered to it by the lasers is actually important.
No one (except perhaps poor science "reporters") is claiming that this means we now have free and cheap fusion power. Of course the energy put in to operate the lasers themselves needs to be accounted for -- and it is! -- but that doesn't make what they've achieved useless. It's also useful to remember that the researchers involved are not the people writing press releases and articles; let's not minimize their achievement just because of sloppy, sensationalist reporting.
I like the analogy downthread of a kid being excited about scoring a home run in baseball, but the dad chastising the kid for celebrating before actually winning the game. That's what it feels like is happening here.
This is a huge step in the right direction, and it should be celebrated as such.
It's a significant milestone because demonstrating you can get net energy from the reaction removes a lot of uncertainty of whether it's possible in the real world. It starts to turn inertial fusion into an engineering problem of how you increase the efficiency of each stage.
Worth noting that the milestone achieved was positive Q_plasma (more energy out of the plasma than in).
They are using inefficient lasers because they are cheaper to buy/maintain/modify for research purposes.
Determining the conditions for positive Q_plasma is largely a matter of science/research so the external system doesn't matter as long as the variables are controlled and results are reproducible.
Once positive Q_plasma is well understood/reproducible, achieving positive Q_total (more energy produced than spent running the infrastructure) is just a matter of engineering and potentially waiting for the SOTA for components (like lasers or materials) to catch up.
TLDR: This is the scientists proving the theory. Now it's the scientists' job to refine the theory. Then the engineers get to put it into production.
I can't agree that funding is "largely the reason" why NASA takes so long to do anything. I doubt funding is a top 3 reason.
NASA just isn't about high-risk / high-reward "moonshots" anymore. The overarching political environment doesn't allow it, never mind the office politics.
NASA will get back to the moon using easily an order of magnitude more funding than it should have taken, with a launch system that costs an order of magnitude more money for each launch than it should. (almost two?)
Have to +1 this. A lot (most?) of NASA's funding is directed toward keeping people employed and skilled, as opposed to accomplishing goals, as with a lot of government money. NASA could do a LOT more with the funding they already have, if they were willing to divest from older technologies and vendors, but the politics of its funding doesn't allow that.
I agree however that culture was caused by a lack of funding.
You can't be swift and lean when you are given very limited, budgeted funding. You can't take risks or you risk putting people out of a job and killing the program.
That leads to an overly conservative culture that restricts any risk taking and over-engineers everything to the point failure is effectively impossible.
This slow movement, overly conservative, design by committee approach helps limit risk but it absolutely balloons costs in the long run and horrifically delays progress. Of course if they were a company they'd eventually run out of money but that's not really an option for gov orgs so when the overly conservative, limited run designs end up encountering production issues, the projects explode in cost with nearly no upper limit.
TLDR: The political climate is a direct consequence of the lack of budget and continued restriction of that budget only worsens the problem.
> I can't agree that funding is "largely the reason"
> NASA just isn't about high-risk / high-reward "moonshots" anymore. The overarching political environment doesn't allow it, never mind the office politics.
Why doesn't the political environment allow for it? What could happen? What could regulatory bodies do to NASA for taking a risk and failing? What sort of constricting change could political bodies make in such a situation?
The senator who secured the funding became the administrator for the current moon attempt. The funding insists on using the old technology. All of this sounds bad, and NASA might do better with a freer hand. But then the money would not flow back to the states…
"Funding" isn't really a good answer IMO. I don't know a ton about Fusion research specifically, but NASA is horrifically inefficient with money compared to private competitors. Giving them more money won't magically make them more efficient. Reasons why include:
- Their incentive is to optimize for political approval, which means spreading facilities among as many congressional districts as possible, which creates a ton of inefficiency from poor communication and the need to constantly ship things around
- Public approval is the goal and failure is the worst possible option, so things tend to be optimized to take as few engineering risks as possible and have huge amounts of bureaucracy to spread the blame for any possible failure
There's a reason why SpaceX started landing rockets with a fraction of the money that NASA spent on building ridiculous boondoggles.
>> Their incentive is to optimize for political approval, which means spreading facilities among as many congressional districts as possible, which creates a ton of inefficiency from poor communication
Ummm.. I thought remote work was no less efficient?
Remote work is fine in some circumstances. One of the circumstances where it is definitely not fine is in designing, manufacturing, and testing high-precision aerospace hardware. You aren't gonna put a 5-axis CNC mill in your garage.
You are comparing a company that makes trucks with a company that makes precision scientific instruments, and you are declaring that the truck company is more efficient per kilo of product. This is stupid.
NASA develops nuclear reactors, has landed on Titan and has reached Pluto. SpaceX's vehicles have never left the Earth-Moon system.
SpaceX is not Tesla. It's disingenuous to call SpaceX a "company that makes trucks". Just like NASA, they also make precision scientific instruments. They flew the first privately funded mission to the ISS and run a massive satellite constellation.
They may not have the same accomplishments as NASA, but they're far from a "company that makes trucks".
You think their satellite fleet has absolutely no precise equipment for knowing and maintaining its position? Or for communicating with terminals and each other?
Like, come on. I get shitting on Musk is the cool new thing, but this is genuinely the case where SpaceX is doing cool things in space, and at an extremely fast pace. Get over yourself if you can’t see through your Musk hate and only see them as “a company that builds trucks”.
The analogy is apt in at least defining a separation between the overall complexity of what SpaceX produces compared to NASA, to say something of how the two different models of R&D work, but maybe off in degrees as you discussed.
"NASA makes precision scientific instruments and SpaceX makes precision scientific instruments that have higher tolerances with a higher focus on throughput, and there are rapidly diminishing returns in how much funding can be used to close the gap" is probably the right take if not as fun.
One of the things that I think I noticed from the press conference, is that funding is going to be the bare minimum to meet some goal for a design they select.
This seems like a gross mistake.
If we are going to avert a climate catastrophe we will need TW of power to "unburn" the carbon we put into the environment (ocean and atmosphere). Instead of barely hitting this target, we should over deliver since we are running out of wall-clock time.
Every project that meets a bar for feasibility, organizational/operational capabilities (if they dont have it, either fix it, or transfer design to capable team) should be given funding (50-100M). We should be dropping BILLIONS on this, if we can drop 50B+ on semiconductors we can do the same for fusion.
Dump trillions of dollars into fusion energy today and it will still be decades before the first fusion power plant is connected to the grid. You'd be better off funding the construction of fission power plants. Those are very expensive and take years to build, but they're still a hell of a lot cheaper and faster than funding fusion to the degree you're suggesting.
Each dollar diverted to chase nuke wills-o'-th'-wisp brings climate catastrophe nearer.
Money is fungible. Dropping $billions on this means not dropping those $billions on something that works already, works fantastically well, and would work even better with more money. We already know how to prevent (more) climate catastrophe. We just need to do more of it.
Fission means, in practice, paying enough for coal generation, over the decade, to have built enough solar to displace the nuke; and paying many times that, on top, to build the nuke.
So, no. Each dollar diverted from building out solar to mining coal or fooling with nukes brings existential catastrophe nearer.
Simulations and estimations about processes in the physical world always leave room for surprises when one is doing things that haven't been done before.
I think a major bottleneck for fusion research from the lay public (including myself) is lack of interest.
Fission reactors work really well and have been around for 50+ years. If we are going to go nuclear instead of renewable, we need to address the elephant in the room.
The elephant in the room is: why not just fission?
And somehow harnessing the power of the literal sun will have a better safety rate? It's all speculation at this point because fusion doesn't exist yet... but it does seem like a huge undertaking to get superior safety over fission.
It’s not speculation because so much is known about the physics and elements involved. With fusion you want lighter particles which tend to be less radioactive, rather than heavy uranium etc particles for fission.
"Regulatory bodies have vast experience in the realm of safety and security for fission. We are working with them to ensure that all applicable knowledge is transferred to fusion."
Let me put the question another way: suppose we get fusion that's equally as safe as fission like IAEA is saying here. Why would we switch to it if we were unwilling to switch to fission?
You’re making false equivalences, and ignoring all of the fundamental differences between the technologies. The materials involved are different, the chain reactions are different, the kinds of radiation and half lives are different, the way you build the cores are different (and are still being worked on for fusion).
The entire reason why people are working on fusion IS the fact that it’s much much safer due to all of the key differences. That doesn’t mean there aren’t lessons from fission to transfer, just as lessons from ICE cars have gone to EVs.
This is like saying thread count is always the bottleneck in computation. More money allows more parallelism, as you can pay for more people and more equipment for more research. As in computing, there are diminishing marginal returns, and surely a version of Amdahl's Law for human endeavors.
> thread count is always the bottleneck in computation
The softer the bed linen, the more rested the computer scientists will be and the more likely they are to come up with novel solutions that lead to faster computing.
I would have to agree. The "in general" though is carrying an enormous amount of weight in that statement.
I think what other commentors may be getting at is that in many cases the simple analogy of asking how 9 women can have a baby in 1 month is instructive here. You could throw trillions at that problem, a need to have a baby in 1 month. Sometimes there are hard limits that money has a hard time addressing.
A case could be made that with enough money put towards advanced technology, like gene therapy to force a fetus to maturity in 1 month vs 9, it could be done with horrendous side effects.
So to your point money does solve all problems, but I think diminishing returns is putting it very lightly.
I believe the Manhattan project (where we basically built an entire new city, and entire new manufacturing process from scratch: mining operations, refineries, enrichment, milling, etc.) cost less in constant dollars than the stealth fighter.
The talent + purpose (which drew the talent) was what defined Manhattan. The money came easy when all of the greatest minds in the country were pointing a giant flashing light in one obvious direction.
This is big news but fusion will largely be a product of the people it attracts. The people can do the job attracting money and other talent if it's justified.
I've seen the Canadian gov try to throw money at trying to build a local tech scene and it all went to hucksters, old school finance suits with megacorp resumes, and administrators. While all the tech talent just kept going to SF where the capital was going into high risk ventures... not expensive buildings, events, 'entrepreneur/small business programs', and propping up old school D-round investors. Money is easily wasted even when the pursuit sounds noble and valuable on the surface.
Yesterday's news was that this result was leaked to the FT with apparently preliminary numbers. And today there was a press conference that had somewhat better-looking numbers.
I wonder if it might be possible to gain not percentages but orders of magnitude more or less just by making the targets bigger. Is it conceivable that the same basic approach and a comparable amount of input energy could be used to ignite a 100 MJ or even 1 GJ target ? Of course that would present some containment challenges but perhaps not insurmountable ones.
I'm also a bit concerned that this type of research may encounter national security related obstacles. Obviously a pure fusion bomb would be a game changer for nuclear (non-)proliferation.
I don't think a pure fusion bomb would have any advantage compared to the current hydrogen bombs. They wouldn't produce more energy, but would need more gear to reach ignition.
A pure fusion bomb would produce less (not zero) fallout. Neutron activation would still produce some fallout, but you wouldn't have the fission byproducts like caesium-137, iodine-129 or strontium-90.
This is probably a bad thing; politicians might decide the bombs are clean enough to use.
Even without actual radioactivity, pure fusion bombs would still be politically radioactive. Look at the fallout (so to speak!) from the Hafnium controversy. They nixed all the research and stopped looking, after realizing that nuclear isomers would do little for energy storage (due to emitting energy as gamma radiation) but lots for bypassing restrictions on fissile materials.
To be clear, pure fusion bombs would still emit massive amounts of radiation. Gamma rays, x-rays, thermal radiation, all off that EM radiation would be emitted just like a regular fission bomb. Neutron radiation too. You'd have less (not zero) contamination of the earth itself afterwards, but everybody in the area would still be very badly irradiated.
I don't know enough about the Hafnium controversy to comment on it.
The advantage would be that you wouldn't need tightly controlled and hard to make materials like U-235 or Pu to make one.
I'm not in any way saying that using lasers would be a plausible route to such a weapon, since the NIF facility is huge. But if the research needs to focus on how to get more output per shot, which I think it inevitably would, since a typical conventional or nuclear power plant generates on the order of 1 GW of thermal power (to match that with a 1 Hz repetition rate, likely a stretch for a MJ-class laser, you would need roughly 1 GJ per shot, about a quarter of a ton of TNT), then it would probably be touching on areas that are highly classified.
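The per-shot arithmetic, for whatever repetition rate you think a MJ-class laser could manage (the ~1 GW thermal plant size is an assumption taken from the comment above):

```python
# Energy per shot needed for a given plant size and repetition rate.
TNT_TON_J = 4.184e9                 # joules per ton of TNT equivalent
THERMAL_POWER_W = 1e9               # assumed ~1 GW(thermal) plant

for rep_rate_hz in (1, 10, 100):
    e_shot = THERMAL_POWER_W / rep_rate_hz
    print(f"{rep_rate_hz:>3} Hz: {e_shot / 1e6:>6.0f} MJ per shot (~{e_shot / TNT_TON_J:.2f} t TNT)")
# 1 Hz -> 1000 MJ (~0.24 t TNT); 10 Hz -> 100 MJ; 100 Hz -> 10 MJ
```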
Shiiiit... here I was thinking how cool it would be if they could miniaturize this, having somehow forgotten that my pet solution to the Fermi paradox is that a nigh-inevitable rung on the ladder to interstellar presence involves discovering One Weird Trick to release a whole bunch of energy pretty easily, even on a DIY basis. Instant end of civilization. Even ant-like societies might have mutated members who'd go rogue and misuse the tech, and it wouldn't take many to ruin everything.
Basically it's a twist on the ice-9 solution to the paradox.
One use for such laser-initiated fusion is the LLNL (Teller) proposed X-ray laser satellite from the Star Wars program in the '80s. The proposed solution then was to use the X-rays from an exploding bomb at the heart of a satellite and amplify and focus them through a 'lasing' material. It turns out they never found a lasing material that would work, and it would be fairly easy to confuse or defeat it, so the project died. What also killed it was the end of the testing program.
This is actually backwards. Fusion weapons are substantially higher yield because they result in more fission, partly by preventing the fission primary from blowing itself up before it has finished.
Wikipedia: "Fast fission of the tamper and radiation case is the main contribution to the total yield and is the dominant process that produces radioactive fission product fallout."
Most of the fission energy in an H-bomb comes from the massive amounts of neutrons created by the fusion bomb initiating a fission reaction in the U-238 tamper of the secondary. The primary is used for its X-Rays, which cause the incredible pressures within the secondary by ablating the surface of the cylinder. When you look at this experiment and see it uses x-rays to ablate case containing the hydrogen, causing an implosion, the purpose of the experiment is clear.
It varies quite a bit by design; apparently the USSR's initial design was only 15-20% fusion, while US designs were closer to 50%, which is still apparently the most efficient option in terms of warhead size.
However, it's possible to have higher fusion ratios at the expense of a larger device for the same yield. Most notably in the case of the Tsar Bomba, which reduced the contribution of fission to massively reduce the amount of fallout produced.
Almost all nuclear weapons rely heavily on fission of the tamper for yield.
Suggest "Ripple: An Investigation of the World’s Most Advanced
High-Yield Thermonuclear Weapon Design" from the Journal of Cold War studies to read about a predominantly fusion device family.
This seems to say to me that D-T reactions produce neutrons, and that the kinetic energy of the neutrons is smaller than what you get by hitting U with that neutron. You already have the energy from the neutron (which will land somewhere in the system eventually), and you might as well get a multiplier by putting a blanket of U-238 in front of it.
That could be carbon-copied to a fusion power plant, and indeed, there are many proposals of hybrid fusion-fission plants in the literature that only require Q values marginally greater than 1. But if you go that route, you have radiation just like a fission plant, and one starts to question why you don't just build a fission plant (indeed, why don't we?).
My personal pet theory of the future is that, one day, we'll progress so far in fusion research that we get economic energy. But at the same time, the line blurs between both fission and weapons technology, so people are unhappy with the result. This doesn't feel particularly contrarian but no one ever seems to bring it up.
Since you asked: We don't build fission plants because they cost more than every other energy source. Fusion plants, if they could ever be made to work at all, would cost a lot more. So, there won't be any.
Yes but, as far as I know, research into achieving laser induced fusion hasn't itself run into many classification hurdles. I suspect that may not be the case with the scaling phase.
It is interesting that there haven't been many classification hurdles on something that is pretty explicitly weapons research. My guess is that they achieved high Q values a long time ago in classified research, using mechanisms that would give away some secrets.
Well, it's not so much in making the bomb, but the particular ways of focusing the X-rays and the particular efficiencies in creating the tamper for the most velocity, and thus the most compression, in the secondary.
The key thing here is that the lasers aren't doing the immediate compression; the lasers are standing in for the X-ray radiation, which then ablates the casing around the tritium. Figuring out how to create and amplify an X-ray pulse was a major sticking point in the Star Wars program.
The bigger the target, the more energy is needed to compress it. So it would require more laser energy to get to 100 MJ or 1 GJ, I think. But maybe only a few times more powerful. (Pity they didn't build in some head room!)
The question is whether once ignition is achieved there could be a way to design the target/geometry in such a way that the fusion reaction becomes self-sustaining/self-propagating.
The history of thermonuclear weapons leads me to think that the answer may be yes. There does not seem to be any real upper limit to how large a thermonuclear weapon can be made. In the 50's and/or 60's there were proposals to build GT devices. I don't think the fission trigger for such a weapon could have scaled by nearly as much.
So by analogy if a relatively small fission trigger can cause a fusion explosion that is many orders of magnitudes larger maybe one or more tiny laser induced fusion reactions could be used to trigger a much larger one.
If that were the case the efficiency of the laser trigger would be of little importance.
The unsolved issue with laser fusion (ICF) as an energy source is the fast degradation of high-powered lasers. A high-powered beam degrades the optics in its path, and those things are expensive. ITER has a similar problem with its superconducting magnets.
I keep getting lost in the numbers here. What was the net gain/loss for the entire system? Without the “lasers are 1% efficient at 20% energy loss with 40% energy transfer loss” and all that.
The net gain of the entire system at NIF doesn't matter, because the system at NIF was never designed to make a net gain.
People are estimating how this result moves the equation for an overall system that is designed for power production. Most numbers I have seen still leave a theoretically optimal power plant producing around a 30% loss in power with this number.
AFAICT: There's a large net gain compared to the energy emitted by the lasers. There is still a considerable loss compared to the energy consumed by the lasers.
While at it: I don't think the NIF approach will ever be applicable to commercial power generation on Earth. But I hope it will be one day applicable to a fusion-based rocket engine.
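Putting rough numbers on the whole-system question above, using figures quoted elsewhere in the thread (about 300 MJ of wall-plug energy was mentioned at the press conference, with older reports closer to 400-500 MJ):

```python
# Target-level vs facility-level gain for this shot.
E_fusion = 3.15    # MJ out of the target
E_target = 2.05    # MJ of laser light delivered to the target
q_target = E_fusion / E_target                 # the headline ~1.5 "gain"

for e_wall in (300.0, 500.0):                  # MJ drawn from the grid to fire the lasers
    q_facility = E_fusion / e_wall
    print(f"wall plug {e_wall:.0f} MJ -> target gain {q_target:.2f}, "
          f"facility-level gain {q_facility:.1%}")
# Target gain ~1.5; facility-level gain roughly 0.6-1%, before converting any heat back to electricity.
```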
No, you can have a fusion booster, like in the "Sloika" [0] design, but in a Teller-Ulam design, that is, an H-bomb, you use a fission primary to ignite a fusion reaction, and by far most of the energy comes from the fusion part. [1]
You are correct that in a Teller-Ulam design most of the energy can come from fusion, but as a note it's generally a fission-fusion-fission design in "modern" deployments with about 50% of the energy coming from fusion.
"thermonuclear bomb, also called hydrogen bomb, or H-bomb, weapon whose enormous explosive power results from an uncontrolled self-sustaining chain reaction in which isotopes of hydrogen combine under extremely high temperatures to form helium in a process known as nuclear fusion."
Yes, but if you get into the details... The second stage of a bomb is the fusion stage, in which a cylinder of hydrogen and/or lithium is encased by U-238. The primary stage, a Trinity-like A-bomb, forces the cylinder to compress with its X-rays. The fusion reaction creates a bunch of energy, but more importantly a huge amount of neutrons. These neutrons cause the U-238 to undergo fission, which is responsible for a majority of the energy and pretty much all of the fallout.
The appropriate analogy for this technology would be that it may be possible to initiate a thermonuclear weapon without relying on fission at all. Currently we use a fission nuclear bomb just to generate the temperature and pressure needed to start the fusion reaction, same as the one on today's announcement.
So far it hasn't proven to be viable, but time will tell.
Not really. I mean, yes, nuclear weapons are a thing, but Dept. of Energy supports many many directions not related to nuclear weapons. Physics research is mostly funded by DOE Office of Science or the National Science Foundation.
I'm no expert, but I think you have it backwards. My understanding is that NIF's raison d'être is weapons research, with power generation being a secondary concern. It got funded because of weapons research, regardless of whether it was relevant to fusion power generation.
It may surprise people, but the DOE is the government body that is responsible for nuclear weapons research in the US.
If it was already decided to be funded, yes, it would have been under the DoE. Though I believe the weapons aspect had a very major contribution in deciding for it to be funded at all. It was proposed shortly after the nuclear testing ban and has played a big part in fulfilling that role.
I'm not trying to correct you, but adding context for the weapons aspect.
Uhhh... it's definitely being used for "stockpile stewardship." The fusion crowd went splitsies with the weapons folks to get this funded in the first place.
The targets are really the secret sauce right? If there were a civil ICF for power program, would NIF designs and data even be available to help, or is it all classified?
There are pictures of CAD and experimental setups for the targets. They’re also pretty open with the setup numbers, so in theory, you could make your own NIF setup and try to get their target designs working.
From what I understand, a lot of the work from the past years has been trying to piece together geometries, pulse timing, stability, and quality of targets.
The Q needs to be something like 500 to 1000, not because of energy breakeven, but to produce enough energy that the shot is financially positive. The amount of fusion energy produced in this shot is worth a penny or two.
(And even then, it's dubious a laser fusion scheme will be competitive with other energy sources.)
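Putting a dollar figure on this shot's output, at the prices mentioned upthread and generously assuming 100% conversion to electricity:

```python
# Market value of 3.15 MJ of fusion output.
E_fusion_MJ = 3.15
kwh = E_fusion_MJ / 3.6            # 1 kWh = 3.6 MJ

for label, usd_per_mwh in (("wholesale", 40), ("retail", 120)):
    value_usd = kwh * usd_per_mwh / 1000       # $/MWh -> $/kWh
    print(f"{label:9}: {kwh:.2f} kWh -> ${value_usd:.3f}")
# ~0.88 kWh: a few cents at wholesale prices, about a dime at retail.
```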
It's worth pointing out that, per unit energy, a lot of money is made making economically unviable power: cargo ship power, submarine and other military naval power, spaceship power sources, diesel-electric locomotives...
True, if you want to replace the base load of a civilization-sized network it needs to be economically viable, but we generate "a lot" of power at prices well above the market minimum. Ironically, "good batteries" are the natural enemy of fusion research.
One fun thing about laser fusion is it theoretically can scale down very low and has a trivial "off" switch making it a good resource for engineering tokamak reactor materials or sensors or similar tasks.
The inner lining of a production fusion reactor is hard to make, so a laser facility would be ideal for research. Which is why we have one...
DT fusion reactors would be terrible for mobile applications, since they are so much larger than fission reactors of the same capacity. In space or mass-constrained applications they would be ruinously inferior to fission.
Inertial confinement fusion requires fairly expensive targets to collapse. To make it economically viable they have to produce a lot more energy per target destroyed.
The targets are only expensive because they aren't produced at scale yet.
They are exactly the kind of thing a machine could churn out millions of per day, and then use them at the same rate.
Even if the targets were made of expensive materials (e.g. platinum), most of that platinum could later be recovered from the reactor wall, so it still wouldn't be very expensive.
"most of that platinum could later be recovered from the reactor wall, so it still wouldn't be very expensive. "
And recovering comes for free?
Every step costs energy (or money).
There is no working design yet. It is way too early to make any predictions about how scaling could reduce costs. Scaling can even increase costs, if it depletes limited resources like tritium.
It will get there, but it won't be from NIF or via any technology developed for this experiment. What they are doing is not viable for a reactor, won't ever be viable for a reactor, and won't even be considered a starting point for any future reactor.
It's a fusion plasma research experiment. It's not a program that is being run with the goal of creating a usable fusion energy power plant.
Why should I agree with this article of faith? The obstacles appear quite grave to me. Moreover, even reaching that Q doesn't mean we're there. That's a necessary, not sufficient, condition.
A large, complex machine that explodes the equivalent of 500 lb. bombs to generate heat to drive a turbine sounds like an engineering nightmare.
Because sustainable positive energy out has never been achieved before in 60 years of research. This is gigantic. Its potential to decarbonize the world is massive, and it has now become a whole lot less theoretical.
It’s an incredible milestone, not a solved problem.
That's a circular argument. It's big because the people doing it call it big. Why should I, an outsider, care about their internal goals, their egos, or their status in their field? What does it do or imply for me?
To achieve fusion for power production, you need more output than input. For 60+ years this hadn't been achieved in a replicated fashion. Now it has, and it's 50% more power rather than the 0.1% more for 2 nanoseconds that was sometimes shown before. So now we know fusion for power is possible. If it can be scaled successfully (now likely not an "if" anymore, but a function of time), then we have the ability to have clean and safe energy 24/7. That would help mitigate the worst of climate change and, if cheap, turbocharge the entire economy.
The issue is, we've been able to get more energy out of a fusion reaction than we put in for 60-70 years now. The H-bomb is very good at doing that. You will say, yeah, but an H-bomb is a one-time-use thing, and it has a habit of destroying everything. But if you look into how the second stage of an H-bomb is speculated to work, it's pretty much identical to this experiment; that is by design, not by accident.
Herrmann acknowledges as much, saying that there are many steps on the path to laser fusion energy. “NIF was not designed to be efficient,” he says. “It was designed to be the biggest laser we could possibly build to give us the data we need for the [nuclear] stockpile research programme.”
I'm asking why this somewhat arbitrary line being crossed is something I should care about. It doesn't imply fusion will reach a state of practical application. Why is this more exciting than achieving a ratio of 0.1, or 0.5, or 2, or 10? It seems entirely arbitrary to me, and it smells of an argument that somehow this has made the end goal significantly more attainable.
>why this somewhat arbitrary line being crossed is something I should care about.
That is something personal and unique to each individual. In 1903 when the Wright brothers flew a heavier-than-air machine for 59 seconds, 99.99999% of the people on the planet wouldn't have cared. The airplanes you've flown on are vastly far removed from that original one. Same story for the point contact transistor in 1947. None of that solid state physics is used for modern transistors. Some people like to be early adopters for new ideas and things. Some don't. And that is OK.
Because until now contained ignition has never produced anything meaningful. We've had failed experiment after failed experiment. Now we finally have an experiment with meaningfully more energy out than in.
Is this the right approach? Who knows. There are many fusion designs in the works, and one of those may ultimately be the right call. Or some yet-to-be-created design. That's even probable. The NIF is for simulating nuclear weapons, not creating energy. None of that takes away from this breakthrough: we've never had meaningfully more output than input on a repeatable basis. It's proof that contained fusion for energy isn't just hypothetical, which also means funding & interest will generally increase from this point on.
I think you're setting too high a bar. It's like saying no milestone should be celebrated until we have a working metropolitan-size plant running that's cheaper than anything else. Punch cards in the 1950s are insignificant compared to modern SSDs, yet they were an important step even though we don't use anything like it now. Breakthroughs are breakthroughs.
> It's big because the people doing it call it big.
How does "[It's big b]ecause sustainable positive energy out has never been achieved before in 60 years of research" translate to "because we say it's big" in your head?
You might not consider it big, but a specific reason was provided and it had zero similarity to your rephrasing.
An answer matched in tenor and tone to the question, but nonetheless entirely serious, is that while the obstacles are grave, the consequences of failing to overcome them are much graver still, and to the best of our collective knowledge, industrial-scale fusion would be the least bad answer to our energy demands for the next epoch.
That is true, but it does not obviate the need for other parallel efforts and technologies whose challenges are also very grave: very-near-term, very-large-scale carbon sequestration; a modern electrical grid with deep redundancy and resilience; effective, safe, scalable energy storage for whatever the source; etc.
While I'm sure that you're correct, the obstacles are large and there is a lot to overcome still, I can't help but think of James Watt and (my ancestor) Richard Trevithick, the inventor/pioneer of the compact steam engine.
Watt went around telling everyone that Trevithick and his compact (i.e. high-pressure) steam engines were too dangerous and would never work.
Yes, some exploded. But then we got steam trains, and even today almost all power generation on the planet comes from high-pressure steam-electric power plants.
> A large, complex machine that explodes the equivalent of 500 lb. bombs to generate heat to drive a turbine sounds like an engineering nightmare.
And using actual bombs and explosives to dig kilometers down and mine coal is not an engineering nightmare? Dying of gas in the mines, fires on oil wells, oil spills: are these things "engineering simple"?
We don't place precision optics in those blast zones. We don't put structures there that are repeatedly exposed to blast. Over the life of an inertial DT fusion reactor there will be about a BILLION such explosions in the reactor core.
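As a rough sanity check on that shot count (the repetition rate, plant life, and availability below are my assumptions, not numbers from the thread):

```python
# Order-of-magnitude check on the lifetime shot count of an inertial fusion plant.
# Repetition rate, plant life, and availability are illustrative assumptions.
shots_per_second = 1
seconds_per_year = 365 * 24 * 3600    # ~3.15e7 s
plant_life_years = 30
availability = 0.9

lifetime_shots = shots_per_second * seconds_per_year * plant_life_years * availability
print(f"{lifetime_shots:.1e} shots")  # ~8.5e8, i.e. on the order of a billion
```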
Q is irrelevant; you need throughput. If your Q is one million but you are processing one tiny capsule per second, you are producing too little money to pay for the facility.
If you can process a tanker's worth of hydrogen per second, Q can be just above breakeven and you will still make money.
Irrelevant? Seems like Q is one of two factors in that calculation. If the throughput is tiny, you're useless, but if your Q is too small, the same is true.
The higher the Q, the lower throughput needed for feasibility.
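As a minimal sketch of that trade-off, here's a toy plant model; every parameter (laser wall-plug efficiency, heat-to-electricity conversion, per-target cost, electricity price) is an illustrative assumption, not a NIF or design-study figure:

```python
# Toy plant model: does a given (Q, repetition rate) pair pay its own bills?
# All parameters are illustrative assumptions.
def plant_economics(q, rep_rate_hz,
                    laser_energy_mj=2.0,        # energy delivered to the target per shot
                    wall_plug_eff=0.10,         # grid power -> laser light efficiency
                    thermal_to_electric=0.40,   # fusion heat -> electricity efficiency
                    target_cost_usd=0.50,       # assumed cost of one mass-produced target
                    price_usd_per_mj=0.014):    # ~5 cents/kWh wholesale
    fusion_mj = q * laser_energy_mj                     # fusion output per shot
    electric_out_mj = fusion_mj * thermal_to_electric
    electric_in_mj = laser_energy_mj / wall_plug_eff    # grid energy needed to fire the laser
    net_mj = electric_out_mj - electric_in_mj
    net_mw = net_mj * rep_rate_hz                       # MJ per second == MW
    margin_usd = net_mj * price_usd_per_mj - target_cost_usd
    return net_mw, margin_usd

# Q near this shot's ~1.5 loses money on every capsule; Q in the hundreds starts to work.
for q in (1.5, 50, 500):
    net_mw, margin = plant_economics(q, rep_rate_hz=10)
    print(f"Q={q:>5}: net {net_mw:9.1f} MW, margin per shot ${margin:7.2f}")
```

Under these assumptions, a Q near this shot's ~1.5 loses energy and money on every capsule, and Q has to climb well into the hundreds before each shot's revenue comfortably covers its target, consistent with the 500-1000 estimate upthread.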
Exactly. Beyond power generation, humanity still uses petroleum products in its chemical industry. That is why the shutoff of Russian natural gas hurts Germany much more than other countries: it now has a starving chemical sector.
I'm confused by this. Does the US have the productive forces and resources to replace petroleum with solar panels (and the required energy storage)? Does it have the nuclear fuel to replace petroleum with fission reactors?
What alternatives to petroleum does the US have that it does not rely on others for?
What scheme do you imagine in which fusion could be used to replace petroleum that would not also work when powered by solar? Production of synfuels using hydrogen, for example, would also deal with solar's intermittency, leaving the energy sources to compete on the basis of levelized cost. The levelized cost of solar has become quite low, and it's very difficult to see how any fusion scheme, and DT fusion in particular, will ever compete.
I specifically asked about the production of solar panels. Are you assuming that we already have all the panels we need to replace petroleum sitting in a warehouse? What good is solar in an energy-independence plan if we can't build our own panels?
Nuclear fuel actually isn't that expensive or rare.
Those crazy sci-fi stories from the '30s and '50s where everyone used nuclear power (and it was so cheap they didn't bother to meter it) were all completely accurate from a non-political viewpoint.
Foreign energy reliance is finished and has been for some time. North America can produce more petroleum energy than it uses. In both 2020 and 2021 the US was a net petroleum exporter.