
So we think about the engineering problem and reject the entire concept on that basis.

Even if I give you a magic box full of near-vacuum, 100-million-degree plasma that emits 1 MW/m^3 in the form of neutrons whenever you want, as long as you stop it from touching the sides with supercooled magnets: how do you turn that into a remotely viable power station?

Just the heat exchanger and steam turbine portion is going to be uneconomical against renewables+storage.



Yes. Just keeping a steam turbine up and running, completely ignoring where the heat comes from, costs more than both building out and operating wind and solar.


Sure, as long as you skip storage. Batteries are really expensive if you're using enough of them to get through windless nights instead of just compensating for the duck curve. Long-distance transmission is expensive too, and to get through winter you need either huge storage or lots of overcapacity. It all adds up.

At this point someone usually says "we don't need batteries for long-term storage, we can just make hydrogen and burn it later" but then oops, you're back to using a turbine.


It is correct that batteries are the most expensive storage. One corollary is that other storage is cheaper. Another is that other methods will be used instead.

Storage is an integral part of a renewable-powered grid. It is figured into all the cost analyses. Insisting otherwise is promoting known falsehoods. Lying is unwelcome here.


Hence my second paragraph.

The only other major option is using pumped hydro for storage, but not every region has the geography for it.


Pumped hydro is very far from the only alternative to batteries. But you have been told that before, too.


"Hence my second paragraph" which was this:

> At this point someone usually says "we don't need batteries for long-term storage, we can just make hydrogen and burn it later" but then oops, you're back to using a turbine.

There are various experimental options too but I don't think any have deployed at scale, so their costs are uncertain. And some of them, like thermal storage, will still use a turbine.


Turbines cost less to operate part-time than full-time. One operated 4 hours / day, or one day / week, costs a lot less than one run continuously.


I'm gonna need a source on that. It seems to me that turbine cost is mainly capital cost, and if you're not running it continuously then your cost per kWh is higher.

After all, that's the argument you renewables folks make when you say nuclear plants can't load-follow economically. So tell me why a turbine run intermittently is cheaper, and prove it with a source.


Allow me to invite you to look up steam turbine overhaul cost and frequency. Do mothballed turbines need frequent overhauls? Or do they need overhauls after N hours' use at intensity Y, or after M > N hours operated at intensity X < Y?

Steam turbines are a substantial fraction of nuke operating cost, but far from the only substantial cost.


An idle turbine also isn't earning revenue.

I haven't found specific numbers on steam turbines, but this pdf says a gas turbine's maintenance cost over 30 years is 3 to 4 times the capital cost:

https://www.gasturbine.org/papers/GTA%20Comments%20on%20RMRR...

So taking the worst case, capital plus maintenance for a continually-running turbine is 5X its capital cost, to get 30 years of power output. For each year of power, the total cost is 5/30 or 1/6 of the capital cost.

Now let's take your 4 hours/day example. That's 1/6 as much running time, so only 5 total years of power output. It's also 1/6 as much maintenance cost, so maintenance is 4/6 of the capital cost, for a 30-year total cost of 1.66 times capital cost. Total cost per year of power is 1.66/5 = 1/3 of the capital cost.

That puts the cost per kWh for energy storage at double the cost of the continuously-run turbine.
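The arithmetic above can be sanity-checked in a few lines. The inputs are the comment's own assumptions (30-year maintenance equal to 4x capital for a continuously-run turbine, maintenance scaling linearly with running hours), not measured data:

```python
# Sketch of the comment's arithmetic. Assumed inputs: 30-year maintenance
# = 4x capital cost at continuous duty, maintenance proportional to hours run.
CAPEX = 1.0       # normalize capital cost to 1
MAINT_FULL = 4.0  # 30-year maintenance cost at 100% duty cycle
YEARS = 30

def cost_per_energy(duty_cycle):
    """Total 30-year cost per full-power-equivalent year of output."""
    maintenance = MAINT_FULL * duty_cycle   # less running time, less wear
    energy_years = YEARS * duty_cycle       # full-power years delivered
    return (CAPEX + maintenance) / energy_years

continuous = cost_per_energy(1.0)      # 5/30 ~ 0.17x capital per year
four_hours = cost_per_energy(4 / 24)   # 1.67/5 ~ 0.33x capital per year
print(four_hours / continuous)         # → 2.0: double the cost per kWh
```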


We mostly already have the combined-cycle gas turbines, and are burning NG on them, so their capital cost is already sunk.

Until enough renewable generation is built out, it will be silly to build storage, so the turbines will continue burning NG, just increasingly at night and on dark, calm winter days. As their duty cycle declines, their total annual operating cost falls in proportion. As chemical synthesis and compressed or liquefied air storage get built out later, the fraction of their (reduced) operating time spent driven by those instead of NG increases. As non-synfuel storage capacity (battery, pumped hydro, mineshaft gravitic, buoyancy) grows, running time and maintenance load decline further as those pick up more load. Synfuel and compressed air will be mixed into NG in proportion to stock on hand, rather than running on one fuel for a while and then another. Eventually NG falls out of the mix.

Operating the turbines only when the renewables and non-fuel storage are not supplying enough power cuts the maintenance load, therefore operating cost per unit calendar time. Not running does not mean you are not getting revenue: revenue comes in for the renewable-generated power, produced at near-zero marginal cost.

Running a turbine only sometimes extends its total life, so its capital cost is amortized over just as many kWh produced, either way, just longer in, again, calendar time.
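The dispatch order described above (renewables first, then non-fuel storage, turbine last) can be sketched with a toy hourly model. All numbers here are invented purely for illustration:

```python
import math

# Toy merit-order dispatch: renewables run first, storage covers what it
# can, the turbine only fills the remaining gap. Illustrative numbers only.
def dispatch(demand, renewable, storage_avail):
    """Return (turbine_output, storage_used) in MW for one hour."""
    residual = max(0.0, demand - renewable)      # renewables first
    from_storage = min(residual, storage_avail)  # then non-fuel storage
    return residual - from_storage, from_storage

demand = [100.0] * 24
# Crude solar shape: zero at night, sine bump from 06:00 to 18:00.
solar = [180.0 * math.sin(math.pi * (h - 6) / 12) if 6 <= h <= 18 else 0.0
         for h in range(24)]

turbine_hours = sum(1 for h in range(24)
                    if dispatch(demand[h], solar[h], storage_avail=50.0)[0] > 0)
print(turbine_hours)  # → 15: turbine runs only when solar + storage fall short
```

More solar or more storage shrinks `turbine_hours` directly, which is the claimed mechanism for falling maintenance load.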


That's tolerable compared to fission.

You could model it roughly as adding the capex of half a turbine or so, and x% of its uptime costs, to your solar/wind capacity. Lifetime would also be longer, but anything after 2060 or so is largely irrelevant for slowing climate change, so we'll ignore it.

IIRC O&M + Capex for a CCGT is about $20/MWh and fuel is around $30 which would put the capital around $5 and we don't expect to operate it for long as it's a backup.

So the generator should cost us <$10 per MWh of solar + wind we use (with the majority being used directly or via batteries/hydro). Green fuel will cost about 2x the renewable energy, with capex/O&M on the same order as the turbine or maybe a bit lower (ammonia electrolyzers are estimated at about $1000/kW with current tech, but you'd only need to put half or less of your peak capacity through them, and they would run most of the time producing fuel if the fuel is the last-resort backup).

With projected solar + wind costs around $20/MWh using chemical storage for winter would then have a total cost of $60/MWh for energy that has been stored as ammonia and $40 for energy that has not in a somewhat pessimistic ballpark with price dominated by fuel generation. On par with some current full time gas, but at the upper end. Far less than fission if we use the same 'joules before 2060' constraint, but enough to cause difficulty building out new gas turbines if we don't presently have enough.

Fusion isn't really in the running, as a steam turbine is much more expensive than a gas one, so we're getting close to parity with renewables + batteries + gas turbines from the steam-turbine step alone.
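The ballpark above can be reproduced in a few lines; every input is the comment's own assumption ($20/MWh renewables, green fuel at roughly 2x the input electricity cost, ~$20/MWh for turbine O&M + capex), not measured data:

```python
# Reproducing the parent's ballpark. All inputs are the comment's
# assumptions, not measured figures.
renewable = 20.0        # $/MWh, projected solar + wind
fuel_multiplier = 2.0   # green ammonia ~ 2x the renewable energy cost
turbine = 20.0          # $/MWh, CCGT O&M + capex while running

stored = renewable * fuel_multiplier + turbine  # energy that went through ammonia
direct = renewable + turbine                    # energy used without storage
print(stored, direct)   # → 60.0 40.0, the $/MWh figures quoted above
```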


From a quick google, steam turbines cost about 50% more than gas turbines.[1][2] Gas is $500-700/kW, steam is $670-1140/kW.

That's easily compensated by the higher capacity factor of non-storage applications (since storage turbines, calculated above, cost 100% more per kWh due to idle time). Also, storage has to add the cost of the electrolyzer and ammonia storage tanks.

On top of that, round-trip efficiency of energy storage via ammonia is only about 30%.[3] So if your solar farm costs $20/MWh, your cost of stored energy is over $60/MWh before accounting for turbine/electrolyzer costs.

This means you can easily be competitive with a steam turbine plus a heat source that costs $60/MWh, at least for baseload.

[1] https://www.greentechmedia.com/articles/read/ges-new-gas-tur...

[2] pdf: https://www.energy.gov/sites/prod/files/2016/09/f33/CHP-Stea...

[3] https://www.ammoniaenergy.org/articles/round-trip-efficiency...
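The 30% round-trip figure translates into the stored-energy cost quoted above as follows (a sketch using the comment's $20/MWh solar assumption):

```python
# Sketch of the round-trip arithmetic. Inputs are the comment's
# assumptions: $20/MWh solar, ~30% electricity -> ammonia -> electricity.
solar_lcoe = 20.0   # $/MWh at the farm
round_trip = 0.30   # ammonia round-trip efficiency
stored_energy_cost = solar_lcoe / round_trip
print(stored_energy_cost)  # → ~66.7 $/MWh before turbine/electrolyzer capex
```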


Not the same kind of turbine. Internal combustion turbines are more cost-effective than steam turbines.


Right, steam turbines cost about 50% more. But a turbine for energy storage is likely to get a much lower capacity factor than a turbine on a generator, so it evens out. (Sources and math in another thread below my above comment.)


TEGs are pretty neat, but the temperature and energy flux are very hard on the materials.

If only there was some way to get energy from a fusion reaction via a semiconductor junction by running them in parallel at low temperature rather than in series at extremely high temperature, and a long way away so the neutrons could thermalize and you could absorb much easier to handle photons. Oh well.


I'm exploring such a concept, NanoFusion.

It's a study device, plus thinking from first principles about the reaction.

There are higher energy reactions like proton-boron that emit only light and charged particles. Eric Lerner's Focus Fusion approach taught me about that.

I'm looking at parallel delivery of single photons and how that scales as laser systems develop. Lasers themselves are on an exponential curve, so that seems promising.

Fusion batteries would be cool!

https://docs.google.com/document/d/1B5maRx9w0ahQTqSe6UZ56rd0...


Sustained p-11B fusion might not be possible: bremsstrahlung X-rays carry away too much energy, and nothing can reflect them back.

But if it could be made to work, it would be a better prospect than D-T.


Two ideas:

1) don't try to achieve a chain reaction, but do single atom reactions and fully cycle the energy. Simpler to think about, and raw energies add up (see link).

2) if chain reaction needed, orient the reaction to aim the outputs towards their secondary targets e.g. with a crystal lattice fuel package.


Where do the photons come from?


The usual place. Surround your fusion reactor in a very thick blanket of opaque hydrogen plasma so the neutrons can bump into something.

Contain the whole thing in some kind of force field (if only there was some alternative to the EM force for this purpose), then when the photons hit your photon electric generators you get electricity.

If you arrange things such that the outside of the opaque hydrogen blanket is about 5800 kelvin and your photon-electric generator panels are spread out in parallel far enough to receive, let's say, about a kilowatt of energy per square metre, they should last a couple of decades and your efficiency will be okay, maybe around 20-30%.

There. Fusion power plant invented. Sadly it's just a pipe dream though because we couldn't roll it out without first waiting for a stable fusing plasma source on earth.
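Playing along with the joke: a quick blackbody check confirms that a sun-sized, 5800 K "opaque hydrogen blanket" delivers roughly the kilowatt per square metre mentioned above at a panel-friendly distance (the solar radius and 1 AU are the assumed geometry):

```python
# Sanity check: flux from a sun-sized 5800 K blackbody at Earth's distance,
# via the Stefan-Boltzmann law plus inverse-square dilution.
SIGMA = 5.670374e-8   # Stefan-Boltzmann constant, W/m^2/K^4
T = 5800.0            # blanket surface temperature, K
R_SUN = 6.957e8       # radius of the "reactor", m
AU = 1.496e11         # distance to the panels, m

surface_flux = SIGMA * T ** 4                     # ~6.4e7 W/m^2 at the surface
flux_at_panels = surface_flux * (R_SUN / AU) ** 2
print(flux_at_panels)  # → ~1.39e3 W/m^2, i.e. about the solar constant
```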


> Surround your fusion reactor in a very thick blanket of opaque hydrogen plasma so the neutrons can bump into something.

If you compute how thick that plasma would have to be to stop neutrons you'll realize this idea is totally unworkable.


How does neutrons bumping protons produce light?


It heats them up. (The joke is that I'm describing the sun and solar panels as the end state of trying to overcome the technical hurdles of building a heat engine to extract electricity from fusion, if you're not just playing along.)

Jokes aside, TEGs are pretty neat. In addition to harvesting what is currently waste heat in some places, or having applications in geothermal or solar collectors, they might even make fission not-stupid. I can't foresee any way fusion works as practical power generation this century, though. It has worse power density than chemical, much higher temperatures to deal with, and much higher neutron flux than fission, as well as needing massive 2-kelvin magnets right next to the 100-million-degree plasma that explode if they warm up to 3 kelvin.


Solar fusion doesn't produce neutrons, hot or otherwise. It produces gamma rays, helium nuclei, and neutrinos.



