What exactly do you find unconvincing? Just comparing grid storage needs vs EVs suggests battery production capacity isn’t going to be an issue, so it simplifies to cost across, say, 20+ years.
There’s a lot of very simplistic analysis that makes things seem vastly worse, but you don’t need crazy assumptions for things to look reasonable. Basically, the more sources of energy you can add to the mix, the cheaper things become, since you only include each one when it saves money. Thus if an estimate ignores wind, adding wind can only lower costs; ditto for nuclear, geothermal, etc.
Hydro is reliable, and better than flat continuous power, because it’s flexible enough to fill in gaps based on what you forecast production will be across the next week. The US only gets 6.2% of its power from hydro, so much of the rest of the world actually looks better than these numbers.
How much solar to build is part of a complex optimization problem. To simplify: at 2c/kWh of capacity, with half of all solar power wasted over a year, the effective cost is ~4c/kWh. Further, let’s assume half of all solar is used directly, with the rest available to charge daily batteries (again pessimistic given that kind of oversupply, HVDC lines, etc., but whatever). 50% - 5% (hydro) = 45% of total demand served from batteries. (Wind drops this by a lot.)
Projecting LCOE for batteries is tricky, and battery degradation is closely correlated with use, so $120/MWh is again high as a long-term figure.
Now our worst case is something like 4c/kWh (solar/hydro) + 12c/kWh (batteries) * 45% = 9.4c/kWh for 24/7/365 power with zero fossil fuels and zero nuclear. On the surface that might seem terrible or awesome depending on what your local rates look like, but this was a very pessimistic estimate.
Our pro-nuclear investor needs to be pessimistic in the other direction. Remember that today demand is shifted toward cheaper nighttime rates; if daytime electricity becomes cheaper instead, nighttime use drops further. Baseline numbers of, say, 4c/kWh production + $80/MWh batteries * 30% = 6.4c/kWh 24/7/365. If you’re considering building nuclear and projecting 60+ years out, those kinds of numbers represent a real threat without assuming any dramatic breakthroughs.
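The two scenarios above reduce to one formula: generation cost applies to every kWh, while storage cost applies only to the battery-served fraction of demand. A quick sketch using the comment’s own figures (all illustrative assumptions, not measured data):

```python
def blended_cost(gen_c_per_kwh, battery_c_per_kwh, battery_share):
    """Blended 24/7 cost in c/kWh: generation is paid on all demand,
    storage only on the fraction of demand served via batteries."""
    return gen_c_per_kwh + battery_c_per_kwh * battery_share

# Pessimistic case: 4c solar/hydro, 12c ($120/MWh) batteries, 45% via storage
worst = blended_cost(4.0, 12.0, 0.45)  # 9.4 c/kWh
# Less pessimistic: 4c generation, 8c ($80/MWh) batteries, 30% via storage
base = blended_cost(4.0, 8.0, 0.30)    # 6.4 c/kWh
print(f"worst case: {worst:.1f} c/kWh, baseline: {base:.1f} c/kWh")
```

Plugging in your own generation cost, battery LCOE, and battery-served share shows how sensitive the blended rate is to the storage fraction, which is why wind and demand shifting matter so much.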
PS: You can plug your own assumptions in here, but remember people are trying to minimize costs, and we already have existing infrastructure that isn’t being replaced that quickly. I doubt we’ll average under 20% natural gas within 20 years due to its flexibility, which dramatically reduces the need for batteries and excess solar. That sounds bad, but mix in significant EV adoption and it still becomes a vast reduction in CO2 emissions, while even lower costs discourage nuclear.
Most of the estimates I've seen for storage requirements for current electricity usage are not optimistic[1]. When you add in the huge amount of energy used for heating, which is currently not electric, it gets even worse.
I'd like to see an analysis of how much generation and storage is required to handle Minnesota's natural gas usage[2], with a reasonable guarantee that power will not be lost for more than an hour or two during the entire yearly six-month cold-weather span. Are those generation and storage estimates reasonable? How much will they cost, how reliable are they, how much room will they take, how many natural resources will they consume? Keep in mind Minnesota will not be able to bogart the entire battery manufacturing capacity of the globe, and remember that this analysis is in addition to the amount required to meet current electricity usage needs. Then, extrapolate that analysis to all other cold weather areas, such as other northern US states and most of Canada. Is renewable+storage really feasible to meet that need?
I've never seen this analysis done, and what I've seen from other analyses makes me think it's not feasible without depending on unproven future tech. But I'd love to be wrong!
[2] Approximately 500 billion cubic feet of natural gas per year. Note also that the usage is not evenly distributed through the year; we use a lot more in February than in July. https://www.eia.gov/dnav/ng/NG_CONS_SUM_DCU_SMN_A.htm
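For a rough sense of scale, that ~500 Bcf/yr can be put in energy terms. The heat content (~1,037 BTU per cubic foot) is a standard EIA figure; the furnace efficiency and heat-pump COP below are my own illustrative assumptions, not part of any published analysis:

```python
# Back-of-envelope scale check for Minnesota's heating load.
BTU_PER_CF = 1037    # typical heat content of natural gas (EIA)
BTU_PER_KWH = 3412   # unit conversion

annual_cf = 500e9    # ~500 Bcf/yr from the EIA link in [2]
thermal_twh = annual_cf * BTU_PER_CF / BTU_PER_KWH / 1e9

# Electricity needed if that useful heat came from heat pumps instead:
# assume the gas was burned at ~90% furnace efficiency and replaced
# by heat pumps averaging a COP of ~3 (both assumed values).
electric_twh = thermal_twh * 0.90 / 3.0

print(f"thermal energy: ~{thermal_twh:.0f} TWh/yr")
print(f"heat-pump electricity: ~{electric_twh:.0f} TWh/yr")
```

Even with favorable heat-pump assumptions, this is tens of TWh of additional winter-concentrated electricity demand for one state, which is why the storage question above matters.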
The person you are quoting in [1] is a chronic bad faith actor. Examine his arguments with great skepticism. The argument he is making there is obviously wrong -- he's implying (without justification) that battery production cannot be greatly expanded, and ignores non-battery storage.
I don't agree with that characterization, but regardless, that was just a handy link. There's plenty of well-justified skepticism regarding storage out there. Either way, I'd love to see the analysis I suggested. If it's that clear-cut, surely it's not too hard for someone better informed than me to put together the numbers.
I think if you really dig into the arguments he is making, you will spot the evasions and non sequiturs, and conclude he's not arguing in good faith.
There is so much peer reviewed work saying the opposite of what he claims that you should default to skeptical about him. It's the same way one should treat a creationist.