They listed 5 challenges but forgot probably the biggest one: make it cheaper than the alternatives. That might never happen, even if we solve all the material and technical problems.
Disclaimer: I did my master's on fusion reactor simulations.
This is mostly an economic challenge. We need clean energy that is cheap, not clean energy that is more expensive.
It's important for clean energy to be cheap because that creates an economic incentive for people to invest in switching.
The good news with renewable energy and battery storage is that we're basically there. Texas, which produces lots of oil, also produces lots of clean energy, because oil refineries require a lot of energy to refine their fuel, and using your own product at a premium when clean renewables do the same job would be madness. Think about it: the very companies that expect others to buy and burn their fossil fuels are investing (heavily) in renewable energy, because that's cheaper than using their own cheap supply of oil and gas for the same purpose. It's more lucrative for them to sell it than to use it.
That's why records for the amount of solar, wind, and battery capacity coming online are being broken on a regular basis. It's cheaper, so people do more of it as fast as they can scrape the funding together.
Fusion, if it happens at all, is probably going to be really expensive for quite some time. Nuclear has the same problem: clean, but just super expensive, and slow to plan and execute. You need to decide decades in advance that you need it in order to have it ready. Most reactors coming online this year (hint: not a lot) were planned when solar and wind were still expensive, one to two decades ago. In other words, they probably did not take into account current price levels of new clean energy. In terms of profitability, these things are kind of dead on arrival. Fusion faces the exact same challenges. It's not completely hopeless, but we need a one-to-two-order-of-magnitude reduction in cost per unit of energy generated this way.
A fusion reactor is going to create and consume large quantities of tritium, which is difficult to contain because it is an isotope of hydrogen, and hydrogen can diffuse right through metals. It might not be a real problem, but it is always going to be in the papers that somebody detected tritium around a reactor.
Also, because fusion reactors produce large quantities of neutrons, you could put U-238 or Th-232 into the blanket and breed either Pu-239 or U-233. Back in the 1980s some engineers thought that a hybrid fusion-fission system could have better economics than either alone if you used the bred fuel in thermal reactors. Those materials could just as well be used to make bombs, although, unlike a fast breeder cycle, a fusion reactor doesn't require plutonium processing technology in order to work.
---
As for solar and wind being "cheap": don't believe the hype. I mean, it is cheap if you don't mind that it is only available when the sun shines and the wind blows. You see estimates for solar + storage that are deceptive and thought-stopping, like Case 17 of [1] (an hour and a half of storage is significant, but it is not comparable to the 24-hour service that fossil fuels and nuclear can provide).
I haven't seen a detailed analysis of the generation + transmission + storage costs of a 100% renewable system published in the last decade. If you try a back of the envelope calculation it usually winds up worse than Case 9 from that report, except that Case 9 is optimistic because (so far) it always costs a lot more than you estimate to build a nuclear plant. [2]
[2] To get some sense of how bad the problem is, you need: (a) to get through the night (12 hours of storage? still might beat the AP1000...), (b) to get through extended periods of bad weather (2 weeks of storage?), and (c) to get through the winter (overbuild solar 2-3x? otherwise, 6 months of storage?)
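As a rough sketch of why (b) and (c) dominate, here is the arithmetic for (a) and (b). The $200/kWh installed cost for grid storage is my assumption for illustration, not a figure from the report:

```python
# Back-of-the-envelope cost of storage per GW of average demand.
# The $200/kWh installed figure is an assumed round number, not from the report.
def storage_cost_busd(hours, usd_per_kwh=200):
    """Billions of dollars of storage to cover `hours` at 1 GW average load."""
    kwh = 1.0e6 * hours  # 1 GW for one hour = 1e6 kWh
    return kwh * usd_per_kwh / 1e9

night = storage_cost_busd(12)            # (a) one night  -> $2.4B per GW
fortnight = storage_cost_busd(14 * 24)   # (b) two weeks  -> $67.2B per GW
```

Under these assumptions, one night of batteries per GW is cheaper than recent AP1000 builds per GW, but riding out a two-week Dunkelflaute on batteries alone is well beyond them.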
About the worst case you can find is a small geographical area that has to deal with both Dunkelflauten and arctic conditions spilling south, which often leads to -5C to -15C and near-zero wind.
Of course, that excludes the fact that storage is plummeting in price: the latest Chinese auctions, with bids submitted in December 2024, landed at $62/kWh installed and serviced for 20 years.
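For scale, that headline number can be translated into a cost per kWh delivered. The one-cycle-per-day duty and 90% round-trip efficiency below are my assumptions, not part of the auction terms:

```python
# What $62/kWh installed (serviced for 20 years) implies per kWh delivered,
# assuming one full cycle per day and 90% round-trip efficiency (assumptions).
capex_per_kwh = 62.0
cycles = 20 * 365            # daily cycling over the 20-year service life
round_trip = 0.90
cost_cents = capex_per_kwh / (cycles * round_trip) * 100
print(f"{cost_cents:.2f} cents per kWh delivered")  # ~0.94
```

Under those assumptions the storage adder is under a cent per kWh, before financing and input-electricity costs.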
> In this report, where we make a comparison between the costs of variable renewables such as solar PV and wind and the costs of other technologies we include the cost of firming those renewables which we call integration costs. These are the additional costs of ensuring supply is reliable when using intermittent energy sources. These integration costs are itemised in the report and include storage, transmission, system security and spilled energy.
Handling the general problem is made much easier with e-fuels to handle very long term storage and rare outages. The cost can be estimated and isn't that high, even if the round trip efficiency of storage with e-fuels is not very good.
BTW, the battery cost estimates in that link seem very high, not reflecting the recent collapse in LFP prices from China.
> This is mostly an economic challenge. We need clean energy that is cheap, not clean energy that is more expensive.
Cheap and expensive in dollar terms is meaningless. Dollars are numbers in computers. Joules and watts are real.
What I can agree with is that we need energy with a good enough EROI, and then we can adjust the policies regarding fusion, nuclear, and fossil fuels and their related joules, watts, and numbers in computers.
You are right about the policy shift. But I would suggest we stop throwing good money after bad until the sector massively improves the accuracy of the numbers it publishes for things like budgets and timelines for new projects.
Because most of the numbers coming out of the nuclear sector for this are too good to be true, to put it mildly. It's tempting to use harsher language here, like "fabricated nonsense" or "a scam". And some people do.
Probably the biggest challenge for nuclear proponents, aside from the demonstrably high costs, overruns, and wildly optimistic timelines of past projects, is the credibility of their cost predictions for future ones. The lower those predictions, the smaller the chance that they are anything more than wishful thinking mixed with a lot of hopium and based on very flawed assumptions.
When cost overruns in nuclear projects are routinely measured in hundreds of percent, something is very wrong. It would be nice to get that down to two-digit percentages. That would still be terrible, but a lot better than the industry norms, especially compared to renewable projects.
This credibility challenge is a large part of the reluctance of investors to fund new efforts in this space. Most of the projects that still happen feature governments and taxpayers taking most of the risk, which makes it more likely that the huge budget gaps get funded rather than investors cutting their losses and bailing.
If you want numbers, look at the GW coming online for nuclear, wind, solar, and energy storage projects. There's a stark difference there. Renewables are outpacing nuclear at such a pace that you could legitimately wonder why we bother with such projects at all.
> Renewables are outpacing nuclear at such a pace that you could legitimately wonder why we bother with such projects at all.
Indeed, especially as renewables will increasingly eat into nuclear 'baseload', lowering its load factor and therefore raising its total production cost.
https://www.youtube.com/watch?v=udJJ7n_Ryjg
Fully agreed. It took us from 1944 to 1965 in fission to go from "ok it makes net energy" to "ok we can make electricity on the commercial market within the realm of the existing alternatives". Fusion is still pre-1944 in that they haven't even demonstrated getting net energy out, let alone put together an electricity-generating machine. I have no doubt at all that we'll make fusion-derived electricity in my lifetime. But I have no idea whether or not we'll make it cheap.
They say the fuel is cheap. It's not (tritium and He-3 are hard to come by), but even if it were, keep in mind that a fleet of uranium-fueled fission reactors like France's spends about 5-10% of its costs on fuel.
"Within the realm" involves some externalities though. Fission reactors have never been particularly cost competitive with other electrical generation technologies. They were subsidized heavily because people wanted reactors to build bombs, and more broadly to cultivate a technology base of people who could build and maintain reactors to build bombs. Once we had enough bombs, that subsidy got rolled back and fission fell off the leaderboard.
If you spend some time reading the early-1950s documentation of the nuclear industry, you'll quickly see that commercial nuclear reactor development was very much decoupled from the graphite- or heavy-water-moderated bomb-making reactors. They separated around 1953 and never came back together. Back then, the fledgling industry was wondering if it could sell excess plutonium for bombs, but the government said it was not interested in buying it. They said they had more than enough coming from their own specialized bomb-making reactors.
Certainly nuclear tech was subsidized originally by the Manhattan project, but very soon after, bomb making and nuclear power went their separate ways.
You could much more strongly argue that the nuclear reactor development made to power submarines helped develop the commercial nuclear power industry. This checks out much more cleanly, as Shippingport was led by Rickover himself.
That is a bit revisionist. Most of the investment went toward reactor technology and research that had the dual purpose of generating power while also being aligned with producing plutonium. That's why the type of reactor used in nuclear submarines was never commercialized: simply too expensive for civilian use, and with no dual use that benefited producing material for nuclear bombs. Commercial reactors today are based on minor variations of nuclear fission technology that was also strategically relevant for getting plutonium. Different reactors, but the same mineral supply chains and a lot of overlap in R&D. As soon as the Cold War came to an end, funding for all of that fell off a cliff, including for building new nuclear power plants. Coal and gas plants were vastly cheaper to build, and they still are. And back then, nobody cared about pollution.
Only now that people are interested in small reactors are these decades-old fission reactor designs getting more attention.
The PWR was and is used in submarines. It led to Shippingport which led to Yankee Rowe and Indian Point 1, which launched the commercial nuclear power industry.
Conversely, the graphite-moderated Hanford reactors that made plutonium never led to commercial reactors. The closest you got was Hanford N, but that was kind of a one-off.
If it matters, I'm a professional nuclear engineer with a lifelong interest and expertise in reactor history.
The PWR (and the related BWR) are bad at weapons plutonium production for two reasons: (1) there aren't a lot of excess neutrons to transform U-238 into Pu-239; graphite reactors can be tuned to make about twice as much Pu. (2) Either kind of LWR runs under pressure, and it would be a big deal to shut the reactor down, bleed the pressure, and change the fuel; most other reactor types can be refueled online. The longer Pu-239 stays in the reactor, the more of it absorbs another neutron and becomes Pu-240, which is bad for nuclear weapons because it creates a neutron background that causes predetonation.
Graphite reactors have a history of power production: see [1] and [2]. They might even have a future [3], but the German experience makes me think the prismatic-type core makes more sense [4].
The PWR used for power [1] is to first order "the same type of reactor used in submarines" in the sense that it is a thermal (in terms of neutron spectrum) reactor which is cooled by pressurized water.
The submarine reactor uses highly enriched uranium and, I believe, burns like a cigarette, which makes it easy to start and stop the reactor without worrying about [2]. It doesn't produce a lot of Plutonium because it doesn't have much U238 in it. I believe the fuel elements are horizontally oriented. It's built tough because it might get knocked around in a war.
The commercial reactor has vertically oriented fuel elements and tries to burn the fuel uniformly along the axis because this minimizes cost: minimizing cost is a matter of maximizing power for a given core size, and maximizing power is a matter of maximizing heat transfer out of the interior of the fuel rods and into the coolant, so you want to be producing and removing heat from the whole volume of the core. Temporal uniformity helps with spatial uniformity, so the reactor is not so good at starting and stopping, but it can follow loads [3].
Notably, the West (where natural gas is available) quit building coal-burning power plants at the same time it quit building nuclear power plants, partially because of pollution [4], and partially because a 100% steam-turbine powerset, as used in coal and nuclear plants, can't compete with a gas-fired combined-cycle power plant, which combines a very low capital cost gas turbine with a steam turbine that is about half the size (and cost).
The small LWR is a pipe dream because LWR economics improve as the reactor gets bigger. One reason why we can't build reactors on an N-th of a kind basis is that every time we build one we decide the economics weren't that good and they might get better if we scale the design up. Had NuScale ever built a plant they probably would have tried scaling it up too.
It is claimed small reactors could be easier to build but until somebody actually builds one, this claim is hard to take seriously.
One path that makes sense is to pick an optimized design for a large reactor and "get good" at building it, it could be an advanced Western design like the AP1000 or something like the Chinese Hualong One which is an improved version of the reactors that France built in the glory days. The Russians are really good at building the VVER, their version of the PWR.
The other path is to build a reactor that works at much higher temperatures such as the fast breeder reactor, high temperature gas cooled reactor, or molten salt reactor and couple it to a gas turbine powerset or use the heat to produce hydrogen directly. [5] This is still decades away.
[1] Pressurized Water Reactor, there is also a Boiling Water Reactor in which the water boils in the core but it's not a vast difference: these are both called LWRs (Light Water Reactor)
Nuclear power has always been heavily subsidized by the federal government. I'm not making a narrow point about bomb technology per se, I'm pointing out that the industry only exists because the government wanted it to exist. And that the government wanted it to exist because making bombs (and "nuclear" stuff more generally) was perceived to be an issue of national security. And once that perception shifted, the subsidies shifted and fission couldn't compete.
Your response amounts to "the government was mistakenly subsidizing civilian nuclear power". Which might be true, but doesn't refute the fact that they were.
The government spent money researching it early on, such as for the space program. Then the government outright subsidized solar installations on roofs, etc. Part of the solar miracle of the last 15 years has been huge demand-side subsidies by China because they want to corner the market.
So, sure, and you'd absolutely take those into consideration when deciding whether solar is cost-competitive, right?
That said, recognize that there's an important difference here. Virtually all solar subsidies are on the demand side of the equation. They pay homeowners to buy solar, and act to increase the size of the market. They don't make solar cheaper to produce (except via second order effects due to the resulting scale).
Nuclear wasn't like that in the 60's and 70's. The government was handing the checks to the reactor builders, not the consumers.
You keep moving goalposts. The US nuclear industry in its early decades was absolutely subsidized at the supply side, making reactors cheaper and creating the illusion that the technology was cost-competitive with fossil fuel generators, when it never was and almost certainly never will be.
Solar doesn't have that property. Solar is in most areas reasonably competitive already, and the subsidies work towards increasing the size of its market.
> illusion that the technology was cost-competitive with fossil fuel generators
The hope, then, was to quickly obtain an industrial breeder reactor in order to considerably reduce (or even eliminate) dependence on uranium providers (there is no known similar trick for oil/methane).
It failed, and it may be a major cause of the current nuclear demise.
It was also as a way to make the enormous expenditure on nuclear weapons and (even more so) nuclear delivery systems seem more acceptable, because it would have this putative valuable civilian spinoff. The space program had a similar motivation.
I'd note that externalities may change that calculus a bit. The pollution from coal/gas/oil is huge, but the cost isn't reflected in the price at the meter.
However, 99% of fly ash is captured in electrostatic precipitators and filters. Calculations that include the radionuclides in the captured ash are not being honest.
But we can ask: if this is the important issue justifying nuclear, which is more cost effective: adding pollution control to coal plants, or building nuclear plants?
If it's not the important issue justifying nuclear over coal, why bring it up?
Nuclear was the only viable green option back then. It failed because of mass psychosis and because it was more expensive than coal and gas.
Of course it's too late now. But had nuclear plant construction continued to grow at the same pace as in the 60s and 70s, CO2 emissions from power generation would have been solved by the 2000s.
> adding pollution control to coal plants
Even if it was 100% effective that would do nothing to slow down climate change.
The "mass psychosis" theory is comforting to the zealot, but not in accordance with the facts. Nuclear failed for economic reasons. In the US, these included: cost escalation (even before TMI), moderation of what had seemed inexorable growth in electricity demand, which increased risk, and the sudden introduction of new competition (non-utility generation unleashed by PURPA) that soaked up what demand growth there was.
Well yes, I did mention cost (only if we exclude most externalities, of course), which was the primary long-term factor.
But the mass psychosis after Three Mile Island and then Chernobyl made nuclear expansion very expensive, politically and financially (not saying, of course, that all of the additional safety regulations which inflated costs were unnecessary). It would be rather silly to claim that it didn't tip the scales at all.
I'm not sure it's fair to call it mass psychosis. But the connection to nuclear bombs was very clear to the public by then and the public backlash was also tainted by that association. Unfortunately, in the 70s/80s the problem of global warming was not obviously going to be a problem, if on the radar at all. Unfortunate timing.
You're spinning a comforting story to allow you to evade unpleasant realities. I would have thought "everyone is crazy but me" would have set off red flags for you, but apparently not.
I'm sorry, but you are the one doing the spinning. You keep putting words into my mouth and then ignoring everything else I said.
I repeated that cost (excluding externalities, of course, which hardly anyone was fully aware of back in the 80s) was the primary reason why nuclear failed. Do you really think that public opinion had no impact and did not accelerate it, though?
I've always found xkcd's radiation chart interesting: https://xkcd.com/radiation/ where it talks about coal vs. nuclear plants at the beginning. Maybe his sources would help. ISTR that lignite is often a bit radioactive, and burning so much of it would spread that through the leftover emissions? If you find a proper source, I'm interested too.
I didn’t mean radiation specifically. Rather overall damage due to pollution compared to the casualties in Chernobyl and Fukushima.
e.g. the NIH estimates that 460k excess deaths (in the US alone) between 1999 and 2020 were caused by emissions from coal plants. And that's the period when coal became relatively "clean".
So it must have been much worse back in the 70s and 80s.
The highest estimate for Chernobyl is 60,000. But 16,000 is probably more realistic.
The irony is how consistently environmentalists have campaigned for things that left the environment worse off by delaying any possible transition to nuclear power.
Renewables are already 30% of world energy generation and growing rapidly: a quick google says that 86% (!!!) of new capacity is renewable. This part of the climate puzzle is effectively already solved, and didn't need nuclear.
Transport fuel and combustion-based heavy industry are the hard nuts to crack, and reactors don't help there.
Yesterday, after wanting it for years, I finally found what looks like a good reference on what makes power plants so expensive. Most fusion-power proposals are based on using the fusion reactor to provide heat for a conventional thermal electrical power plant; https://www.eia.gov/analysis/studies/powerplants/capitalcost... gives a couple hundred pages of details about why those are expensive.
As I said in my comment at the time https://news.ycombinator.com/item?id=42850472, the cost of solar panels has dropped in recent years to the point where it's dramatically cheaper to generate electricity with solar than from a thermal power plant.
https://caseyhandmer.wordpress.com/2022/07/22/were-going-to-... is a proposal to generate solar (and wind?) power and use it to make synthetic 'natural gas' as the long term energy storage for when the wind is calm and the sunlight blocked. That can go into existing gas storage and pipelines, and be used to drive existing gas power stations.
We have long known how to make synthetic fuels. Germany, with limited access to oil in WWII, ran large parts of its war effort on synthetic fuels. South Africa, during its embargo years under apartheid, wasn't able to get [much] oil and so ran its economy on synthetic fuels. Even today many people use synthetic oil made with the same technology in their cars.
The problem has always been cost: it typically costs 5x more than conventional oil. Still, oil is a much denser store of energy than batteries, so for many applications it might be worth using despite the much higher cost.
Right! Long-distance aviation seems like an obvious application, though companies like Joby might take over the short-distance market with battery-powered airplanes.
Presumably most of the cost of synfuel has been in the energy inputs, right? And solar will drive the cost of intermittent energy down by at least an order of magnitude and possibly two.
As Handmer's post says:
> (...) up until this point no-one has considered scaling these up as a fundamental source of hydrocarbons, because doing so would be cost prohibitive. Why? The machinery is not particularly complex, but the energy demands are astronomical.
Yes, hydrocarbons have major advantages as a form of seasonal energy storage, and possibly you can burn them in cheaper, less efficient power plants. Round-trip efficiency is inevitably going to be pretty low with electrically generated fuels, but they still ought to be able to outcompete fossil fuels economically once production scales up.
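The low round-trip efficiency is easy to put numbers on. The stage efficiencies below are ballpark assumptions for a power-to-methane-to-power chain, not measured values:

```python
# Rough round-trip efficiency of electricity -> H2 -> CH4 -> electricity.
# All stage efficiencies are ballpark assumptions.
electrolysis = 0.70   # electricity -> hydrogen
methanation = 0.80    # hydrogen + CO2 -> methane (Sabatier process)
ccgt = 0.60           # methane -> electricity in a combined-cycle plant

round_trip = electrolysis * methanation * ccgt
print(f"{round_trip:.0%}")  # ~34%
```

Even at roughly a third round trip, if the input electricity is cheap surplus solar, the delivered cost of rarely-used backup power can still be competitive, which is the bet Handmer's post describes.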
Gas has the advantage that gas turbines are less inefficient than conventional steam, though ultra-supercritical steam (mentioned in the EIA paper) could conceivably take that away, and of course gas is easier to ship around over short distances.
What we really need are low-capex thermal power plants with high efficiency so we can turn the gas or kerosene or whatever into electricity in the rare case that we need to, without losing money on having massive amounts of capital investment sitting around idle 95% of the time.
You've got to have the whole fuel production, storage, and transportation system ready to go 100% of the time. That raises costs if you do it honestly; if you try to skimp, it might not work when you need it.
It might be okay if production and transportation only work some of the time if you can store up enough fuel to ride out the outages. That's how it works today for fuel transportation; gas stations commonly go without deliveries for days or weeks at a time when roads are snowed out, for example, and oil tankers and LNG tankers make their deliveries even less frequently.
In particular, solar and wind won't produce any fuel at night when no wind is blowing, and may not produce enough fuel during the rainy season or winter.
Right now it is solar and wind electricity that is cheap. E-fuels [1] and other forms of chemical storage [2] are a thing but might not be so cheap. Seasonal storage might use methane and avoid the legendary boondoggle that is Fischer-Tropsch [3], but methane is itself a powerful greenhouse gas. A really good methane handling system loses much less, but if you lose 1% of it you might as well be burning fossil fuels. Even hydrogen isn't completely benign [4], as it depletes atmospheric hydroxyl radicals and as a result prolongs the life of other greenhouse gases.
TCES is extremely cheap (megajoules per dollar of rechargeable capacity, compared to tens of kilojoules per dollar for batteries) and will likely be critical to the energy transition, but it won't run your servers overnight or your airplanes over the Atlantic. Things like cycle life are still an issue for TCES. And you need electric power to run your TCES system, so efficiency is still a concern.
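Those units are easy to mix up, so here is the conversion. The $/kWh figures are illustrative assumptions chosen to match the orders of magnitude quoted above:

```python
# Convert $/kWh of storage capacity into joules per dollar.
KWH_IN_J = 3.6e6  # one kilowatt-hour in joules

def joules_per_dollar(usd_per_kwh):
    return KWH_IN_J / usd_per_kwh

battery = joules_per_dollar(200)  # $200/kWh -> 18,000 J/$, i.e. tens of kJ/$
tces = joules_per_dollar(2)       # $2/kWh   -> 1.8e6 J/$, i.e. MJ per dollar
```

So the two claims differ by about two orders of magnitude in capacity cost, which is why TCES looks attractive for bulk heat even with its cycle-life and efficiency caveats.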
I feel like oxidizing methane to methanol shouldn't be rocket science, especially if you have an unlimited amount of energy to use. NADPH does it at scale already. A few more cheap process steps would give you nontoxic ethanol instead.
From Wikipedia's article on hypercanes:
A hypercane is a hypothetical class of extreme tropical cyclone that could form if sea surface temperatures reached approximately 50 °C (122 °F), which is 12 °C (22 °F) warmer than the warmest ocean temperature ever recorded.[1] Such an increase could be caused by a large asteroid or comet impact, a large supervolcanic eruption, a large submarine flood basalt, or "incredible" global warming.[2] There is some speculation that a series of hypercanes resulting from the impact of a large asteroid or comet contributed to the demise of the non-avian dinosaurs.[3] The hypothesis was created by Kerry Emanuel of MIT, who also coined the term.[4][5][3]
Even if fusion reactors can be made reliable, they'll still require a huge amount of investment with a payoff decades into the future. Meanwhile, you buy some cheap solar panels and can get returns on your money almost immediately.
Yes. Laypeople often forget about this. A big reason we're not building fission plants is that you won't get anything back for a decade or more. And by that time, will it even still be competitive?
I wish SMRs got more attention and focus. The UK has enough wind resources to power all four nations and is on course to double its current 30% of electricity from wind to 60% by ~2030. I don't see why it can't be 90%+ by 2040, barring policy, politics, and human factors.
All we need then are SMRs for baseload, while we work on energy storage, which could be batteries or dams.
I'm not an expert, but I actually feel like it's not terrestrial power generation where fusion would be useful. We have plenty of ways of generating power here, including a massive fusion reactor up in the sky which will continue operating for the foreseeable future :)
Instead, I think we need to find some way of generating a lot of power over extended periods of time if we want to crack space travel beyond lobbing items into Earth orbit, particularly human space travel.
We need significantly more energy density than what chemical motors can provide, and nuclear is the easiest and most "within-reach" that we know of. Things like anti-matter and even more exotic forms of propulsion are pure sci-fi for the time being (in terms of practicality). I suppose fission could work in a very primitive way of basically detonating small nukes behind a pusher plate, but fusion feels so much more elegant (at least to a layman). Mostly in the sense of not having to use effectively the same nukes we stick in weapons. Something tells me that this will never fly (excuse the pun), for obvious reasons.
You are implying ion drives, I assume. Ion drives work by accelerating particles to really high exhaust velocities (tens of km/s, far beyond chemical rockets) and using that to generate thrust. That means they require very little propellant but A LOT of power for the amount of thrust generated.
A fusion power generator coupled with ion drives could potentially generate a lot of thrust for little mass since the fusion fuel also has very little mass.
He might be talking about a "direct fusion drive," which is not an ion drive. It's basically a magnetic-confinement device but you leave part open in such a way that fusion products exit to produce thrust.
In fact I meant it in a general sense: both an ion drive and a more classic confinement-engine-like design that expels the fusion products. I don't know enough about the intricacies to argue for one vs. the other, but fusion fuel is the most energy-dense fuel we have within reach, by a huge margin. The variable then becomes the mass of the engine (and supporting plumbing), but we have some control over that.
Can we shrink a fusion reactor small enough to make the rocket as a whole feasible? I don't know. My hope is that for space travel we can make certain tradeoffs that are not feasible on Earth, e.g. expelling radioactive waste products into the environment. A rocket engine doesn't need to be all that efficient at generating electricity either; it mostly needs to be efficient for propulsion, which might make its design much simpler.
It's difficult to see how DT fusion could be better in space than alternatives. DT fusion reactors will be extremely large and complex. They will be much heavier than and much less reliable than alternatives. That's not good for space applications.
The high energy density of the fuel is irrelevant if the reactor is very massive and if only a tiny fraction of the reactor's mass in fuel can be fused over the lifespan of the reactor. IIRC, it would take something like 700,000 years for ITER to fuse its own mass in fuel, if it could be operated continuously forever (which it can't, of course.)
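That order of magnitude checks out from public figures. The machine mass and power level below are rough numbers, and continuous operation at full fusion power is assumed:

```python
# Order-of-magnitude check: how long would ITER take to fuse its own mass in fuel?
MEV_IN_J = 1.602e-13
AMU_IN_KG = 1.661e-27

fusion_power_w = 500e6                   # ITER design fusion power (500 MW)
energy_per_reaction = 17.6 * MEV_IN_J    # D + T -> He-4 + n releases 17.6 MeV
fuel_per_reaction = 5 * AMU_IN_KG        # one deuteron (2 u) + one triton (3 u)

burn_rate_kg_s = fusion_power_w / energy_per_reaction * fuel_per_reaction
iter_mass_kg = 23e6                      # machine mass, ~23,000 tonnes
years = iter_mass_kg / burn_rate_kg_s / (3600 * 24 * 365)
print(f"{years:,.0f} years")             # roughly half a million years
```

The burn rate works out to about 1.5 mg of DT per second at full power, against a machine massing tens of thousands of tonnes, which is the point about fuel energy density being irrelevant to spacecraft mass budgets.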
You are correct, but I wonder if space travel applications allow us to take certain liberties with the reactor design that are not feasible down on Earth. E.g. a simple(r) confinement design that expels the fusion products could work well in space, but is a non starter on Earth for obvious reasons.
I'm not sure how much of the design on Earth is bulky and cumbersome because it needs to be vastly safer and more complicated than what a rocket engine needs to be. Taken to the extreme, nuclear propulsion in space can be very dumb: just detonate nukes some distance behind the rocket and ride the shockwave with a pusher plate. It's a radically different design than anything we'd attempt for terrestrial applications.
Near-earth space has an abundance of solar energy convertible to electrical energy. Deep space is an interesting potential use case for fusion, but fission-powered electric space propulsion via RTGs has the virtue of having demonstrated nearly half a century of continuous operation.
(and after it leaves the earth's atmosphere, radioactivity ceases to be a concern)
As far as I know there has never been an RTG-powered electric rocket engine. RTGs are extremely expensive per unit of power output. Spacecraft like Dawn, the ion-engine-propelled probe sent to Ceres, use solar power.
Voyager and Pioneer used RTGs, as well as some more recent longer range missions and the planned Dragonfly to Titan. Agree that solar has much better unit economics, but I don't see any reason why the same wouldn't be true of fusion too.
And none of those use RTGs to power electric rocket engines.
(EDIT: thrusters are rocket engines, just very small ones. All those missions used small chemical rockets for maneuvering, using storable propellants like hydrazine.)
Edit: I stand corrected on Voyager/Pioneer exclusively using hydrazine, though you could get similar thrust from a modern HET without needing insane amounts of RTG power, and run other electrical equipment off that extra power when you've reached target velocity and aren't firing the thrusters for attitude control. The higher Isp of the EP system is an advantage too...
I don't want to undermine your enthusiasm, but unfortunately even fusion-based energy generation looks like sci-fi. There is a reason why certain things happen in objects of a certain mass. You can't cheat physics.
Neutron corrosion alone is a very serious problem that I doubt can be fixed. There is only one material that can withstand it, called neutronium, and we all know why we can't even use it on Earth, let alone produce it. The Li-7 trick they are currently trying does not convince me. What's the point if you need to stop the reactor every few years to replace the entire Li-7 shielding? Lithium itself isn't cheap, and we would basically destroy it in such reactors (it's non-recyclable).
The hope lies in fission reactors, breeder ones. Unfortunately, there is a big taboo here because those can be used for enrichment, and so for nuclear weapons.
The materials question is interesting to me because my Masters is mostly related to Material Science. My thesis advisor did some research in this area, mostly with Zr3Al as I recall. My thesis was with a related material (NiAl) for potential uses related to turbines. The aluminide alloys never really worked out for a variety of reasons. My advisor ended up pretty much doing water-ice-related research full-time (which he had just started doing when I was a grad student) and that's pretty much what he spent his career doing. The US Army has a lab in Hanover NH and, at the time, they were very interested in better understanding the properties of ice, in part, presumably, because of the possibility of a land war with the Soviet Union in Alaska.
> The neutron corrosion alone is very serious problem that I doubt can be fixed.
It can be fixed, because some fuels don't produce neutrons. But aneutronic fusion is something like two orders of magnitude more difficult than deuterium-tritium. Still, it might be necessary for any reasonable application. One of my professors was a big proponent of aneutronic fusion, even for tokamaks.
The significant issue is that DT fusion reactors are very large compared to fission reactors. If we look at gross nuclear power output per unit volume, comparing a fusion reactor to a PWR primary reactor vessel, the comparison is stark. ITER, for example, has a power density 400x worse than a PWR; the 2014 ARC design is 40x worse.
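The ~400x figure can be sanity-checked with rough numbers. The volumes below are loose assumptions (whole-device envelope for ITER, primary vessel for a PWR), so treat this as an order-of-magnitude sketch, not a precise comparison:

```python
# Illustrative volumetric power density comparison.
# All volumes here are rough assumptions, not engineering figures.
iter_power_mw = 500.0
iter_device_volume_m3 = 18000.0    # cryostat envelope, ~28 m dia x ~29 m tall
iter_density = iter_power_mw / iter_device_volume_m3    # MW/m^3

pwr_power_mw = 3400.0              # thermal output of a large PWR
pwr_vessel_volume_m3 = 300.0       # assumed primary vessel volume
pwr_density = pwr_power_mw / pwr_vessel_volume_m3

print(f"ITER: {iter_density:.3f} MW/m^3, PWR vessel: {pwr_density:.1f} MW/m^3")
print(f"PWR/ITER density ratio: ~{pwr_density / iter_density:.0f}x")
```

With these assumed volumes the ratio comes out in the low hundreds, consistent with the order of magnitude quoted above.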
This large size has devastating implications. The nuclear island of a fission power plant is maybe 12% of the cost of the plant. Increase that cost by an order of magnitude and you've doubled the cost of the plant. Commercial nuclear power is already not competitive; double the capex and it becomes ludicrously uncompetitive.
If someone presents you a fusion proposal, the first thing you should ask is "what's the volumetric power density of your reactor?"
If you look at the arguments purporting to show DT fusion could have a chance they do things like assume they can build everything for 2x the cost of materials. Yeah, good luck with that, especially when that cost estimation technique would give wildly low results if you applied it to fission power plants, which are much less complex.
As I see it, there's only a couple of ways fusion might make it. The first is an approach that makes minimizing reactor size the primary goal. This would involve very small plasma configurations and one where all plasma-facing surfaces are covered with thick layers of flowing liquid lithium so areal power density can be very high, at least an order of magnitude higher than most concepts. Among DT efforts, only Zap looks like it might have a chance here (but there are parts on one end of the reactor that are still exposed to neutrons, unshielded.) The experience with liquid sodium in fission reactors should give one pause about making this work, though.
The other approach would be to avoid the non-nuclear parts that fission power plants have by exploiting direct conversion of plasma energy to work, so the apples-to-apples comparison of heat sources does not apply. Helion is the front runner here, using more advanced fuels that put most of their energy into the plasma, not into neutrons. I consider Helion the least dubious of all current fusion efforts.
Also relevant: The Trouble With Fusion [1]. It's from 1983, but could well have been written today, as nothing has really changed in the grand scheme of things. In short, there are multiple technical and economical challenges that stand in the way of nuclear fusion becoming a viable path towards clean energy, even after the first successful demonstration of a self-sustained burning plasma. Which, by the way, still seems about 10 years out, with the latest round of delays on ITER. I think most of the people involved understand this, but there is ongoing political will to sink billions of dollars into these projects, so here we are.
Lidsky is long dead (as are Pfirsch and Schmitter, who had similar critiques in Germany), but I like to think he'd have been a supporter of Helion. Helion's approach is close to what Lidsky advocated (aneutronic fusion), at least before Lidsky's student Todd Rider showed truly aneutronic fusion was not likely to work. However, the Helion approach has greatly reduced neutronicity, particularly of energetic DT neutrons, and exploits the kind of non-Maxwellian plasmas w. energy recovery/recirculation that Rider was analyzing.
All well and good, but Commonwealth Fusion and Tokamak Energy are building large test reactors as we speak in hopes of answering these exact questions. These aren't science projects, but tech demonstrators designed as a stepping stone to commercial fusion. The points raised in the article are valid, but can only be tested by building reactors, which is currently happening at the above companies.
Agreed, but the problem isn't about funding fusion research - which isn't a bad thing IMO - more that the UK government seem to believe that it's "coming soon". The concern is that they're basing future energy policy on a technology that may not be a practical power source for generations (or ever).
See also their recent commitment to using AI. Funding AI research and supporting AI startups: sure. Basing policy on the fact that AI can replace civil servants and other workers: there should probably be some evidence of that first.
Worth noting that the author of this letter works at Culham Centre for Fusion Energy, where JET is located. 'Abingdon' was a clue but they could have filled in the detail.
There’s good old #4 again: fusion processes inevitably create extreme neutron fluxes capable of turning everything they touch into radioactive waste. Meaning the reactor vessel and all involved fluids and support structures for handling thereof.
Congrats! All the hassle of fission with more steps. Proceed to scream at me!
Yeah, that's something I've been wondering about for a while. Sure you can get Fusion to create energy eventually, but the engineering to keep the reactor running long enough to break even on cash sounds even more difficult. Anything involving hot plasma and large amounts of ionizing radiation is going to play hell with your equipment. That's not even considering the secondary nuclear waste created by the neutron, proton, and alpha radiation.
It just seems like a great way to turn a billion dollars into a pile of extremely hot, radioactive, radiation embrittled metal.
There is also fusion with low neutron flux [1] (among the others: Helium 3) but it is even more difficult to achieve. I'm not confident that any of us will ever see a commercial fusion reactor. I'm afraid that all we can do to harness fusion energy directly is by the means of solar panels.
There are a few things I don't get of the fifth point: why do we need to be able to operate nuclear fusion plants remotely? Wouldn't it be not just possible, but maybe even preferable, to have staff on site to manage the plant, like it is the case for fission plants? And if we still want remote control, why is this such a big challenge? With a background in computer engineering and cybersecurity, this seems to me the smallest challenge of the list.
I think they mean that we shouldn't have to send a guy inside the reactor to clean it and reset it every time you fire it. I don't really know why they listed this point.
Seeing it from this perspective, maybe I now see it: the residual radioactivity inside the reactor may be strong enough to mess with the electronics of a robot you're sending inside. Maybe there are challenges in getting enough shielding while keeping the robot reasonably sized and maneuverable.
A major challenge is getting the robot into the place you need it, along with other objects it needs like replacement parts, in between the other infrastructure like coils and cooling blankets. You mustn't damage the delicate reactor structures, you can't have rails and so on always in the reactor vessel, the vessel has to be vacuum tight, and most of all you must not allow the robot to fail and be unextractable, because humans can't go in and pull it out.
ITER seems to have a project called the Agile Robot Transporter which you can see some renders of that demonstrate how fiddly this problem is.
Remote handling might refer to robotics or remote control done on site. Fusion does produce neutrons that will generate some radioactivity in structures and so on. Minimising the need for humans to be close to these is a reasonable move.
It's more to the point that disruptions tend to be very disruptive, too frequent, not well understood, and the reset takes a really long time. But you're right that this does not quite belong. If we solve the technological problems, the maintenance should resolve itself.
is it remote from the reaction chamber? i.e. if people have to go in to wipe/re-coat stuff, it takes longer to do it safely, but it's cheaper/faster if you can get a machine to do it with the people controlling it from a distance. still, doesn't seem terribly hard, when surgery has been done with remote controlled robots.
Yes, but just throwing in the possibility of a future unexpected breakthrough without grounding this in the situation so far can lead to wishful thinking. The fact that the list has 5 independent points means that you need 5 independent breakthroughs, making it even less likely to happen in the short term.
For a DeepSeek moment, the west would first have to produce a functional, but Wildly expensive fusion reactor. Then someone else could ask the reactor questions over and over until they understand well enough how the reactor works.
The DeepSeek moment for nuclear power might be accelerator-driven subcritical reactors that make fission plants much simpler and safer (and therefore cheaper, the key element of DeepSeek moment) by never having a critical mass of fuel in the reactor - criticality is only possible when a particle accelerator is illuminating the core, and the accelerator can just be turned off.
I agree with all this. People make two big mistakes when evaluating fusion:
1. The fuel is free or cheap, depending on the fuel; and
2. Stars can do it so surely we can.
So let's consider a mythical plant that costs $30B to build, has a lifetime of 30 years, costs $3B/year in maintenance and produces 1000MW of power. And the fuel is free. Is the electricity free? Not even close. That's ~8.8TWh per year at full capacity. No plant runs at full capacity 24x7 so let's call it 4TWh. That's about $1/kWh. That's ludicrously uneconomical. Of course you can throw in vastly different numbers and get vastly different results, but the main point is that fuel cost is a non-issue.
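Spelling out that arithmetic (all inputs are the hypothetical figures above, plus an assumed capacity factor; nothing here is real plant data):

```python
# Back-of-the-envelope cost per kWh for the hypothetical plant above.
# All figures are the made-up inputs from the comment, not real data.
capex = 30e9             # $30B to build
lifetime_years = 30
opex_per_year = 3e9      # $3B/year maintenance
power_mw = 1000
capacity_factor = 0.46   # assumed, to land near 4 TWh/year

annual_kwh = power_mw * 1000 * 8760 * capacity_factor    # ~4e9 kWh (~4 TWh)
total_cost = capex + opex_per_year * lifetime_years      # $120B over the lifetime
total_kwh = annual_kwh * lifetime_years

print(f"Annual output: {annual_kwh / 1e9:.1f} TWh")
print(f"Cost per kWh: ${total_cost / total_kwh:.2f}")
```

Note that even if capex were zero, the $3B/year maintenance alone on 4 TWh/year is $0.75/kWh, which is the point: fuel cost isn't what's killing this.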
And as the article points out, in a D-T reactor, you still need to produce tritium because it's basically not naturally occurring in a way that can be gathered. That means a fission breeder reactor. That fuel isn't free.
Second, "the Sun does it" misses the point. As the article correctly points out, damage from neutrons ("neutron embrittlement") is a huge problem. Beyond that, every high-speed electron that escapes the plasma represents energy lost from the system.
tl;dr: A British politician suggested we are "within grasping distance" of using fusion for wide-scale power generation; the letter published in the Guardian explains how significant challenges remain both with the core process and surrounding considerations. (i.e. letter explains why we are "30 years away" at least, not 3 years away)
Nuclear fusion has been 3 decades away since as far as I can remember (I believe I first read about it in National Geographic in the late 70s or early 80s)
China also has an order of magnitude more land available, and a fascist government. Arguing over squeezing a new runway into a built-up area is a good thing.
The hard part here is twofold: several orders of magnitude of improvement (in efficiency) are needed for it to work, and the learning curve in fusion is not a curve but more like a staircase with very tall steps.
ITER is gearing up to be a great disappointment
I really don't feel like magnetic confinement is the way to go, unless we get some new superconducting technology and need to create electromagnets with 100x their current strength
We actually do have commercially available superconductors that support much stronger magnetic fields than what ITER is able to do.
I don't think it's 100x, but it doesn't have to be. Tokamak output scales with the fourth power of magnetic field strength. Double the field, 16X the energy output. CFS and Tokamak Energy are building tokamaks with these.
Tokamak output also scales with the square of reactor size, which is why ITER is so big. These new reactors can be much smaller for the same output, so they'll be much faster and cheaper to build. The CFS design is about the size of JET, which was built in four years, with three of those being just for the building.
It can't be 100x. In designs like ARC the magnets are up against limits not from the superconductors, but from the structures needed to resist JxB forces. The majority of the mass of the ARC reactor is in these steel structures. Much progress beyond that would involve significant increases in strength of materials, and even then the strength needed scales as B^2.
100x ARC would be ~2000 T. The magnetic pressure of such a field would be about 100x the detonation pressure of TNT. Experimentally, fields of such magnitude could only be produced very briefly in devices that near-instantly explode (for example, devices driven by intense laser pulses that produce large charge separations.)
> Tokamak output also scales with the square of reactor size
ITER is big because it wouldn't work at all if it were smaller (at the magnetic fields involved). Output scaling as (linear dimensions)^2 means you want your reactor to be as small as possible, to maximize power/volume. Square-cube law in operation. The higher magnetic field in HTSC tokamaks allows then to be smaller. Volumetric power density is still poor, though, just not as terrible as ITER.
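Taking the rules of thumb quoted in this thread at face value (output ~ B^4 and ~ R^2, which is a simplification, not a full scaling law), a quick comparison of an ITER-scale machine against a compact high-field design looks like this. The field strengths and radii are approximate public figures, treated here as assumptions:

```python
# Toy scaling comparison using the rules of thumb quoted above:
# fusion output ~ B^4 * R^2 (a simplification, not a full scaling law).
def relative_output(b_tesla: float, r_meters: float) -> float:
    return b_tesla**4 * r_meters**2

# Illustrative parameters (approximate public figures, treat as assumptions):
iter_like = relative_output(b_tesla=5.3, r_meters=6.2)     # ITER-scale
compact   = relative_output(b_tesla=12.0, r_meters=1.85)   # SPARC-scale HTS design

print(f"Compact/ITER output ratio: {compact / iter_like:.2f}")
print(f"Compact/ITER volume ratio (~R^3): {(1.85 / 6.2)**3:.3f}")
```

Under these assumptions the compact machine produces comparable or greater output from a few percent of the volume, which is the whole argument for high-field HTS tokamaks: better, though still not great, power density.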
I don't see any scientists lying mentioned in the article, where did you get that takeaway? We see policymakers ignore competent advice on all fronts all the time, I wouldn't be surprised if that's the case here too.
They listed 5 challenges but forgot probably the biggest one: make it cheaper than the alternatives. That might never happen even if we do solve all the material and technical problems.
Disclaimer: I did my master's on fusion reactor simulations.
This, it's mostly an economic challenge. We need clean energy that is cheap. Not clean energy that is more expensive.
It's important for clean energy to be cheap because that creates an economic incentive for people to invest in switching.
The good news with renewable energy and battery storage is that we're basically there. Texas, which produces lots of oil, also produces lots of clean energy. Oil refineries require a lot of energy to refine their fuel, and burning your own premium-priced product when clean renewables do the same job would be madness. Think about it: the very companies that expect others to buy and burn their fossil fuels are investing (heavily) in renewable energy, because that's cheaper than using their own supply of oil and gas for the same purpose. It's more lucrative to sell it than to use it themselves.
That's why records are being broken with amounts of solar, wind, and battery coming online on a regular basis. It's cheaper. So, people do more of it as fast as they can scrape the funding together to do it.
Fusion, if it happens at all, is probably going to be really expensive for quite some time. Nuclear has the same problem: clean, but super expensive, and slow to plan and execute. You need to decide decades in advance that you need it in order to have it ready. Most reactors coming online this year (hint: not a lot) were planned when solar and wind were still expensive, one to two decades ago. In other words, they probably did not take into account current price levels of new clean energy. In terms of profitability these things are kind of dead on arrival. Fusion faces the exact same challenges. It's not completely hopeless, but we need a 1-2 order of magnitude reduction in cost per unit of energy generated this way.
"Clean" is arguable for fusion.
A fusion reactor is going to create and consume large quantities of tritium, which is difficult to control because it is an isotope of hydrogen and hydrogen can go right through metals. It might not really be a problem but it is always going to be in the paper that somebody detected tritium around a reactor.
Also, because fusion reactors produce large quantities of neutrons, you could put U238 or Th232 into the blanket and breed either Pu239 or U233. Back in the 1980s some engineers thought that a hybrid fusion-fission system could have better economics than either alone if you used the bred fuel in thermal reactors. Those materials could just as well be used to make bombs, although, unlike a fast breeder cycle, a fusion reactor doesn't require plutonium processing technology in order to work.
---
As for solar and wind being "cheap" don't believe the hype. I mean, it is cheap if you don't mind that it is only available when the sun shines and when the wind blows. You see estimates for solar + storage that are deceptive and thought-stopping like Case 17 of [1] (an hour and a half of storage is significant, but it is not comparable to the 24 hour service that fossil fuels and nuclear can provide.)
I haven't seen a detailed analysis of the generation + transmission + storage costs of a 100% renewable system published in the last decade. If you try a back of the envelope calculation it usually winds up worse than Case 9 from that report, except that Case 9 is optimistic because (so far) it always costs a lot more than you estimate to build a nuclear plant. [2]
[1] https://www.eia.gov/analysis/studies/powerplants/capitalcost...
[2] to get some sense of how bad the problem is you need: (a) to get through the night (12 hours of storage? still might beat AP1000...) and (b) get through extended periods of bad weather (2 weeks of storage?) and (c) get through the winter (overbuild solar 2-3x? otherwise 6 months of storage?)
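A crude version of that back-of-the-envelope, with the battery price as an explicit (and debatable) assumption:

```python
# Crude storage cost sizing for 1 GW of average load.
# All inputs are assumptions for illustration, not market data.
load_kw = 1.0e6          # 1 GW of average load, expressed in kW
cost_per_kwh = 100.0     # assumed installed battery cost, $/kWh

def storage_cost_billion(hours: float) -> float:
    """Cost of enough batteries to carry the load for `hours`, in $B."""
    return load_kw * hours * cost_per_kwh / 1e9

for label, hours in [("one night (12 h)", 12),
                     ("bad-weather spell (2 weeks)", 14 * 24),
                     ("seasonal (6 months)", 182 * 24)]:
    print(f"{label}: ${storage_cost_billion(hours):.1f}B")
```

The diurnal case comes out in the low billions per GW, but the two-week and seasonal cases are tens to hundreds of billions, which is why overbuilding generation or long-duration alternatives (e-fuels, transmission) enter the picture.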
Here's a case study for Denmark:
https://www.sciencedirect.com/science/article/pii/S030626192...
About the worst you can find in terms of small geographical area and having to deal with both Dunkelflautes and arctic conditions spilling south which often leads to -5C to -15C and near zero wind.
That, of course, excludes the fact that storage is plummeting in price: the latest Chinese auctions, with bids submitted in December 2024, landed at $62/kWh installed and serviced for 20 years.
https://www.ess-news.com/2025/01/15/chinas-cgn-new-energy-an...
> I haven't seen a detailed analysis of the generation + transmission + storage costs of a 100% renewable system published in the last decade.
Australia updates their GenCost every year, which feeds into their Integrated System Plan. The draft of this year's is up for public comments:
https://www.csiro.au/en/research/technology-space/energy/gen...
> In this report, where we make a comparison between the costs of variable renewables such as solar PV and wind and the costs of other technologies we include the cost of firming those renewables which we call integration costs. These are the additional costs of ensuring supply is reliable when using intermittent energy sources. These integration costs are itemised in the report and include storage, transmission, system security and spilled energy
Handling the general problem is made much easier with e-fuels to handle very long term storage and rare outages. The cost can be estimated and isn't that high, even if the round trip efficiency of storage with e-fuels is not very good.
BTW, the battery cost estimates there in that link seem very high, not reflecting the recent collapse in LFP prices from China.
> This, it's mostly an economic challenge. We need clean energy that is cheap. Not clean energy that is more expensive.
Cheap and expensive in dollar terms is meaningless. Dollars are numbers in computers. Joules and watts are real.
What I can agree with is that we need energy with good enough EROI, and then we can adjust the policies regarding fusion, nuclear, and fossil joules, watts, and their related numbers in computers.
You are right about the policy shift. But I would suggest we stop throwing good money after bad until the sector massively improves the accuracy of the numbers it publishes on things like budgets and timelines for new projects.
Because most of the numbers coming out of the nuclear sector for this are too good to be true. To put it mildly. It's tempting to use more harsh language here. Like "fabricated nonsense" or "a scam". And some people do.
Probably the biggest challenge for nuclear proponents aside from demonstrably high cost, overruns, wildly optimistic timelines, etc. for past projects is the credibility of their predictions about cost for future ones. The lower those predictions, the smaller the chance that those are anything more than wishful thinking mixed with a lot of hopium and based on very flawed assumptions.
When cost overruns in nuclear projects are routinely measured in hundreds of percent, something is very wrong. It would be nice to get that down to two-digit percentages. That would still be terrible, but a lot better than the industry norm, especially compared to renewable projects.
This credibility challenge is a large part of the reluctance of investors to fund new efforts in this space. Most of the projects that actually still happen feature governments and tax payers taking most of the risk. Which makes it more likely that the huge budget gaps get funded rather than resulting in investors cutting their losses and bailing.
If you want numbers, look at numbers of GW coming online for nuclear, wind, solar, and energy storage projects. There's a stark difference there. Renewables are outpacing nuclear at such a pace that you could legitimately wonder why to bother with such projects at all.
> Renewables are outpacing nuclear at such a pace that you could legitimately wonder why to bother with such projects at all.
Indeed, especially as renewable will more and more eat into nuclear 'baseload', lowering its load factor and therefore raising its total production cost. https://www.youtube.com/watch?v=udJJ7n_Ryjg
Fully agreed. It took us from 1944 to 1965 in fission to go from "ok it makes net energy" to "ok we can make electricity on the commercial market within the realm of the existing alternatives". Fusion is still pre-1944 in that they haven't even demonstrated getting net energy out, let alone put together an electricity-generating machine. I have no doubt at all that we'll make fusion-derived electricity in my lifetime. But I have no idea whether or not we'll make it cheap.
They say the fuel is cheap. It's not (tritium and He-3 are hard to come by), but even if it were, keep in mind that a fleet of uranium-fueled fission reactors like France's spends only about 5-10% of its costs on fuel.
Breeding is a counter-example. Fermi envisioned it in the early 1940's.
Lab breeder reactors emerged during the 1950s and were performing satisfactorily by the 1960s.
Big projects aiming at obtaining an industrial reactor were aplenty, worldwide, during the 1970's through the 1990's.
They all failed, and remaining efforts are quite sparse. https://en.wikipedia.org/wiki/Breeder_reactor#Notable_reacto...
Obtaining something adequate (raw size, reliability, safety, ease of use...) isn't always easy.
The quest of an industrial nuclear fusion reactor may tank this way.
"Within the realm" involves some externalities though. Fission reactors have never been particularly cost competitive with other electrical generation technologies. They were subsidized heavily because people wanted reactors to build bombs, and more broadly to cultivate a technology base of people who could build and maintain reactors to build bombs. Once we had enough bombs, that subsidy got rolled back and fission fell off the leaderboard.
If you spend some time reading the early 1950s documentation of the nuclear industry, you'll quickly see that commercial nuclear reactor development was very much decoupled from the graphite- or heavy-water-moderated bomb-making reactors. They separated around 1953 and never came back together. Back then, the fledgling industry was wondering if it could sell excess plutonium for bombs, but the government said they were not interested in buying it. They said they had more than enough coming from their own specialized bomb-making reactors.
Certainly nuclear tech was subsidized originally by the Manhattan project, but very soon after, bomb making and nuclear power went their separate ways.
You could much more strongly argue that the nuclear reactor development made to power submarines helped develop the commercial nuclear power industry. This checks out much more cleanly, as Shippingport was led by Rickover himself.
That is a bit revisionist. Most of the investment went toward reactor technology and research that had the dual purpose of generating power while also being aligned with generating plutonium. That's why the type of reactors used in nuclear submarines was never commercialized.
They were simply too expensive for civilian use, and they never had a dual use that helped produce material for nuclear bombs. Commercial reactors today are based on minor variations of the fission designs that were also strategically relevant for getting plutonium: different reactors, but the same mineral supply chains and a lot of overlap in R&D. As soon as the Cold War ran to an end, funding for all that fell off a cliff, including building new nuclear power plants. Coal and gas plants were vastly cheaper to build, and they still are. And back then, nobody cared about pollution.
Only now that people are interested in small reactors are these decades old fission reactors getting more attention.
The PWR was and is used in submarines. It led to Shippingport which led to Yankee Rowe and Indian Point 1, which launched the commercial nuclear power industry.
Conversely the graphite moderated Hanford reactors that made plutonium never led to commercial reactors. Closest you got was Hanford N but that was kind of a one-off.
If it matters, I'm a professional nuclear engineer with a lifelong interest and expertise in reactor history.
The PWR (and related BWR) are bad at weapons plutonium production for two reasons: (1) there aren't a lot of excess neutrons to transform U238 -> Pu239; graphite reactors can be tuned up to make about twice as much Pu. (2) Either kind of LWR runs under pressure and it would be a big deal to shut the reactor down, bleed the pressure, and change the fuel; most other reactor types can be refueled online. The longer Pu239 stays in the reactor, the more of it absorbs another neutron and becomes Pu240, which is bad for nuclear weapons because it creates a neutron background that causes predetonation.
Graphite reactors have a history for power production: see [1] and [2]. They might even have a future [3] but the German experience makes me think the prismatic type core makes more sense [4]
[1] https://en.wikipedia.org/wiki/Advanced_gas-cooled_reactor
[2] https://en.wikipedia.org/wiki/RBMK
[3] https://www.world-nuclear-news.org/Articles/Chinese-HTR-PM-D...
[4] https://inis.iaea.org/records/81760-q0060
The PWR used for power [1] is to first order "the same type of reactor used in submarines" in the sense that it is a thermal (in terms of neutron spectrum) reactor which is cooled by pressurized water.
The submarine reactor uses highly enriched uranium and, I believe, burns like a cigarette, which makes it easy to start and stop the reactor without worrying about [2]. It doesn't produce a lot of Plutonium because it doesn't have much U238 in it. I believe the fuel elements are horizontally oriented. It's built tough because it might get knocked around in a war.
The commercial reactor has vertically oriented fuel elements and tries to burn the fuel uniformly along the axis because this minimizes cost, because minimizing cost is a matter of maximizing power for a certain sized reactor core, and maximizing power is a matter of maximizing heat transfer out of the interior of the fuel rod and to the coolant, so you want to be producing and removing heat from the whole volume of the core. Temporal uniformity helps with spatial uniformity so the reactor is not so good at starting and stopping, but it can follow loads [3]
Notably the west (where natural gas is available) quit building coal burning power plants at the same time it quit building nuclear power plants, partially because of pollution [4], partially because a 100% steam turbine powerset as used in coal and nuclear plants can't compete with a gas-fired combined cycle powerplant which combines a very-low capital cost gas turbine with a steam turbine that is about half the size (and cost)
The small LWR is a pipe dream because LWR economics improve as the reactor gets bigger. One reason why we can't build reactors on an N-th of a kind basis is that every time we build one we decide the economics weren't that good and they might get better if we scale the design up. Had NuScale ever built a plant they probably would have tried scaling it up too.
It is claimed small reactors could be easier to build but until somebody actually builds one, this claim is hard to take seriously.
One path that makes sense is to pick an optimized design for a large reactor and "get good" at building it, it could be an advanced Western design like the AP1000 or something like the Chinese Hualong One which is an improved version of the reactors that France built in the glory days. The Russians are really good at building the VVER, their version of the PWR.
The other path is to build a reactor that works at much higher temperatures such as the fast breeder reactor, high temperature gas cooled reactor, or molten salt reactor and couple it to a gas turbine powerset or use the heat to produce hydrogen directly. [5] This is still decades away.
[1] Pressurized Water Reactor, there is also a Boiling Water Reactor in which the water boils in the core but it's not a vast difference: these are both called LWRs (Light Water Reactor)
[2] https://en.wikipedia.org/wiki/Xenon-135
[3] https://www.oecd-nea.org/upload/docs/application/pdf/2021-12...
[4] https://en.wikipedia.org/wiki/Acid_rain (SO2 from coal burning plants, ironically, masked the warming effects of CO2 until we cleaned up coal plants)
[5] https://en.wikipedia.org/wiki/Generation_IV_reactor
Nuclear power has always been heavily subsidized by the federal government. I'm not making a narrow point about bomb technology per se, I'm pointing out that the industry only exists because the government wanted it to exist. And that the government wanted it to exist because making bombs (and "nuclear" stuff more generally) was perceived to be an issue of national security. And once that perception shifted, the subsidies shifted and fission couldn't compete.
Your response amounts to "the government was mistakenly subsidizing civilian nuclear power". Which might be true, but doesn't refute the fact that they were.
There have been plenty of subsidies for solar.
The government spent money researching it early on, such as for the space program. Then the government has outright subsidized solar installations on roofs, etc. Part of the solar miracle of the last 15 years has been huge demand-side subsidies by China because they want to corner the market.
So, sure, and you'd absolutely take those into consideration when deciding whether solar is cost-competitive, right?
That said, recognize that there's an important difference here. Virtually all solar subsidies are on the demand side of the equation. They pay homeowners to buy solar, and act to increase the size of the market. They don't make solar cheaper to produce (except via second order effects due to the resulting scale).
Nuclear wasn't like that in the 60's and 70's. The government was handing the checks to the reactor builders, not the consumers.
China subsidizes supply, the US subsidizes demand. Communism in China and Russia was always about out-capitalizing the capitalists.
You keep moving goalposts. The US nuclear industry in its early decades was absolutely subsidized at the supply side, making reactors cheaper and creating the illusion that the technology was cost-competitive with fossil fuel generators, when it never was and almost certainly never will be.
Solar doesn't have that property. Solar is in most areas reasonably competitive already, and the subsidies work towards increasing the size of its market.
> illusion that the technology was cost-competitive with fossil fuel generators
The hope, then, was to quickly obtain an industrial breeder reactor in order to considerably reduce (or even eliminate) dependence on uranium providers (there is no known similar trick for oil/methane).
It failed, and it may be a major cause of the current nuclear demise.
It was also as a way to make the enormous expenditure on nuclear weapons and (even more so) nuclear delivery systems seem more acceptable, because it would have this putative valuable civilian spinoff. The space program had a similar motivation.
I'd note that the externalities may change that calculus a bit. The relative pollution from coal/gas/oil are huge but the cost isn't reflected in the price at the meter.
From first principles, ignoring safety (which was pretty lacklustre in the beginning) fission should be cheap though.
To be fair, people were fine with multiple Chernobyls' worth of damage every year from burning coal.
And other disasters like Three Mile Island were insignificant by comparison.
> multiple Chernobyls' worth of damage every year from burning coal
Do you have a source for that? I'd be very interested in a study that quantifies the damage of coal plants in terms of famous nuclear disasters.
Fly-ash contains radionuclides.
However, 99% of fly ash is captured in electrostatic precipitators and filters. Calculations that include the radionuclides in the captured ash are not being honest.
Yes now, but not back when fission plants were first brought online.
But we can ask: if this is the important issue justifying nuclear, which is more cost effective: adding pollution control to coal plants, or building nuclear plants?
If it's not the important issue justifying nuclear over coal, why bring it up?
Because my original comment was about the 80s?
Nuclear was the only viable green option back then. It failed because of mass psychosis and because it was more expensive than coal and gas.
Of course it’s too late now. But had nuclear plant construction continued to grow at the same pace as in the 60s and 70s, CO2 emissions from power generation would have been solved by the 2000s.
> adding pollution control to coal plants
Even if it was 100% effective that would do nothing to slow down climate change.
The "mass psychosis" theory is comforting to the zealot, but not in accordance with the facts. Nuclear failed for economic reasons. In the US, these reasons included: cost escalation (even before TMI), moderation of what had seemed inexorable growth in electricity demand (which increased risk), and the sudden introduction of new competition (non-utility generation unleashed by PURPA) that soaked up what demand growth there was.
Well yes, I did mention cost (of course only if we exclude most externalities), which was the primary long-term factor.
But the mass psychosis after Three Mile Island and then Chernobyl made nuclear expansion very expensive politically and financially (of course, I'm not saying the additional safety regulations that inflated costs were unnecessary). It would be rather silly to claim that it didn’t tip the scales at all.
I'm not sure it's fair to call it mass psychosis. But the connection to nuclear bombs was very clear to the public by then and the public backlash was also tainted by that association. Unfortunately, in the 70s/80s the problem of global warming was not obviously going to be a problem, if on the radar at all. Unfortunate timing.
No, nuclear was on the ropes even before that.
You're spinning a comforting story to allow you to evade unpleasant realities. I would have thought "everyone is crazy but me" would have set off red flags for you, but apparently not.
I’m sorry, but you are the one doing the spinning. You keep putting words into my mouth and then ignoring everything else I said.
I repeated that cost (of course only if externalities, that hardly anyone was fully aware of back in the 80s, are excluded) was the primary reason why nuclear failed. Do you really think that public opinion had no impact and did not accelerate it, though?
> "everyone is crazy but me"
Is that how you feel? Must be interesting.
I've always found xkcd's radiation chart interesting: https://xkcd.com/radiation/ where it talks about coal vs nuclear plants at the beginning. Maybe his sources would help. ISTR that lignite would often be a bit radioactive and burning so much of it would spread it through leftover emissions? If you find a proper source, I'm interested too.
I didn’t mean radiation specifically. Rather overall damage due to pollution compared to the casualties in Chernobyl and Fukushima.
e.g. NIH estimates 460k excess deaths (just in the US alone) between 1999 and 2020 were caused by emission from coal plants. And that’s the period when coal became relatively “clean”.
So it must have been much worse back in the 70s and 80s.
The highest estimate for Chernobyl is 60,000. But 16,000 is probably more realistic.
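The quoted numbers can be sanity-checked with back-of-envelope arithmetic (a sketch only: the death-toll figures are the estimates quoted above, and this compares US-only coal deaths against the total Chernobyl toll):

```python
# Back-of-envelope comparison of the figures quoted above.
# Assumptions: 460,000 excess US deaths from coal emissions over 1999-2020
# (22 years inclusive), versus ~16,000 (realistic) to 60,000 (high) total
# for Chernobyl.
coal_deaths_us = 460_000
years = 2020 - 1999 + 1               # 22 years, inclusive span
coal_per_year = coal_deaths_us / years

chernobyl_low, chernobyl_high = 16_000, 60_000

print(f"US coal deaths per year: {coal_per_year:.0f}")
print(f"Chernobyls per year (low estimate):  {coal_per_year / chernobyl_low:.2f}")
print(f"Chernobyls per year (high estimate): {coal_per_year / chernobyl_high:.2f}")
```

With the low estimate, the US alone works out to slightly more than one Chernobyl per year; worldwide coal use (and dirtier plants in the 70s/80s) would multiply that.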
> The highest estimate for Chernobyl is 60,000
Nope: https://en.wikipedia.org/wiki/Chernobyl:_Consequences_of_the...
Obviously I implied [mainstream/non-fringe] estimate.
Had that continued we’d be halfway to solving climate change by now, though.
Of course “environmentalists” and most others back in those days preferred cheaper coal and gas.
The irony is how consistently the environmentalists have been campaigning for things that left the environment worse by delaying any possible transition to nuclear power.
Renewables are already 30% of world energy generation and growing rapidly: a quick google says that 86% (!!!) of new capacity is renewable. This part of the climate puzzle is effectively already solved, and didn't need nuclear.
Transport fuel and combustion-based heavy industry are the hard nuts to crack, and reactors don't help there.
We are just late by several decades…
Yesterday, after wanting it for years, I finally found what looks like a good reference on what makes power plants so expensive. Most fusion-power proposals are based on using the fusion reactor to provide heat for a conventional thermal electrical power plant; https://www.eia.gov/analysis/studies/powerplants/capitalcost... gives a couple hundred pages of details about why those are expensive.
As I said in my comment at the time https://news.ycombinator.com/item?id=42850472, the cost of solar panels has dropped in recent years to the point where it's dramatically cheaper to generate electricity with solar than from a thermal power plant.
https://caseyhandmer.wordpress.com/2022/07/22/were-going-to-... is a proposal to generate solar (and wind?) power and use it to make synthetic 'natural gas' as the long term energy storage for when the wind is calm and the sunlight blocked. That can go into existing gas storage and pipelines, and be used to drive existing gas power stations.
We have long known how to make synthetic fuels. Germany, with limited access to oil in WWII, ran large parts of its war effort on synthetic fuels. South Africa during its embargo years (because of apartheid) wasn't able to get [much] oil and so ran its economy on synthetic fuels. Even today many people use synthetic oil made with the same technology in their cars.
The problem has always been cost. It typically costs 5x more than conventional oil. Still oil is a much denser store of energy than batteries and so for many applications it might be worth using it despite the much higher costs.
Right! Long-distance aviation seems like an obvious application, though companies like Joby might take over the short-distance market with battery-powered airplanes.
Presumably most of the cost of synfuel has been in the energy inputs, right? And solar will drive the cost of intermittent energy down by at least an order of magnitude and possibly two.
As Handmer's post says:
> (...) up until this point no-one has considered scaling these up as a fundamental source of hydrocarbons, because doing so would be cost prohibitive. Why? The machinery is not particularly complex, but the energy demands are astronomical.
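Those energy demands can be put in rough dollar terms. A minimal sketch, assuming a ~50% electricity-to-methane conversion efficiency and methane's ~50 MJ/kg lower heating value (both round-number assumptions, not measured figures):

```python
# Rough electricity-input cost for synthetic methane.
# All parameter values are illustrative assumptions.
LHV_METHANE_MJ_PER_KG = 50.0   # lower heating value of methane, roughly
PROCESS_EFFICIENCY = 0.5       # electricity -> fuel (electrolysis + methanation), optimistic
KWH_PER_MJ = 1 / 3.6

def energy_cost_per_kg(electricity_price_per_kwh: float) -> float:
    """Electricity cost to synthesize 1 kg of methane."""
    mj_in = LHV_METHANE_MJ_PER_KG / PROCESS_EFFICIENCY
    return mj_in * KWH_PER_MJ * electricity_price_per_kwh

for price in (0.10, 0.02, 0.01):   # retail-ish power vs. very cheap future solar
    print(f"${price:.2f}/kWh -> ${energy_cost_per_kg(price):.2f} per kg CH4")
```

At grid-retail prices the energy input alone is a few dollars per kilogram; at very cheap solar it drops by an order of magnitude, which is exactly the lever Handmer's post is betting on.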
The WW2 and SA synthetic fuels were not very green. They were made by processing coal.
Or methane, but that isn't really that important.
The article is all about the cost.
Yes, hydrocarbons have major advantages as a form of seasonal energy storage, and possibly you can burn them in cheaper, less efficient power plants. Round-trip efficiency is inevitably going to be pretty low with electrically generated fuels, but they still ought to be able to outcompete fossil fuels economically once production scales up.
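That low round-trip efficiency can be sketched by multiplying assumed stage efficiencies (all three numbers below are rough illustrative guesses, not measured values):

```python
# Round-trip efficiency: electricity -> synthetic methane -> electricity.
# Each stage efficiency is a rough assumption for illustration only.
electrolysis = 0.70       # electricity -> hydrogen
methanation  = 0.80       # hydrogen -> methane (Sabatier process)
ccgt         = 0.60       # methane -> electricity in a combined-cycle plant

round_trip = electrolysis * methanation * ccgt
print(f"Round-trip efficiency: {round_trip:.0%}")   # ~34%
```

Roughly a third of the input electricity comes back out, which is poor compared to batteries but can still pencil out for seasonal storage, where what matters is cost per stored kWh, not efficiency.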
Gas has the advantage that gas turbines are less inefficient than conventional steam, though ultra-supercritical steam (mentioned in the EIA paper) could conceivably take that away, and of course it's easier to ship around over short distances.
Hydrocarbons are great for energy density, but gas turbines are brilliant for power density. That's why we need them (and synfuels).
What we really need are low-capex thermal power plants with high efficiency so we can turn the gas or kerosene or whatever into electricity in the rare case that we need to, without losing money on having massive amounts of capital investment sitting around idle 95% of the time.
You've got to have the whole fuel production, storage and transportation system ready to go 100% of the time. Raises costs if you do it honestly, if you try to skimp it might not work when you need it.
It might be okay if production and transportation only work some of the time if you can store up enough fuel to ride out the outages. That's how it works today for fuel transportation; gas stations commonly go without deliveries for days or weeks at a time when roads are snowed out, for example, and oil tankers and LNG tankers make their deliveries even less frequently.
In particular, solar and wind won't produce any fuel at night when no wind is blowing, and may not produce enough fuel during the rainy season or winter.
Right now it is solar and wind electricity that is cheap. E-fuel [1] and other forms of chemical storage [2] are a thing but might not be so cheap. Seasonal storage might use methane and avoid the legendary boondoggle that is Fischer–Tropsch [3] but methane is itself a powerful greenhouse gas. A really good methane handling system loses much less but if you lose 1% of it you might as well be burning fossil fuels. Even hydrogen isn't completely benign [4] as it depletes those "negative ions" that were a fad in the 1970s and as a result prolongs the life of other greenhouse gases.
[1] https://en.wikipedia.org/wiki/Electrofuel
[2] https://blog.sintef.com/energy/thermochemical-energy-storage...
[3] https://en.wikipedia.org/wiki/Fischer%E2%80%93Tropsch_proces...
[4] https://www.nature.com/articles/s43247-023-00857-8
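The leakage point above can be made concrete with round numbers (the GWP value and stoichiometry are rough assumptions; combustion CO2 of the synthetic methane is treated as net zero since its carbon came from the air):

```python
# Climate impact of leaked synthetic methane vs. burning fossil methane.
# Assumptions: synthetic methane's combustion CO2 is net zero; GWP is a
# round-number 20-year global warming potential for methane.
CO2_PER_KG_CH4_BURNED = 2.75   # kg CO2 per kg CH4 combusted (stoichiometry)
GWP20_CH4 = 80                 # rough 20-year GWP of methane

def leak_impact(leak_fraction: float) -> float:
    """kg CO2-equivalent per kg of CH4 delivered, from leakage alone."""
    return leak_fraction * GWP20_CH4

for leak in (0.01, 0.03):
    co2e = leak_impact(leak)
    print(f"{leak:.0%} leakage -> {co2e:.2f} kg CO2e, "
          f"{co2e / CO2_PER_KG_CH4_BURNED:.0%} of fossil-methane combustion CO2")
```

On a 20-year horizon, a 1% leak already gives back roughly a third of the climate benefit, and a few percent puts you near fossil parity, which is the sense in which sloppy methane handling means "you might as well be burning fossil fuels."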
TCES is extremely cheap (megajoules per dollar of rechargeable capacity, compared to tens of kilojoules per dollar for batteries) and will likely be critical to the energy transition, but it won't run your servers overnight or your airplanes over the Atlantic. Things like cycle life are still an issue for TCES. And you need electric power to run your TCES system, so efficiency is still a concern.
I feel like oxidizing methane to methanol shouldn't be rocket science, especially if you have an unlimited amount of energy to use. NADPH does it at scale already. A few more cheap process steps would give you nontoxic ethanol instead.
Solar and wind are susceptible to hyperhurricanes though which make them useless near coasts.
For those who are wondering:
Hypercane (from Wikipedia, the free encyclopedia; redirected from Hyperhurricane):
A hypercane is a hypothetical class of extreme tropical cyclone that could form if sea surface temperatures reached approximately 50 °C (122 °F), which is 12 °C (22 °F) warmer than the warmest ocean temperature ever recorded.[1] Such an increase could be caused by a large asteroid or comet impact, a large supervolcanic eruption, a large submarine flood basalt, or "incredible" global warming.[2] There is some speculation that a series of hypercanes resulting from the impact of a large asteroid or comet contributed to the demise of the non-avian dinosaurs.[3] The hypothesis was created by Kerry Emanuel of MIT, who also coined the term.[4][5][3]
And so is everything else. Solar and wind are generally much more geographically distributed so are less vulnerable.
Probably if there are hyperhurricanes what you need is not a terrestrial power plant but an O'Neill cylinder.
Absolutely.
Even if fusion reactors can be made reliable, they'll still require a huge amount of investment with a payoff decades into the future. Meanwhile, you buy some cheap solar panels and can get returns on your money almost immediately.
Yes. Laypeople often forget about this fact. A big reason that we're not building fission plants is because you won't get anything back for a decade or more. And by that time, will it still even be competitive?
> Make it cheaper than the alternatives.
I wish SMRs got more attention and focus. The UK has enough wind resources to power all four nations and it is on course to double its current 30% of electricity from wind to 60% by ~2030. I don't see why it can't be 90%+ by 2040 barring policy, politics and human factors.
All we need then are SMRs for base load, while we work on energy storage, which could be batteries or dams.
SMRs have been complete vaporware for the past 70 years.
https://spectrum.ieee.org/the-forgotten-history-of-small-nuc...
Or see this recent summary of how all modern SMRs tend to show promising PowerPoints and then get canceled when reality hits.
https://www.youtube.com/watch?v=XECq9uFsy6o
Simply look to:
- mPower: https://en.wikipedia.org/wiki/B%26W_mPower
- NuScale: https://oregoncapitalchronicle.com/2024/10/29/the-rise-and-f...
And the rest of the bunch adding costs for every passing year and then disappearing when the subsidies run out.
Not sure why you would want to fill storage with extremely expensive nuclear power when we might as well just use cheap renewables.
GE Hitachi’s small modular reactor (SMR) provides 24/7 on-demand, carbon-free power.
https://www.gevernova.com/nuclear/carbon-free-power/bwrx-300...
Which is wholly irrelevant if the costs are so high that no one will buy the power.
Which is especially true considering how renewables and storage are plummeting in cost while seeing incredible growth.
I'm not an expert, but I actually feel like it's not terrestrial power generation where fusion would be useful. We have plenty of ways of generating power here, including a massive fusion reactor up in the sky which will continue operating for the foreseeable future :)
Instead, I think we need to find some way of generating a lot of power over extended periods of time if we want to crack space travel beyond lobbing items into Earth orbit. Particularly for human space travel.
We need significantly more energy density than what chemical motors can provide, and nuclear is the easiest and most "within-reach" that we know of. Things like anti-matter and even more exotic forms of propulsion are pure sci-fi for the time being (in terms of practicality). I suppose fission could work in a very primitive way of basically detonating small nukes behind a pusher plate, but fusion feels so much more elegant (at least to a layman). Mostly in the sense of not having to use effectively the same nukes we stick in weapons. Something tells me that this will never fly (excuse the pun), for obvious reasons.
> including a massive fusion reactor up in the sky which will continue operating for the foreseeable future :)
I assume this refers to the sun. Other responses don't seem to have made the same conclusion.
You are implying ion drives, I assume. Ion drives work by accelerating particles to really high speeds (tens of kilometres per second) and using that to generate thrust. Meaning they require very little propellant but require A LOT of power for the amount of thrust generated.
A fusion power generator coupled with ion drives could potentially generate a lot of thrust for little mass since the fusion fuel also has very little mass.
https://en.wikipedia.org/wiki/Ion_thruster
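For a sense of how power-hungry electric propulsion is, the ideal thrust-power relation is F = 2ηP/v_e. A sketch with assumed parameter values (power level, specific impulse, and efficiency are illustrative, not from any real thruster):

```python
# Ideal thrust from electric propulsion: F = 2 * eta * P / v_e,
# where P is electrical power, eta is efficiency, v_e is exhaust velocity.
# All parameter values below are illustrative assumptions.
G0 = 9.81  # standard gravity, m/s^2

def thrust_newtons(power_w: float, isp_s: float, efficiency: float) -> float:
    v_e = isp_s * G0                      # exhaust velocity from specific impulse
    return 2 * efficiency * power_w / v_e

# A 100 kW thruster at Isp = 3000 s and 70% efficiency:
f = thrust_newtons(100e3, 3000, 0.7)
print(f"Thrust: {f:.1f} N")               # only a few newtons for 100 kW
```

A hundred kilowatts of electrical power buys you a few newtons of thrust, which is why pairing a lightweight, high-power source with ion drives is so attractive.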
He might be talking about a "direct fusion drive," which is not an ion drive. It's basically a magnetic-confinement device but you leave part open in such a way that fusion products exit to produce thrust.
For example, here's a fission-fusion one that looks like it only needs engineering to be workable:
https://www.nasa.gov/general/pulsed-fission-fusion-puff-prop...
In fact I meant it in a general sense, and both ion drive and a more classic confinement engine like design that expels the fusion products. I don't know enough about the intricacies to argue for one vs the other, but fusion fuel is the most energy dense fuel we have within reach, by a huge margin. The variable then becomes the mass of the engine (and supporting plumbing), but we have control over this.
Can we shrink a fusion reactor small enough to make the rocket as a whole feasible? I don't know. My hope is that for space travel, we can make certain tradeoffs that are not feasible on Earth - e.g. expelling radioactive waste products into the environment. A rocket engine doesn't need to be all that efficient at generating electricity either, it mostly needs to be efficient for propulsion, which might make its design much simpler.
Even if you had a fully functional reactor in space you now have a massive thermodynamics problem on your hands. How do you get rid of all that heat?
It's difficult to see how DT fusion could be better in space than alternatives. DT fusion reactors will be extremely large and complex. They will be much heavier than and much less reliable than alternatives. That's not good for space applications.
The high energy density of the fuel is irrelevant if the reactor is very massive and if only a tiny fraction of the reactor's mass in fuel can be fused over the lifespan of the reactor. IIRC, it would take something like 700,000 years for ITER to fuse its own mass in fuel, if it could be operated continuously forever (which it can't, of course.)
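That figure is easy to sanity-check. A sketch with assumed round numbers (ITER machine mass ~23,000 tonnes and 500 MW fusion power are the assumptions here) lands in the same ballpark, at a few hundred thousand years:

```python
# Back-of-envelope: how long would ITER take to fuse its own mass in fuel?
# Assumptions: machine mass ~23,000 tonnes, 500 MW fusion power,
# D-T releases 17.6 MeV per reaction from ~5 amu of fuel.
EV = 1.602e-19            # joules per electronvolt
AMU = 1.661e-27           # kilograms per atomic mass unit

energy_per_kg = 17.6e6 * EV / (5 * AMU)     # ~3.4e14 J per kg of D-T fuel
iter_mass_kg = 23_000e3
fusion_power_w = 500e6

burn_rate = fusion_power_w / energy_per_kg  # kg of fuel fused per second
years = iter_mass_kg / burn_rate / 3.156e7  # seconds -> years
print(f"{years:,.0f} years")                # hundreds of thousands of years
```

The exact number depends on which mass you count, but any reasonable choice gives the same conclusion: the fuel's fabulous energy density is swamped by the reactor's mass.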
You are correct, but I wonder if space travel applications allow us to take certain liberties with the reactor design that are not feasible down on Earth. E.g. a simple(r) confinement design that expels the fusion products could work well in space, but is a non starter on Earth for obvious reasons.
I'm not sure how much of the design on Earth is bulky and cumbersome because it needs to be vastly safer and more complicated than what a rocket engine needs to be. Taken to the extreme, nuclear propulsion in space can be very dumb: just detonate nukes some distance behind the rocket and ride the shockwave with a pusher plate. It's a radically different design than anything we'd attempt for terrestrial applications.
Near-earth space has an abundance of solar energy convertible to electrical energy. Deep space is an interesting potential use case for fusion, but fission-powered electric space propulsion via RTGs has the virtue of having demonstrated nearly half a century of continuous operation
(and after it leaves the earth's atmosphere, radioactivity ceases to be a concern)
As far as I know there has never been an RTG powered electric rocket engine. RTGs are extremely expensive per unit of power output. Spacecraft like that ion engine propelled one that was sent to Ceres use solar power.
Voyager and Pioneer used RTGs, as well as some more recent longer range missions and the planned Dragonfly to Titan. Agree that solar has much better unit economics, but I don't see any reason why the same wouldn't be true of fusion too.
And none of those use RTGs to power electric rocket engines.
(EDIT: thrusters are rocket engines, just very small ones. All those missions used small chemical rockets for maneuvering, using storable propellants like hydrazine.)
Edit: I stand corrected on Voyager/Pioneer exclusively using hydrazine, though you could get similar thrust from a modern HET without needing insane amounts of RTG power, and run other electrical equipment off that extra power when you've reached target velocity and aren't firing the thrusters for attitude control. The higher Isp of the EP system is an advantage too...
I don't want to undermine your enthusiasm. But unfortunately, even fusion-based energy generation looks like sci-fi. There is a reason why certain things happen in objects of a certain mass. You can't cheat physics.
The neutron corrosion alone is a very serious problem that I doubt can be fixed. There is only one material that can withstand it: it's called neutronium, and we all know why we can't even use it on Earth, never mind produce it. The Li-7 trick they are currently trying to use is not convincing me. What's the point of it if you need to stop the reactor every few years to replace the entire Li-7 shielding? Lithium itself isn't cheap and we will basically destroy it in such reactors (non-recyclable).
The hope lies in fission reactors, breeder ones. Unfortunately, there is a big taboo here because they can be used to produce weapons-usable material, and so, for nuclear weapons.
The materials question is interesting to me because my Masters is mostly related to Material Science. My thesis advisor did some research in this area--mostly with Zr3Al as I recall. My thesis was with a related material (NiAl) for potential uses related to turbines. The aluminide alloys never really worked out for a variety of reasons. My advisor ended up pretty much doing water ice-related research full-time (which he had just started doing when I was a grad student) and that's pretty much what he spent his career doing. The US Army has (had?) a lab in Hanover, NH and, at the time, they were very interested in better understanding the properties of ice, in part, presumably because of the possibility of a land war with the Soviet Union in Alaska.
> The neutron corrosion alone is very serious problem that I doubt can be fixed.
It can be fixed, because some fuels don't produce neutrons. But aneutronic fusion is like two orders of magnitude more difficult than Deuterium-Tritium. Still, it might be necessary for any reasonable application. One of my professors was a big proponent of aneutronic fusion even for Tokamaks.
> They listed 5 challenges but they forgot probably the biggest one-- Make it cheaper than the alternatives.
Does it really need to be cheaper if it's more promising?
promising what?
It needs to promise to eventually be cheaper, or otherwise better in some niche.
I can imagine it being useful for spaceships, for example.
But I can't see it ever being cheaper than eg solar, without some kind of breakthrough, like cold-fusion.
The significant issue is that DT fusion reactors are very large compared to fission reactors. If we look at gross nuclear power output per unit volume, comparing a fusion reactor to a PWR primary reactor vessel, the comparison is stark. ITER, for example, has a power density 400x worse than a PWR; the 2014 ARC design is 40x worse.
This large size has devastating implications. The nuclear island of a fission power plant is maybe 12% of the cost of the plant. Increase that cost by an order of magnitude and you've doubled the cost of the plant. Commercial nuclear power is already not competitive; double the capex and it becomes ludicrously uncompetitive.
If someone presents you a fusion proposal, the first thing you should ask is "what's the volumetric power density of your reactor?"
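That question can be answered with round numbers. A sketch with assumed powers and volumes (illustrative values, not official figures) reproduces the order of magnitude of the gap quoted above:

```python
# Rough volumetric power density comparison, PWR vs. an ITER-class tokamak.
# All powers and volumes below are round-number assumptions.
def power_density(power_mw: float, volume_m3: float) -> float:
    """Gross nuclear power per unit reactor volume, MW/m^3."""
    return power_mw / volume_m3

pwr = power_density(3400, 400)          # ~3400 MWth in a ~400 m^3 primary vessel
iter_like = power_density(500, 16_000)  # ~500 MW fusion in a ~16,000 m^3 machine

print(f"PWR:  {pwr:.1f} MW/m^3")
print(f"ITER: {iter_like:.3f} MW/m^3")
print(f"Ratio: {pwr / iter_like:.0f}x")  # a few hundred times worse
```

With these assumed volumes the gap comes out at a few hundred times, consistent in order of magnitude with the 400x figure above; the conclusion is insensitive to the exact volumes chosen.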
If you look at the arguments purporting to show DT fusion could have a chance they do things like assume they can build everything for 2x the cost of materials. Yeah, good luck with that, especially when that cost estimation technique would give wildly low results if you applied it to fission power plants, which are much less complex.
As I see it, there's only a couple of ways fusion might make it. The first is an approach that makes minimizing reactor size the primary goal. This would involve very small plasma configurations and one where all plasma-facing surfaces are covered with thick layers of flowing liquid lithium so areal power density can be very high, at least an order of magnitude higher than most concepts. Among DT efforts, only Zap looks like it might have a chance here (but there are parts on one end of the reactor that are still exposed to neutrons, unshielded.) The experience with liquid sodium in fission reactors should give one pause about making this work, though.
The other approach would be to avoid the non-nuclear parts that fission power plants have by exploiting direct conversion of plasma energy to work, so the apples-to-apples comparison of heat sources does not apply. Helion is the front runner here, using more advanced fuels that put most of their energy into the plasma, not into neutrons. I consider Helion the least dubious of all current fusion efforts.
Also relevant: The Trouble With Fusion [1]. It's from 1983, but could well have been written today, as nothing has really changed in the grand scheme of things. In short, there are multiple technical and economical challenges that stand in the way of nuclear fusion becoming a viable path towards clean energy, even after the first successful demonstration of a self-sustained burning plasma. Which, by the way, still seems about 10 years out, with the latest round of delays on ITER. I think most of the people involved understand this, but there is ongoing political will to sink billions of dollars into these projects, so here we are.
1. https://orcutt.net/weblog/wp-content/uploads/2015/08/The-Tro...
Lidsky is long dead (as are Pfirsch and Schmitter, who had similar critiques in Germany), but I like to think he'd have been a supporter of Helion. Helion's approach is close to what Lidsky advocated (aneutronic fusion), at least before Lidsky's student Todd Rider showed truly aneutronic fusion was not likely to work. However, the Helion approach has greatly reduced neutronicity, particularly of energetic DT neutrons, and exploits the kind of non-Maxwellian plasmas w. energy recovery/recirculation that Rider was analyzing.
All well and good, but Commonwealth Fusion and Tokamak Energy are building large test reactors as we speak in hopes of answering these exact questions. These aren't science projects, but tech demonstrators designed as stepping stones to commercial fusion. The points raised in the article are valid, but can only be tested by building reactors, which is currently happening at the above companies.
Agreed, but the problem isn't about funding fusion research - which isn't a bad thing IMO - more that the UK government seem to believe that it's "coming soon". The concern is that they're basing future energy policy on a technology that may not be a practical power source for generations (or ever).
See also their recent commitment to using AI. Funding AI research and supporting AI startups: sure. Basing policy on the fact that AI can replace civil servants and other workers: there should probably be some evidence of that first.
Worth noting that the author of this letter works at Culham Centre for Fusion Energy, where JET is located. 'Abingdon' was a clue but they could have filled in the detail.
There’s good old #4 again: fusion processes inevitably create extreme neutron fluxes capable of turning everything they touch into radioactive waste. Meaning the reactor vessel and all involved fluids and support structures for handling thereof.
Congrats! All the hassle of fission with more steps. Proceed to scream at me!
Yeah, that's something I've been wondering about for a while. Sure you can get Fusion to create energy eventually, but the engineering to keep the reactor running long enough to break even on cash sounds even more difficult. Anything involving hot plasma and large amounts of ionizing radiation is going to play hell with your equipment. That's not even considering the secondary nuclear waste created by the neutron, proton, and alpha radiation.
It just seems like a great way to turn a billion dollars into a pile of extremely hot, radioactive, radiation embrittled metal.
Radioactive waste isn't an intractable problem. It is simply a problem we refuse to deal with.
There is also fusion with low neutron flux [1] (among others: Helium-3) but it is even more difficult to achieve. I'm not confident that any of us will ever see a commercial fusion reactor. I'm afraid that all we can do to harness fusion energy directly is by means of solar panels.
[1] https://en.wikipedia.org/wiki/Aneutronic_fusion
There are a few things I don't get of the fifth point: why do we need to be able to operate nuclear fusion plants remotely? Wouldn't it be not just possible, but maybe even preferable, to have staff on site to manage the plant, like it is the case for fission plants? And if we still want remote control, why is this such a big challenge? With a background in computer engineering and cybersecurity, this seems to me the smallest challenge of the list.
I think they mean that we shouldn't have to send a guy inside the reactor to clean it and set it back up every time you fire it. I don't really know why they listed this point.
Seeing it from this perspective, maybe I now see it: the residual radioactivity inside the reactor may be strong enough to mess with the electronics of a robot you're sending inside. Maybe there are challenges in getting enough shielding while keeping the robot reasonably sized and maneuverable.
A major challenge is getting the robot into the place you need it, along with other objects it needs like replacement parts, in between the other infrastructure like coils and cooling blankets. You mustn't damage the delicate reactor structures, you can't have rails and so on permanently in the reactor vessel, the vessel has to be vacuum tight, and most of all you must not allow the robot to fail and become unextractable, because humans can't go in and pull it out.
ITER seems to have a project called the Agile Robot Transporter which you can see some renders of that demonstrate how fiddly this problem is.
This presentation has some useful illustrations about the ART:
https://indico.iter.org/event/45/contributions/1090/attachme...
Maybe because of security, and the entire thing needs to be airgapped.
Remote handling might refer to robotics or remote control done on site. Fusion does produce neutrons that will generate some radioactivity in structures and so on. Minimising the need for humans to be close to these is a reasonable move.
It's more to the point that disruptions tend to be very disruptive, too frequent, not well understood, and the reset takes a really long time. But you're right that this does not quite belong. If we solve the technological problems, the maintenance should resolve itself.
> like it is the case for fission plants
Fission plants are operated remotely. IE, that's what the stereotypical "control room" of a fission plant is.
is it remote from the reaction chamber? i.e. if people have to go in to wipe/re-coat stuff, it takes longer to do it safely, but it's cheaper/faster if you can get a machine to do it with the people controlling it from a distance. still, doesn't seem terribly hard, when surgery has been done with remote controlled robots.
Then there might be a “DeepSeek” moment.
Yes, but just throwing in the possibility of a future unexpected breakthrough, without grounding it in the situation so far, can lead to wishful thinking. The fact that the list has 5 independent points means that you need 5 independent breakthroughs, making it even less likely to happen in the short term.
For a DeepSeek moment, the West would first have to produce a functional but wildly expensive fusion reactor. Then someone else could ask the reactor questions over and over until they understand well enough how the reactor works.
The DeepSeek moment for nuclear power might be accelerator-driven subcritical reactors that make fission plants much simpler and safer (and therefore cheaper, the key element of DeepSeek moment) by never having a critical mass of fuel in the reactor - criticality is only possible when a particle accelerator is illuminating the core, and the accelerator can just be turned off.
Helion are saying they may have commercial power shortly, although no one really believes them.
I find all this discussion... well... amusing.
There may be reasons we don't have fusion energy that have nothing to do with the physics or engineering.
What if I told you "fusion is not (perpetually) 20 years in the future... it's 60 years in the past and we missed it." ?
There's a story here, but it requires a bit of suspension of orthodox belief:
https://waterstarproject.com/read-the-waterstar-manifesto/
https://waterstarfoundation.org
-- Paul Schatzkin (actual name!)
Meanwhile, in Massachusetts...
https://news.mit.edu/2024/commonwealth-fusion-systems-unveil...
What happens if a fusion reactor meets a bomb?
All money being spent on fusion research would be much better spent building the most modern and safe fission reactor designs.
there's plenty of money for both. fusion is too important to eschew
not if climate change is as bad as it could be. Fission reactors can actually reduce CO2 emissions. Fusion reactors not so much.
I agree with all this. People make two big mistakes when evaluating fusion:
1. The fuel is somewhere between free and cheap, depending on the fuel; and
2. Stars can do it so surely we can.
So let's consider a mythical plant that costs $30B to build, has a lifetime of 30 years, costs $3B/year in maintenance, and produces 1000MW of power. And the fuel is free. Is the electricity free? Not even close. Running flat out, that's ~8.8TWh per year. No plant runs at full capacity 24x7, so let's call it 4TWh. That works out to about $1/kWh. That's ludicrously uneconomical. Of course you can throw in vastly different numbers and get vastly different results, but the main point is that fuel cost is a non-issue.
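The arithmetic above can be sketched in a few lines. All dollar and power figures are the hypothetical numbers from the comment, and the ~46% capacity factor is my own assumption chosen to land near the "call it 4 TWh" figure:

```python
# Back-of-envelope cost per kWh for the hypothetical plant above.
# These are illustrative assumptions from the comment, not real data.
capex = 30e9            # construction cost, $
lifetime_years = 30
opex_per_year = 3e9     # maintenance, $/year
power_mw = 1000

# Amortize construction over the plant lifetime, add yearly maintenance.
annual_cost = capex / lifetime_years + opex_per_year   # $4B/year

full_output_twh = power_mw * 8760 / 1e6   # ~8.76 TWh/year at 100% uptime
capacity_factor = 0.46                    # assumed, to match "call it 4 TWh"
actual_output_kwh = full_output_twh * capacity_factor * 1e9

cost_per_kwh = annual_cost / actual_output_kwh
print(f"${cost_per_kwh:.2f}/kWh")   # roughly $1/kWh, with the fuel free
```

Even with zero fuel cost, the capital and maintenance costs alone dominate the price of the electricity.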
And as the article points out, in a D-T reactor, you still need to produce tritium because it's basically not naturally occurring in a way that can be gathered. That means a fission breeder reactor. That fuel isn't free.
Second, "the Sun does it" misses the point. As the article correctly points out, damage from neutrons ("neutron embrittlement") is a huge problem. Beyond that, every high-speed electron that escapes the plasma represents energy lost from the system.
Stars use gravity for containment.
tl;dr: A British politician suggested we are "within grasping distance" of using fusion for wide-scale power generation; the letter published in the Guardian explains how significant challenges remain both with the core process and surrounding considerations. (i.e. letter explains why we are "30 years away" at least, not 3 years away)
Nuclear fusion has been 3 decades away since as far as I can remember (I believe I first read about it in National Geographic in the late 70s or early 80s)
then in Deepseek like fashion China comes along and does it for 1/10th the cost and time.
In the 20 years the UK has spent arguing over a third runway at its biggest airport, China has built over 100 new airports.
China also has an order of magnitude more land available, and a fascist government. Arguing over squeezing a new runway into a built-up area is a good thing.
I seem to remember that in socialist Eastern Europe it was only always 15 years away ;)
You will note I said "at least" rather than "at most" or "about" ;-) ... we're all familiar with the phrase I believe
Yeah I agree
The hard part here is twofold: several orders of magnitude of efficiency improvement still need to be overcome for it to work, and the learning curve in fusion is not a curve but more like a staircase with very tall steps.
ITER is gearing up to be a great disappointment
I really don't feel like magnetic confinement is the way to go, unless we get some new superconducting technology that can create electromagnets with 100x their current strength.
We actually do have commercially available superconductors that support much stronger magnetic fields than what ITER is able to do.
I don't think it's 100x, but it doesn't have to be. Tokamak output scales with the fourth power of magnetic field strength. Double the field, 16X the energy output. CFS and Tokamak Energy are building tokamaks with these.
Tokamak output also scales with the square of reactor size, which is why ITER is so big. These new reactors can be much smaller for the same output, so they'll be much faster and cheaper to build. The CFS design is about the size of JET, which was built in four years, with three of those being just for the building.
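The B^4 scaling claim above can be illustrated in a couple of lines. The ~5.3 T and ~12 T on-axis field values below are approximate figures for ITER and the newer high-temperature-superconductor designs, used only for illustration:

```python
# Illustration of the claim that tokamak fusion output scales roughly
# with the fourth power of the magnetic field strength.
def relative_output(b_new: float, b_old: float) -> float:
    """Relative fusion output when the field goes from b_old to b_new (tesla)."""
    return (b_new / b_old) ** 4

print(relative_output(2, 1))     # doubling the field -> 16.0 (16x output)
print(relative_output(12, 5.3))  # ~12 T HTS design vs ITER's ~5.3 T: ~26x
```

This is why a modest-looking improvement in magnet technology translates into dramatically smaller reactors for the same output.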
Why doesn't ITER use the better superconductors?
Because they completed the initial design in 2001, and broke ground in 2007. The superconductors became available in the early 2010s.
They should have redesigned it. Spending so much money on a fundamentally inferior design is pretty stupid.
That's a good way to never actually complete a big project.
It is already incredibly far behind schedule.
It can't be 100x. In designs like ARC the magnets are up against limits not from the superconductors, but from the structures needed to resist JxB forces. The majority of the mass of the ARC reactor is in these steel structures. Much progress beyond that would involve significant increases in strength of materials, and even then the strength needed scales as B^2.
100x ARC would be ~2000 T. The magnetic pressure of such a field would be about 100x the detonation pressure of TNT. Experimentally, fields of such magnitude could only be produced very briefly in devices that near-instantly explode (for example, devices driven by intense laser pulses that produce large charge separations.)
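As a rough sanity check of that comparison, magnetic pressure is B^2 / (2*mu_0); the ~19 GPa TNT detonation pressure below is an assumed, commonly cited ballpark value:

```python
import math

# Magnetic pressure P = B^2 / (2 * mu_0), in pascals.
MU_0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A

def magnetic_pressure(b_tesla: float) -> float:
    return b_tesla ** 2 / (2 * MU_0)

p_2000t = magnetic_pressure(2000)   # ~1.6e12 Pa for a 2000 T field
tnt_detonation_pa = 19e9            # assumed: ~19 GPa detonation pressure
print(p_2000t / tnt_detonation_pa)  # on the order of 100x TNT
```

Note also that pressure scaling as B^2 is the same reason the structural steel, not the superconductor, becomes the limit as fields grow.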
> Tokamak output also scales with the square of reactor size
ITER is big because it wouldn't work at all if it were smaller (at the magnetic fields involved). Output scaling as (linear dimensions)^2 means you want your reactor to be as small as possible, to maximize power/volume. Square-cube law in operation. The higher magnetic field in HTSC tokamaks allows then to be smaller. Volumetric power density is still poor, though, just not as terrible as ITER.
> unless we get some new semiconductor technology
Did you mean superconductor technology?
Yes, fixed
We can already see that tokamaks will be very expensive to build and operate.
politician == lying idiot (esp Ed Miliband)
So, some scientists lied to a politician? Considering how much politicians lie, I think what they did is normal and almost OK.
Ed Miliband, the liar quoted in the letter, is a politician and not a scientist. The author of the letter, a scientist, is telling the truth.
I don't see any scientists lying mentioned in the article, where did you get that takeaway? We see policymakers ignore competent advice on all fronts all the time, I wouldn't be surprised if that's the case here too.
I probably read it wrong.