(comments)

Original link: https://news.ycombinator.com/item?id=39647331

At present, using lithium for fusion is only a possibility for certain experimental projects and is nowhere near commercially viable. Fusing tritium and deuterium produces helium ions and neutrons among other products, but separating the desired product (helium ions) from the rest of the process stream remains a difficult technical challenge that has not been fully solved. While these challenges are not necessarily insurmountable, they represent major obstacles for fusion as an energy source, particularly in the near term. The problem is further complicated by the scarcity and high cost of certain materials (such as tritium, lithium, and beryllium), despite ongoing attempts to address this through recycling, reuse, and other techniques. Ultimately, pushing the frontier of fusion energy forward is a complex, multifaceted effort that requires tackling numerous technical hurdles at once.


Original article
Tests show high-temperature superconducting magnets are ready for fusion (news.mit.edu)
226 points by paulsutter 1 day ago | 100 comments

Is there a good reason we aren't seeing REBCO magnets in all new MRI machines? Seems like not needing liquid helium would be a pretty big win for hospitals?


They probably still can't carry the necessary current without helium cooling. There are some NMR spectrometers (which work essentially by the same principles as MRI machines) with high-temperature superconductors, but those are still cooled by helium. The lower the temperature, the more current you can pack into them.

Could also be that it doesn't make economic sense yet, they're still pretty new compared to classical superconductors.



MRIs don't consume much helium. Keeping them topped off is a trivial expense compared to the rest of the costs of the machine and operation. If there were a new design that didn't require helium, it would also have to cost less than current machines to make sense.


There are already non-venting systems that use a reasonable amount of helium (7L) to manufacture and don't require refilling. It just was not a design consideration previously. https://www.philips.com.au/healthcare/resources/landing/the-...


While 7 liters is less than 1,500, why would they call this a helium-free system?


They say helium-free operation because you never need to deal with the helium (e.g. no refilling).


Yes: Commonwealth Fusion essentially is buying the world's supply of it.

MRIs are a more mature market, i.e. > 1 units, and need to justify component switches not just for feasibility, but for profitability. Swapping out a known-good for something in such short supply isn't feasible, it's more expensive than liquid helium.

Source: https://spectrum.ieee.org/fusion-2662267312 "Over two years, the team managed to buy up most of the world’s supply of 4-millimeter-wide HTS tape"



Commonwealth Fusion is such a Fallout name


Todd is in shambles that he can't use the name.


I would say it's because the technology has not matured enough.

I would also say the reason CFS has been getting funding is not because of fusion (although, hey, why not take that longshot bet), but because of the value of being able to make these magnets for non-fusion applications.



> Is there a good reason we aren't seeing REBCO magnets in all new MRI machines?

REBCO tapes are still very brittle, and MRI machines are notorious for the amount of banging they produce when active.



???

the gradient coils make the noise, and due to field homogeneity requirements, it is safe to assume the superconducting coil is not moved by the gradient coils



homogeneity is only optimized for a small volume within the bore. Regardless, the superconducting coils will still experience a change in Lorentz forces as sequences are run, and noise from other components will still be "heard" by the main magnet


It's still very much affected by the vibrations, and the electromagnetic forces from the gradient coils.


Wait, isn't the byproduct of fusion helium anyway?


Yeah but you'd power a city for a year for 100 kg of helium.


Fusion produces much, much less helium than that. D-T fusion that produces 100kg of He-4 would produce something like 25PJ of net primary energy. If you managed to turn 20% of that into electricity, you'd have 5PJ. This is within an order of magnitude of the yearly electricity consumption of the entire world.


I do actually have receipts. For example numbers I picked Washington DC, which has 700k people that each use an average of 600 kWh per month. That's 5 TWh / year, which I rounded up to 10 TWh. This yields 475 m^3 of helium, which is 85 kg. I rounded up to 100. DC is a modest city and all of humanity uses 15000 TWh / year (of heat). This "100 kg / city / year" estimate is even on the lower end since I started with electric consumption and never put in the factor of 3 to convert to thermal.

Your mass-energy conversion number is within error of mine, but your estimate of how much power we use is much lower.

https://www.wolframalpha.com/input?i=%2810+TWh%29+%2F+%2817....
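
A quick way to rerun that arithmetic (a sketch in Python, using the same inputs: 17.6 MeV per D-T reaction, 10 TWh per year):

    # Sanity check of the "100 kg of helium per city per year" figure above.
    AVOGADRO = 6.022e23                 # atoms per mole
    MEV_TO_J = 1.602e-13                # joules per MeV
    e_per_reaction = 17.6 * MEV_TO_J    # D-T fusion releases 17.6 MeV per He-4 atom produced

    city_energy_j = 10e12 * 3600        # 10 TWh/year expressed in joules (1 Wh = 3600 J)
    reactions = city_energy_j / e_per_reaction
    moles_he = reactions / AVOGADRO
    mass_he_kg = moles_he * 4e-3        # He-4 is ~4 g/mol
    volume_m3 = moles_he * 22.4e-3      # ideal-gas molar volume, ~22.4 L/mol at STP

    print(f"{mass_he_kg:.0f} kg of He-4, about {volume_m3:.0f} m^3 at STP")
    # prints roughly "85 kg of He-4, about 475 m^3 at STP"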



5PJ is not that large, only 1.4 TWh.


In contrast, world primary energy consumption is at a rate of around 20 TW, or five orders of magnitude more than that per year.


A single MRI needs around 1000 liters, and 1 L is roughly 130 g. For those who wonder, prices vary between $15 and $35 per liter. Recent MRIs have recycling systems that reduce refill needs by 90% when they work properly.


So about $15k for 100 kg?
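
Checking against the figures quoted above (1 L ≈ 130 g, $15-35 per liter):

    # Rough cost of 100 kg of liquid helium at the prices quoted above
    kg_per_liter = 0.130                      # ~130 g per liter of liquid helium
    liters = 100 / kg_per_liter               # ~770 L
    low, high = liters * 15, liters * 35      # $15-35 per liter
    print(f"{liters:.0f} L -> ${low:,.0f} to ${high:,.0f}")
    # prints roughly "769 L -> $11,538 to $26,923", so ~$15k is in the right ballpark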


> Like virtually all electrical wires, conventional superconducting magnets are fully protected by insulating material to prevent short-circuits between the wires. But in the new magnet, the tape was left completely bare; the engineers relied on REBCO’s much greater conductivity to keep the current flowing through the material.

Much greater conductivity than what? I assume there are other non-REBCO layers in the tape, but the article neglects to mention them.



Than stainless steel!

If you have two resistors in parallel, the current divides between them in inverse proportion to their resistances. But what if one of the resistances is zero? Then all the current goes through that resistor. So if you have a superconductor in parallel with another resistor, all the current goes through the superconductor! Doesn't matter if the other resistance is very, very low, but still nonzero.

Reportedly some undergrad thought of this.

So the superconducting layer is bonded to stainless steel tape, which is strong, not brittle, and will go down to cryogenic temperatures without problems. Most previous superconducting "wires" had brittle ceramic insulation, which was hard to wind into magnets.

This is really clever.
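
The current-divider argument is easy to check numerically; a minimal sketch with made-up resistance values:

    # Current divider: two resistances in parallel share current in inverse
    # proportion to their resistance.
    def branch_currents(i_total, r1, r2):
        i1 = i_total * r2 / (r1 + r2)   # current through r1
        i2 = i_total * r1 / (r1 + r2)   # current through r2
        return i1, i2

    # A (nearly) zero-resistance superconducting path next to a stainless steel path:
    print(branch_currents(1000, 1e-12, 1e-3))
    # -> (~999.999999 A, ~0.000001 A): essentially all of the current takes the
    #    superconducting path, even though the steel's resistance is also tiny.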



I think there's another level to this, when you put a superconductor in contact with a metal there's a proximity effect that lowers the Tc of the superconductor. If you're using a superconductor that has a 4K Tc then you cannot afford to have your superconductor be weakened by the proximity effect. So the insulation is to prevent the proximity effect from happening.

Here, since you're using a much higher Tc material, if Tc is lowered by a few K you still have plenty of thermal budget, allowing you to take the hit from the proximity effect but be able to save on insulation.



The other thing the normal conductor does is save the coil from exploding if the superconductor quenches and loses its superconductivity (1/2 L I^2 can be a lot of energy). Old low Tc superconductors used copper for this, I believe.
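
For a sense of scale, here is that formula with purely illustrative numbers (the inductance and current below are invented, not any real magnet's values):

    # Stored magnetic energy: E = 1/2 * L * I^2
    inductance_h = 0.1                  # hypothetical coil inductance, henries
    current_a = 40_000                  # hypothetical operating current, amps
    energy_j = 0.5 * inductance_h * current_a ** 2
    print(f"{energy_j / 1e6:.0f} MJ")   # -> 80 MJ that has to go somewhere during a quench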


Do any of these principles apply to liquid nitrogen applications?

The article outlines a plate design that incorporates steel/REBCO windings and cooling channels for liquid helium that seems to be modular and easily serviced.

REBCO is actually listed as a high-Tc material in the wiki below, but requires liquid helium.

If this were reworked into a liquid nitrogen design, could it reach 20 Tesla?

https://en.m.wikipedia.org/wiki/High-temperature_superconduc...



I think a better question is whether it's compatible with hydrogen, because 20K is plenty cold enough for REBCO, just not for classic Type-1 superconducting wire in magnets.


There is a layer of steel on the bottom that provides the structure for the tape and a layer of silver on top of the REBCO ceramic layer for electrical connection, but the whole tape is coated in copper which is the secondary conductor. When superconducting, the tapes, which are wound in a coil, carry the current and the copper acts as an insulator between tapes, but if it stops superconducting, the copper can become a resistive conductor which helps prevent quenching.




During a quench, almost all of the current is carried by the copper cap.

https://ieeexplore.ieee.org/document/10316632



Stainless steel. Which is a perfectly good insulator when the alternative is REBCO.


Much greater than previous superconductors I would assume.


I saw this presentation by Dennis Whyte years ago and it left me really optimistic about Sparc/Arc's chances...

He goes through a lot of the interesting design details, including why REBCO is such a huge difference and will make their commercial reactor design possible:

https://www.youtube.com/watch?v=KkpqA8yG9T4



The only thing is that there are tons of problems beyond just having strong magnets. Using REBCO tape for the magnets indeed solves the problems of reactor size by letting the reactor be much, much smaller than say ITER. But it does not solve the other problems of reactor design and operation.


I'm sure that's true.. but at least in this presentation they seem to have thought through most of them? I want to believe :)


There are huge hurdles in maintaining reactors, even once you get them going in the first place. The fusion reaction causes many components of the reactor to decay and lose performance or fail.


It's kind of funny that that image looks an awful lot like the housing of a Mazda rotary engine.

I'd be curious if MIT / Commonwealth Fusion have made any progress on their "remountable magnets" approach.

The problem is that if you have to replace parts of the reactor, the magnets get in the way. I think they found that you could just solder the ends of REBCO film to each other and even though the solder isn't superconducting, the layer can be thin enough that it doesn't matter. So, they were looking at technology to take a tokamak apart by desoldering the REBCO, then they can replace whatever parts are inside, then solder it back together when they're done.

That seemed like a pretty hard problem, and not one they're trying to solve with SPARC (which is only meant to be a cheap prototype reactor, not something designed for a long service lifetime).



> It's kind of funny that that image looks an awful lot like the housing of a Mazda rotary engine.

You at least have to give the poor guy credit :). Wankel engines can be found in all sorts of places, including planes!

https://en.m.wikipedia.org/wiki/Wankel_engine



Sure, but I've only seen variations of the Mazda 13B disassembled up close. I don't know if other Wankel engine housings look quite the same.


It’s interesting that it took 3 years from the physical test until paper publication.


Publications often take a long time. Particularly novel topics that are difficult to review due to a lack of comparative material or willing expertise.

The fact that it took 3 years is a positive, as it means multiple parties rigorously reviewed and approved the material.



Url changed from https://futurism.com/the-byte/mit-magnets-ready-fusion, which points to this.

(Normally we prefer the best third-party article to a press release but the balance of information here points the other way.)

Edit: I should say that if there's a more informative third-party article, we can switch to that.



If someone is knowledgeable on this: What does this mean for ITER? Any chance to retrofit new magnetic technology into ITER's design?


I'm not super knowledgeable, but my understanding is that ITER was designed around the best superconductors known at the time, which meant making the whole thing really big.

A benefit of stronger magnets is that you can make it a lot smaller, and presumably cheaper -- which is what MIT has been working on with SPARC.

I don't know if it's practical to upgrade the magnets on ITER, but I'd expect it to be really expensive -- especially if they've already manufactured/installed the old magnets.



It would be completely impractical, since ITER is not designed to withstand the much larger JxB forces (which scale as B^2).

In the high Tc high field designs, the mass of the metal supports for the magnets dominates the reactor mass.



Thank you, that's the kind of point I was looking for.

Why does JxB scale quadratically? Because a stronger field can contain (linearly) more current?



If you double the current J, you double the magnetic field B.

But the JxB force is now quadrupled, because you doubled both J and B.
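
Written out (standard magnetostatics, nothing specific to this design):

    B \propto I, \qquad
    F \propto J \times B \propto I \cdot B \propto B^{2}, \qquad
    p_{\mathrm{mag}} = \frac{B^{2}}{2\mu_{0}}

The last term is the magnetic pressure on the coil structure; at 20 T it is about 160 MPa, on the order of 1,600 atmospheres, which is why the support mass grows so quickly with field strength.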



It means nothing for ITER unfortunately. Changing the toroidal field strength implies a totally different plasma size, density, temperature, everything. ITER isn't designed for this strong of fields and the plasma they would confine.


Nope. And this is fine.

ITER is not a prototype of a commercial fusion reactor, it's a burning plasma laboratory. Its main goal is to study the plasma confinement technology.



ITER is also supposed to be the final step before a commercial reactor. If SPARC works well and they can keep the pace they're at to make ARC, then ITER might be a dead-end in that regard, no?


We already have enough experience that makes ITER obsolete.

Newer designs will have a much thinner central column and stronger magnets, resulting in much better confinement. And high-temperature superconductors will be an order of magnitude cheaper.

Right now, ITER is still needed to get information on plasma properties at the temperatures required for fusion.



ITER is supposed to have a successor, DEMO, which would be the final step before a commercial reactor. It is doubtful whether DEMO will ever be built.


BSCCO is superconducting at much higher temperatures. Why are they not using that?


"In practice, HTS are limited by the irreversibility field H, above which magnetic vortices melt or decouple. Even though BSCCO has a higher upper critical field than YBCO it has a much lower H (typically smaller by a factor of 100)[11] thus limiting its use for making high-field magnets."

https://en.wikipedia.org/wiki/Bismuth_strontium_calcium_copp...



20 K is high temperature? Well, I better take my mittens and ushanka off. I thought 77 K was the official threshold: https://en.wikipedia.org/wiki/High-temperature_superconducti....


You'd never operate a superconducting magnet at its critical temperature.

Check out figure 5 here. The relevant bit is the vertical slice along the left edge. Critical current density and internal magnetic field both go way up when you go below LN2 temps. You could make an LN2 HTS magnet; it just wouldn't be in the 1T+ class.

https://cds.cern.ch/record/2277484/files/Bruker%20HTS%20Fit....



...in thirty years.


People have been saying this for 5 years on every Commonwealth Fusion article as they hit every milestone exactly on time. It's happening this time; people rushing to make fusion jokes are getting downvoted because it's apparent you're just keyword-matching on "fusion" and then snarking.


I went to Abingdon School, which is quite close to Culham labs (JET) in Oxfordshire, in the mid 1980s. One of my A levels was Physics. We had some boffins from JET pop on over and do a presentation to us about what they were up to. I was one of the thickies. Some of my (then) compatriots are now dons and even proper scientists! I only managed a random walk into IT.

I dimly recall a graph showing progress on how they were able to fire up the thing and the energy out compared to energy in. That graph had quite a decay on it initially but a really long tail. That long tail is where the always 25 years off meme comes from.

Nuclear fusion as an energy source is a massively complicated beast. You might like to note, say, that construction of Hinkley Point C (a fission plant, Somerset, UK - just up the road from here) started in late 2018, was due to be commissioned in 2023, and is now due in 2030. You might also note that aircraft and the like tend to take 20+ years to go from concept to service. A nuclear fusion plant will take quite a while to commission once a concept has been thrashed out.

Perhaps CF have got it sorted but there is a damn good reason for the always 25 years off "joke" - I was told it over 30 years ago by physicists working and already long time served at the coal face.



Trust me, this time is different.


The problem I have can be seen by going back to the original ARC paper. That reactor was much smaller than ITER, and had 10x power density, but was still the mass of several WW2 destroyers (for net 190 MW(e)). Its volumetric power density is still 40x worse than an existing commercial PWR.


I think they're getting downvoted because the joke is getting really worn out and endlessly dismissive cynicism is just annoying.


> People have been saying this for 5 years on every Commonwealth Fusion article as they hit every milestone exactly on time.

Interesting, so what is their timeline for commercial fusion?



~2030 IIRC, unfortunately Google results are drowned out by N copies of this piece.

Ah, found this: - https://cfs.energy/news-and-media/commonwealth-fusion-system...

from broader context I remember, the 2025 "commercially relevant net energy" means "net positive, and not trivially so, i.e. not for 2 femtonanopicofemtoseconds for 1 of 30 trials if we had a better laser"



I really wish people involved in fusion research, and their university/corporate PR colleagues, had not spent the last forty years poisoning the fucking well. fusion power is still so far away that it can't even figure in to our plans for reducing emissions to zero, and yet people still publish articles like this to muddy the waters and make it sound like it matters in the short term at all.


The real question is whether it can ever compete against fission in terms of cost. Currently it looks like fusion power plants will be substantially more expensive than fission power plants.


How does that change if you account for waste disposal in the costs? From what I gather the nuclear waste of fusion has a much shorter half-life. That should make storage and disposal much easier.


Very little, in practice: the waste disposal of fission reactors is one of the ways that the energy density of nuclear fuel actually matters — there's hardly any by volume, even if you add up the entire world's nuclear waste.


Problem with fission waste is its super long decay duration. It will basically still be seriously dangerous way beyond the lifespan of its containers. We need the equivalent of the Manhattan project to build next generation facilities to burn down the waste. The original bomb scientists really wanted to go with thorium as it has several safety features. One, it needs to be pushed to criticality so it can't go runaway. Two, its waste is much safer in its duration. And now cost is becoming a factor, as solar, wind, battery storage, geothermal, ocean thermal and conservation all pile on with cost efficiencies that make fission rapidly too costly. The fission waste will still need to be processed and then stored in ways that can be hoped to be safe for eons.


> Problem with fission waste is its super long decay duration.

Some of it does; the decay chains are wildly variable. Fortunately, the longer the half life, the lower the power output. If Cobalt-60 and Plutonium-242 released the same energy per decay event and you had the same number of atoms at the start, the radiation from the cobalt would be 72,000 times as intense to start with, but equal after 84 years.
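
Those two figures follow from the half-lives alone (Co-60 ≈ 5.27 years, Pu-242 ≈ 375,000 years); a quick check:

    import math

    # Activity per atom is proportional to the decay constant ln(2) / half_life.
    t_half_co60 = 5.27         # years
    t_half_pu242 = 3.75e5      # years

    initial_ratio = t_half_pu242 / t_half_co60
    print(f"initial intensity ratio ~ {initial_ratio:,.0f}x")     # ~71,000x

    # Pu-242 barely decays on this timescale, so the ratio just falls with the
    # Co-60; it reaches 1 after log2(initial_ratio) Co-60 half-lives.
    years_to_equal = t_half_co60 * math.log2(initial_ratio)
    print(f"equal after ~{years_to_equal:.0f} years")             # ~85 years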

But the main point, all things nuclear are so energy dense that in practice there's (relatively speaking) so little of the waste it basically doesn't matter. I've been in single rooms large enough to contain all the world's high level nuclear waste.

> The original bomb scientists really wanted to go with thorium as it has several safety features. One it needs to be pushed to criticality so it can't go runaway. Two its waste is much safer in its duration.

That seems like a flaw for a bomb, which specifically relies on it going runaway.

But also, it's quite hard to crush uranium or plutonium hard enough to do that. Even with Chernobyl, the fuel was surrounded by stuff designed to make it more energetic (although not as energetic as the actual accident, obviously).



Batteries are hardly getting cheaper anymore. See https://www.energy-storage.news/lithium-battery-pack-prices-...

The allegedly low cost of solar energy comes from ignoring battery cost. And the relatively high cost of fission comes largely from overregulation.



Going up one year isn't something you should make a trend from, especially given the long term price declines.

Here's another graph of the same thing, but updated with the actual rather than predicted value for one year after your link was published — and the price decline is back: https://www.statista.com/statistics/883118/global-lithium-io...

> The allegedly low cost of solar energy comes from ignoring battery cost.

Can also solve the same issues with other storage options, hedge with other renewables that are also cheap, and electricity transport is a political issue rather than an economic or engineering problem.

> And the relatively high cost of fission comes largely from overregulation.

The low cost of fission comes from it being subsidised by its military value.

The regulations exist because even though the mean cost of accidents per TWh is tiny, the upper bound of the damage when they do happen is sufficient to bankrupt superpowers.



Is there any reason for this to be less heavily regulated than fission plants/waste? Seems like that’s the major driver of cost? Are ridiculously massive containment vessels less necessary, since it’s very off by default/without power input, rather than potentially very on?


Yes. See https://adamswebsearch2.nrc.gov/webSearch2/main.jsp?Accessio... "Rulemaking: Regulatory Framework for Fusion Energy Systems, NRC Public Meeting, July 12, 2023" and especially the graph on page 27, "Relative radiotoxicity" vs "Time after shutdown (years)"

Also see https://www.nrc.gov/docs/ML2325/ML23258A145.pdf "Fusion Systems Rulemaking: United States Nuclear Regulatory Commission Preliminary Proposed Rule Language". The NRC are already proposing to regulate fusion reactors much more like particle accelerators (which can produce medical radioisotopes, for example), and less like fission reactors.



Awesome, thanks!


Not much real reason for this to be less regulated. Still probably will be in practice, as fission is massively overregulated.


For all the pearl clutching about nuclear waste, dealing with it contributes very little to the cost of fission energy.


... which is considerably more expensive than solar + hydrogen.


Hydrogen for power storage is hardly used in practice, so it is probably quite expensive. Overall, solar energy seems very expensive when you factor in storage cost. Germany has very high energy prices, likely related to switching off fission power plants and heavily investing in intermittent energy sources like wind and solar.


Likely related to? As in you pulled this idea from thin air and pop science reading? Electricity prices are mostly political. Fossil has been heavily subsidised throughout its lifetime, as has nuclear. French prices are lower because their energy producer has 70 billion in debts, backed by the French state, in order not to pass the true price of nuclear on to consumers.

Hydrogen is not used yet, because we don't have enough renewables to have significant overproduction. For now storage is not needed.

Truly this is a topic where everyone feels comfortable to throw in strong opinions without having even the slightest idea of how any of it works.



Yes, electricity prices are political when you include disabling existing fission plants that are very cheap to operate (most of the cost comes from building them). Germany also has heavily subsidized renewable energy, but electricity prices are nonetheless so high that some major industry actually left the country.


Sorry but bullshit. Electricity prices are made up of taxes, subsidies and everything in between.

There is no causal relationship between high share of renewables and high electricity prices in Germany. If you have any argument or evidence that supports the causal link, please share. If not please acknowledge that you are operating on gut feeling without a factual basis.

Wholesale prices for electricity have been higher in France than in Germany; they have been much lower in Sweden, which already has more renewables in its system than Germany.



> only needs hydrogen atoms for fuel rather than rare and dangerous elements like uranium and plutonium

hydrogen isn’t exactly the safest element. not that any element is truly safe if used in the wrong way.



So he's pushing an energy solution that requires Tritium. One of the rarest elements of earth. And then has the nerve to accuse fission of relying on rare elements.


Why is it rare? Because it decays.

Why would anyone make a plan to use it as a fuel without a plan to generate it? They don't. You breed it in the blanket that collects the energy. The energy balance is quite positive.



And then, sometimes, you make it into keychains.

https://www.tecaccessories.com/collections/tritium-isotope-f...



Who is he?

Where did he push tritium?

Is the word tritium in the article? (Chrome can't find it.)

Is hydrogen more common than uranium?

Is hydrogen more common than plutonium?

What quote demonstrates "accuse", as in, "they had the nerve to accuse"?

These are honest questions, I hope you don't take offense.

Full disclosure: I live in Cambridge, MA, where MIT is located.



The only fusion pathway being seriously studied for energy in the short (read: next century) term is D-T fusion; it doesn't need to be stated.

Deuterium and Tritium are the fuels, they have to be. There are lots of other pathways, but they all require even higher temperatures/pressure than D-T, and we can't really sustain even D-T for any useful length of time.



Fortunately, T is a byproduct of DD fusion.

Unfortunately, as we don't have any commercially viable fusion reactors yet, we can't even guess if we can afford to make T in the future.



This is not true. Helion's very clever scheme appears to offer a near term route to using DD and D3He fusion. Helion could also burn DT, but they don't want to due to its practical drawbacks.

If you didn't know this you need to pay much more attention to them. IMO, they are the least dubious of all the fusion efforts at reaching a practical reactor, and that includes the DT efforts.

Helion is also going to produce prodigious amounts of excess tritium if their scheme works, which is good news for others trying the DT path (although how they compete against a workable advanced fuel reactor is not clear.)



How is it unsafe? I just electroplated on my desk and hydrogen was formed (right next to pure O2) with zero danger. Sure, an accumulated mix near stoichiometric might be an issue, but that is easy to avoid.




and you can split water and make neat bubbles of two elements. more likely to shock yourself than make a boom.

most elements are safe if you handle them properly. that assumes nothing outside of your control. and no one.



Why? MIT could just rein these people in, and the reputational damage would stop right away.


> "Overnight, it basically changed the cost per watt of a fusion reactor by a factor of almost 40 in one day,"

So there is this myth that fusion enables unlimited cheap energy. This is 100% false. It's not unlimited at all. Most concepts rely on using lithium-6. That's not a super common element.

But more importantly, it's not cheap. That should be obvious, if you could reduce the cost per watt 40x and still not be competitive.



Lithium is a super common element. Tens of thousands of tons are mined per year, and production is increasing rapidly to make electric car batteries. Lithium-6 is about 7.5% of natural lithium. Fusion reactors only need tons of it for the breeding blanket.


The original ARC design had 950 tonnes of FLiBe, which contains 115 tonnes of Li-6. This is a substantial fraction of the Li-6 produced for the entire US hydrogen bomb program, all for a single 190 MW(e) (net) reactor.
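
That tonnage follows from FLiBe's composition; a quick check assuming the lithium is fully enriched in Li-6:

    # Li-6 content of 950 t of FLiBe (2LiF-BeF2), assuming fully Li-6-enriched lithium
    m_li6, m_be, m_f = 6.015, 9.012, 18.998     # atomic masses, g/mol
    m_flibe = 2 * m_li6 + m_be + 4 * m_f        # ~97 g/mol
    li6_fraction = 2 * m_li6 / m_flibe          # ~12.4% by mass
    print(f"{950 * li6_fraction:.0f} t of Li-6 in 950 t of FLiBe")   # ~118 t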


115 tonnes of Li-6 is what you get by sifting through the lithium required for ~200k electric cars.

Tesla alone could provide material for nine such reactors annually.
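
A rough version of that arithmetic (the lithium-per-car figure below is an assumption, not from the thread; Li-6 is taken at its ~7.5% natural abundance):

    # Li-6 obtainable from the lithium that would otherwise go into ~200k EV packs
    cars = 200_000
    li_per_car_kg = 8            # assumed lithium content per EV battery pack
    li6_fraction = 0.075         # natural abundance of Li-6

    li6_tonnes = cars * li_per_car_kg * li6_fraction / 1000
    print(f"~{li6_tonnes:.0f} t of Li-6")          # ~120 t, same ballpark as the 115 t above

    # At ~1.8M cars/year (the rate implied by "nine such reactors annually"):
    print(f"~{1_800_000 / cars:.0f} blanket loads per year")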



Sure, the input feed is not the problem. The problem is actually separating it. The technology used for the H-bombs is no longer acceptable due to mercury pollution. World Li isotope separation capacity currently is in kilograms per year (to produce pure Li-7 for pH control in fission reactors.)

And then there's the beryllium. A single one of those ARC reactors would use something like 40% of the current world annual beryllium production.






