UK splashes £45M on AI supercomputer to help crack fusion power

(2026/03/16)


The UK government is splashing out £45 million (c $60 million) on a new AI-driven supercomputer designed to help scientists model the chaotic physics of nuclear fusion, with the system expected to come online this summer at the UK Atomic Energy Authority's (UKAEA) Culham campus.

The machine, called Sunrise, is being pitched as the world's most powerful AI supercomputer that is dedicated specifically to fusion energy research. Funded by the Department for Energy Security and Net Zero (DESNZ), the 1.4MW system is slated to begin operating in June and will form the first major piece of infrastructure in what ministers describe as the UK's planned "AI Growth Zone" at Culham in Oxfordshire.

Fusion research has long relied on large-scale simulations to understand the behavior of superheated plasma and the extreme materials in experimental reactors. The idea behind Sunrise is to combine high-performance computing with physics-informed AI models, allowing researchers to run more detailed simulations and develop digital twins of complex fusion systems before attempting costly physical experiments.
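
As an illustrative sketch only (not drawn from the Sunrise system or UKAEA's actual codes), a physics-informed model typically augments an ordinary data-fitting loss with a penalty on the residual of the governing equations, steering the model toward physically consistent solutions. A toy version for a 1D diffusion equation, with all names and values hypothetical, might look like:

```python
import numpy as np

# Toy "physics-informed" loss for u_t = kappa * u_xx on a space-time grid.
# total loss = data-mismatch term + PDE-residual penalty.
# Everything here is illustrative, not taken from any real fusion code.

def pde_residual(u, dx, dt, kappa):
    """Finite-difference residual of u_t - kappa * u_xx.

    u has shape (n_time, n_space); forward difference in time,
    centered second difference in space, evaluated on interior points.
    """
    u_t = (u[1:, 1:-1] - u[:-1, 1:-1]) / dt
    u_xx = (u[:-1, 2:] - 2.0 * u[:-1, 1:-1] + u[:-1, :-2]) / dx**2
    return u_t - kappa * u_xx

def physics_informed_loss(u_model, u_data, dx, dt, kappa, weight=1.0):
    """Data misfit plus weighted mean-squared PDE residual."""
    data_loss = np.mean((u_model - u_data) ** 2)
    phys_loss = np.mean(pde_residual(u_model, dx, dt, kappa) ** 2)
    return data_loss + weight * phys_loss
```

The physics term is what distinguishes this from plain curve fitting: a candidate solution that matches the data but violates the diffusion equation is penalized, which is the general idea behind the "physics-informed AI" being pitched here.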

According to the government, the system will deliver up to 6.76 exaFLOPS of AI-accelerated modeling performance. That figure refers to AI workloads rather than the traditional supercomputing benchmarks used in global rankings, but it still represents a significant increase in modeling capability for the UK's fusion research programs.
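
For rough context (our back-of-envelope arithmetic, not a government figure): dividing the quoted AI throughput by the stated power draw gives the system's nominal efficiency in AI-precision operations per joule.

```python
# Illustrative arithmetic on the two quoted figures only.
ai_flops = 6.76e18      # 6.76 exaFLOPS, AI/low-precision metric
power_watts = 1.4e6     # stated 1.4 MW system power draw

ops_per_joule = ai_flops / power_watts
print(f"{ops_per_joule:.2e} AI ops per joule")  # ≈ 4.83e12
```

Note this is not comparable to Top500 FP64 efficiency numbers, since the 6.76 exaFLOPS figure refers to low-precision AI workloads.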

The machine will incorporate AMD [4]EPYC processors and AMD Instinct GPU accelerators running on Dell PowerEdge infrastructure, with WEKA providing the storage platform. Intel is also supporting the project, alongside the University of Cambridge and the UKAEA.

Officials say the system will help tackle several key challenges in fusion research, including modeling plasma turbulence, developing reactor materials, and advancing tritium fuel breeding technologies needed for future fusion systems.

Dr Rob Akers, director of computing programs at the UKAEA, said the system is intended to bring an "Apollo program" style approach to fusion development by allowing researchers to test and refine designs in a virtual environment before building them in the real world.

"Sunrise will bring that capability to fusion by combining high-fidelity simulation with physics-informed AI to develop predictive digital twins that reduce the cost, risk, and time of learning that would otherwise require expensive and time-consuming physical testing," he said.

[6]Imagine there's no AI. It's easy if you try

[7]Trump Media jumps aboard the speculative nuclear fusion bandwagon

[8]AI isn't throttling HPC. It is HPC

[9]Brit boffins teach fusion plasma some manners with 3D magnetic field

The supercomputer will support several UK fusion initiatives, including the LIBRTI program, which focuses on tritium fuel-cycle technologies, and the government's flagship [10]STEP project, a prototype spherical tokamak power plant that Britain hopes to build in Nottinghamshire in the 2040s.

Sunrise also fits into a broader push by the UK government to expand its domestic AI and supercomputing capacity. Earlier this year, ministers [11]confirmed a separate £36 million (c $48 million) investment in the Cambridge supercomputing center, while Culham is expected to become a hub for AI-driven scientific computing tied to energy research.

Whether AI can meaningfully speed up the notoriously slow march toward commercial fusion power remains an open question. For now, the UK is betting that more computing power might help crack one of physics' most stubborn problems a little faster. ®



[4] https://www.theregister.com/2026/02/25/amd_edge_sorano/

[6] https://www.theregister.com/2025/12/29/four_tech_trends_2025/

[7] https://www.theregister.com/2025/12/18/trump_media_group_tae_fusion_firm/

[8] https://www.theregister.com/2025/11/11/ai_hpc_opinion_piece/

[9] https://www.theregister.com/2025/10/21/ukaea_fusion_plasma_magnets/

[10] https://www.theregister.com/2025/09/11/us_fusion_power_funding/

[11] https://www.cam.ac.uk/news/government-funding-boost-for-cambridge-supercomputer



AI will not speed things up

may_i

Useful fusion power will continue to remain 30 years in the future.

Re: AI will not speed things up

I am David Jones

You’re too pessimistic. Just imagine how exciting it would be to live in a world where fusion power continually remains a mere 5 years away!!

Re: AI will not speed things up

Korev

With AI it'll only be three decades away though

Re: AI will not speed things up

Anonymous Coward

Wouldn't it make more sense to help ITERate joint research, rather than Grot Britain going its own silly Brexity way? [1]Oh…

(I suppose at least as a supercomputer centre, Sunrise can (hopefully) be used by and results shared with researchers elsewhere in Europe and beyond, but, given that tokamaks are very much not the cheapest doughnuts in the shop, it seems a bit silly to duplicate that particular effort separately.)

[1] https://en.wikipedia.org/wiki/ITER#cite_note-Sparkes-2024-14

Algorithms

Rich 2

One thing I find odd is the number of times someone takes an existing problem (such as nuclear fusion simulation, for example), then (apparently) just “adds some AI” and the result is immediately “better” (or at least “wow! It’s AI”)

I know stuff-all about nuclear fusion simulation but I bet the (non “AI”) algorithms used have taken many many years (decades) to develop. How come new “AI” algorithms (which are obviously “better”. Obviously) seem to be instantly available? Or at least available as soon as the new “AI” machine has been built? I’m thinking “AI” algorithms must be massively different when compared to the boring old algorithms so how come they seem to get developed and written in an afternoon?

Or is the “AI” bit just …well …total bollocks?

Re: Algorithms

elsergiovolador

I think the idea might be that public sector cannot hire competent people as that would involve massive change of pay scales. So the "hack" is to employ AI that supposedly has better reasoning skills than your average civil servant. Ironically you can see evidence of it by how enthusiastic the big wigs are about adoption of the AI. They see AI slop and they instantly think it is genius.

"Or is the “AI” bit just …well …total bollocks?"

Jedit

AI can be faster for iteration, as it can filter out pointless lines of enquiry rather than simply brute force everything. But that doesn't help to solve an unsolved problem, as without an extant solution it can't determine which lines are pointless.

As ever, AI cannot do anything that a human has not already done.

Re: "Or is the “AI” bit just …well …total bollocks?"

Headley_Grange

Read up on the protein folding that DeepMind did about 6 years ago.

https://www.science.org/doi/10.1126/science.370.6521.1144

When aimed at specific problems like this, with lots of possibilities, iterations and dead ends then AI can be crazy good. Like another OP, I know bugger all about fusion but I bet there are problems of magnetic flux and plasma flow that are too complex for current computation that a suitably trained AI could have a crack at.

Re: Algorithms

Anonymous Coward

I think the idea is focused on using specialized non-LLM AI methods to either [1]reduce the testing space for physical prototypes, or [2]increase the search space for influential parameters over which to optimize a design. In the Sandia case, it seems it helped them dig themselves out of some ' human intuition ' corner they'd painted themselves in earlier (or somesuch), thanks to AI's unbridled irrationality (iiuc) -- ymmv.

[1] https://www.theregister.com/2026/02/08/machine_learning_battery_development/

[2] https://www.theregister.com/2026/01/26/sandia_ai_agents_feature/

Computer

elsergiovolador

It's great that we are buying a computer.

It's bad that we are not making a computer.

Not much "so what" here?

Like a badger

Most of this seems to have come straight from the relevant government press release, and I'm left wondering what £45m buys the taxpayer (other than about 150 metres of HS2)?

So does £45m actually buy any worthwhile scale of supercomputer?

How does 6.76 exaflops compare to other energy research computing around the world?

Is it likely enough to achieve anything worthwhile?

Or is this just part of a broader "£40m here, £40m there" programme in which our poorly qualified government hand out insignificant sums to pretend that the UK invests in science?

Re: Not much "so what" here?

elsergiovolador

Also computer is one thing, but how are they going to solve PEBCAK.

Andy The Hat

Is 1.4MW particularly beefy for a thunderous, brain stomping, Om-challenging AI entity that's not based on ARM or RISCV power sippers? Sounds a bit wussy to me.

HuBo

Looks like 1.4MW may slot this Sunrise somewhere between [1]Top500's #22 Venado (GH200) with 98 PF/s at 1.7MW and #33 El Dorado (MI300A) with 68 PF/s at 1.1MW. Using AMD GPUs sounds like the right choice to me here, to hedge AI bets by maintaining proper [2]FP64 performance (vs [3]Ozaki ). MI430X in particular would be very nice for this imho.

[1] https://top500.org/lists/top500/list/2025/11/

[2] https://www.hpcwire.com/2026/03/13/amd-hints-at-big-fp64-increases-in-mi430x-gpu-as-ozaki-underwelms/

[3] https://www.theregister.com/2026/01/18/nvidia_fp64_emulation/

Crysis

Roj Blake

Can it run it?

Brit AI will solve this first.

Tron

If they spend enough on this system, it will prove beyond all doubt that it will never be commercially viable.

The Nobel prize for Weltschmerz will be coming to Blighty.

"I teleported home one night
With Ron and Sid and Meg.
Ron stole Meggie's heart away
And I got Sidney's leg."

- A poem about matter transference beams.