
Quantum code breaking? You'd get further with an 8-bit computer, an abacus, and a dog

(2025/07/17)


The US National Institute of Standards and Technology (NIST) has been pushing for the development of post-quantum cryptographic algorithms since 2016.

"If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use," NIST [1]explains in its summary of Post-Quantum Cryptography (PQC).

Peter Gutmann, a professor of computer science at the University of Auckland, New Zealand, thinks PQC is bollocks – "nonsense" for our American readers – and said as much in a 2024 [2]presentation [PDF], "Why Quantum Cryptanalysis is Bollocks."

Gutmann's argument is simple: to this day, quantum computers – which he regards as "physics experiments" rather than pending products – haven't managed to factor any number greater than 21 without cheating.

Quantum computers and PQC are both enormously complex. But the common process for cracking [6]RSA public key encryption is relatively straightforward. Given a public key modulus n, you have to factor it into two prime numbers, p and q, such that n = p * q.

When n is 21, p is 3 and q is 7, for a not very secure 5-bit RSA implementation. But when n is a 1,024-bit or 2,048-bit number, finding the two prime factors requires a tremendous amount of computational power.
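For toy moduli like 21, the factoring step can be sketched in a few lines; the function below (its name is ours, for illustration) cracks such numbers instantly by naive trial division, while the same loop on a 1,024-bit modulus would need on the order of 2^511 iterations:

```python
def factor(n: int) -> tuple[int, int]:
    """Recover p and q from a toy RSA modulus n = p * q by trial division.

    Instant for n = 21; utterly hopeless for a 1,024-bit modulus, where
    the loop would have to count up to roughly the square root of n.
    """
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2  # RSA primes are odd, so skip even candidates
    raise ValueError("n has no nontrivial odd factors")

print(factor(21))  # (3, 7)
```

The asymmetry between multiplying p and q (trivial) and recovering them from n (infeasible at scale) is the whole basis of RSA's security.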

NIST's concern, raised by many computer scientists, is that a quantum computer might one day be capable of running [8]Shor's algorithm, which is essentially a shortcut for finding the prime factors of a large integer. Were that to happen, data protected by insufficiently complex encryption keys would be at risk of exposure. To avoid that purported possibility, the US standards organization has been shepherding the development of various quantum-resistant encryption algorithms, such as HQC (Hamming Quasi-Cyclic), CRYSTALS-Kyber, CRYSTALS-Dilithium, SPHINCS+, and FALCON. These are being positioned as replacements for current algorithms like RSA.
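The quantum speedup in Shor's algorithm comes entirely from one subroutine: finding the multiplicative order of a random base a modulo n. The rest is classical number theory, and can be sketched as below (function names are ours; the order is found by classical brute force here, which is exactly the part a quantum computer is supposed to do exponentially faster):

```python
from math import gcd

def order(a: int, n: int) -> int:
    """Brute-force the multiplicative order of a mod n -- the one step
    Shor's algorithm delegates to quantum period finding."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int) -> tuple[int, int]:
    """Shor's classical reduction: an even order r with a^(r/2) != -1 mod n
    yields the factors gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n)."""
    g = gcd(a, n)
    if g > 1:                      # lucky guess: a already shares a factor
        return g, n // g
    r = order(a, n)
    if r % 2 == 1:
        raise ValueError("odd order -- retry with another a")
    y = pow(a, r // 2, n)
    if y == n - 1:
        raise ValueError("trivial square root -- retry with another a")
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    return min(p, q), max(p, q)

print(shor_classical(15, 7))  # (3, 5) -- the case IBM demonstrated in 2001
```

For n = 15 the order loop takes four steps; for a 2,048-bit modulus it would take longer than the age of the universe, which is why the period finding must happen on quantum hardware for the attack to matter.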

NIST, which did not immediately have someone available for comment, says some engineers claim quantum code-cracking could be a thing within two decades. Since that's about as long as the deployment of modern public key infrastructure has taken, the standards body argues it's time to start preparing.

Judging by the present state of quantum computers, the case for new algorithms looks less compelling. In a tongue-in-cheek [9]paper [PDF] released in March, Gutmann and co-author Stephan Neuhaus, lecturer of computer science at HS Bund in Brühl, Germany, argue that it's possible to replicate the code-cracking capabilities of current quantum computers with a VIC-20 8-bit home computer from 1981, an abacus, and a dog.

PQC...isn't mathematics or engineering, it's augury: 'A great machine shall arise, and it will cast aside all existing cryptography, there shall be Famine, Plague, War, and a long arable field.'

The paper notes that [10]IBM in 2001 implemented Shor's algorithm on a seven-qubit quantum computer, demonstrating the factorization of the number 15. A decade later, researchers managed to use a quantum computer [11]to factor the number 21. IBM [12]tried to factor 35 in 2019 [PDF] but basically failed – the algorithm worked only 14 percent of the time due to rampant qubit errors.

Researchers affiliated with Shanghai University claim to have used a quantum computer from D-Wave, which specializes in quantum annealing machines tuned for specific optimization problems, to [13]factor a 2,048-bit RSA integer.

[14]

But according to Gutmann and Neuhaus, the RSA number evaluated was the product of two prime factors that were too close together.

The number had been stacked like a parlor magician's card deck, the computer scientists explain in their paper: "Quantum factorization is performed using sleight-of-hand numbers that have been selected to make them very easy to factorize using a physics experiment and, by extension, a VIC-20, an abacus, and a dog."

"Since n (public key) = p * q, the square root of n will give you p and q to within one or two bits if there's only one or two bits difference between them," Gutmann told The Register. "This is why standards for RSA, like FIPS 186 which the paper references, require that they differ by at least 100 bits, i.e. that they're 2^100 (1.3 x 10^30, which Google tells me is called a nonillion) or more apart so you can't get an approximation to them using a square root operation."
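The classic attack on close primes is Fermat's factorization method, which dates to the 17th century. The sketch below (function name ours) recovers two primes only 30 apart in a single loop iteration, with no quantum hardware in sight:

```python
from math import isqrt

def fermat_factor(n: int) -> tuple[int, int]:
    """Fermat's method: write n = x^2 - y^2 = (x - y)(x + y).

    When p and q are close together, x = (p + q) / 2 sits just above
    sqrt(n), so the search terminates almost immediately."""
    x = isqrt(n)
    if x * x < n:
        x += 1                     # start at the ceiling of sqrt(n)
    while True:
        y2 = x * x - n
        y = isqrt(y2)
        if y * y == y2:            # found a perfect square
            return x - y, x + y
        x += 1

# Two primes only 30 apart -- one iteration recovers them.
n = 1000003 * 1000033
print(fermat_factor(n))  # (1000003, 1000033)
```

This is precisely why FIPS 186, cited in the paper, demands that p and q differ by at least 2^100: a gap that large pushes Fermat's method back into the infeasible range.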

An analog in the AI world would be touting the benchmark testing prowess of an AI model [15]trained on the questions in benchmark tests.

Trevor Lanting, chief development officer at D-Wave, told The Register: "Based on our assessment, this research does not represent a new fundamental breakthrough in capability, it's an exploration of some previous work in using annealing QC to factor small numbers. The research explores factoring capability, which we've long said is a problem set that both annealing and gate model quantum systems could address.

"Breaking modern encryption would require quantum processors many orders of magnitude larger than today's scale: there will be no threat to encryption for many years. Moreover, there are post-quantum encryption protocols available. D-Wave does not specifically focus on cryptography, but our technology has been used to power intrusion and threat detection applications."

Amid claims of "[16]quantum supremacy" by Google, Microsoft's disputed [17]Majorana breakthrough, University of Illinois computer science professor Daniel Bernstein's arguments that quantum computers [18]shouldn't be written off, and University of Texas computer scientist Scott Aaronson's view that quantum computing "[19]is on the threshold of becoming real," Gutmann remains skeptical that anyone will be doing any meaningful code cracking with "physics experiments" any time soon.


The Register asked Gutmann to elaborate on when the implausibility of quantum cryptanalysis became apparent.

"There wasn't really any specific time, although it has become more and more obvious over time, with the failure of any genuine (non-sleight-of-hand) quantum cryptanalysis to appear, that it's about as real as fusion-powered electricity too cheap to meter and all the other decades-old tech pipe dreams," Gutmann explained.

"I'm an empirical gnostic, and with standard crypto that works well, it's based on mathematics and engineering, we can look at the math and look at the engineering (computing power and so on) and draw a line through the data points on a graph and say 'this will be good until about this time.'

"PQC on the other hand isn't mathematics or engineering, it's augury: 'A great machine shall arise, and it will cast aside all existing cryptography, there shall be Famine, Plague, War, and a long arable field.'

"The Bollocks talk drew a line through the two PQC data points we have which indicate that we'd get to the same level of code breaking that we have today with standard computers in about 2,000 years' time, but even those data points are from sleight-of-hand factorizations, not legitimate applications of Shor's algorithm to recover two unknown factors as needed to break RSA (this is why in the paper we suggest evaluation criteria for quantum cryptanalysis claims that should be resistant to sleight-of-hand tricks).

"In practice we have zero data points, which is pretty good evidence that we're not getting anywhere with physics experiment-based cryptanalysis."

Gutmann added as an aside that we also have the same number of data points for faster-than-light space travel, Star Trek-style transporters, and any number of other high-tech dreams.

Asked whether his skepticism of PQC extends to quantum computers in general, Gutmann said [24]The Australian Strategic Policy Institute addressed the matter better than he could:

Contrary to popular claims, quantum algorithms don't 'try all solutions at once.' Instead, they carefully manipulate qubits to amplify the probability of measuring a useful answer at the output, while suppressing the probability of measuring any other answer. It's roughly similar to a magician's card trick: rather than checking every card in the deck for the one that was chosen, the magician uses a clever sequence of steps to make the chosen card more likely to appear without actually knowing what it is.

Compared to classical algorithms, quantum algorithms can more effectively scale with problem size. This means that for a sufficiently large problem, they may be able to compute an answer more efficiently – in time, energy or cost – than classical alternatives. For smaller problems, it is likely that classical computers will retain a clear comparative advantage for the foreseeable future due to the overheads in quantum computing.

We inquired how IT security professionals should interpret the move to "Post-Quantum Cryptography" and whether there's anything to be gained from the transition.

"No, in fact there's a lot to be lost," Gutmann replied. "We currently have a multibillion-dollar global cybercrime industry that's built on the failure of encryption to provide the protection that it's supposed to, and instead of fixing that problem we're investing a vast amount of effort into swapping out our crypto for new stuff that's inefficient and difficult to work with and that offers no more protection than the old stuff (there's a reason why it had been ignored for decades before quantum cryptanalysis came along to give it a reason to exist, it's really not very practical or usable). Switching to PQC is just a distraction from having to fix the actual hard problem." ®




[1] https://csrc.nist.gov/projects/post-quantum-cryptography

[2] https://www.cs.auckland.ac.nz/~pgut001/pubs/bollocks.pdf


[6] https://cryptobook.nakov.com/asymmetric-key-ciphers/the-rsa-cryptosystem-concepts

[8] https://en.wikipedia.org/w/index.php?title=Shor%27s_algorithm&oldid=1298350007

[9] https://eprint.iacr.org/2025/1237.pdf

[10] https://www.ibm.com/quantum/blog/factor-15-shors-algorithm

[11] https://arxiv.org/abs/1111.4147

[12] https://arxiv.org/pdf/1903.00768

[13] https://www.sciopen.com/article/10.26599/TST.2024.9010028

[15] https://www.theatlantic.com/technology/archive/2025/03/chatbots-benchmark-tests/681929/

[16] https://www.theregister.com/2019/10/22/ibm_poopoos_google_quantum_claims/

[17] https://www.theregister.com/2025/03/12/microsoft_majorana_quantum_claims_overshadowed/

[18] https://blog.cr.yp.to/20250118-flight.html

[19] https://scottaaronson.blog/?p=8329

[24] https://www.aspistrategist.org.au/horses-for-courses-where-quantum-computing-is-and-isnt-the-answer/




Wonderful paper

SVD_NL

While it's not my field of research, so I can't comment on the scientific validity, that paper is the best laugh I've had in a while! A+ satire.

> "Finally, we refer to a dog as “a dog” because even the most strenuous mental gymnastics can’t really make it sound like it’s a computer."

Also in the footnotes:

> "We use the UK form “factorise” here in place of the US variants “factorize” or “factor” in order to avoid the 40% tariff on the US term"

Re: Wonderful paper

b0llchit

Now I'm really wondering how much El Reg is paying in tariffs all the time with its business importing and exporting letters and words.

Bullshit

Jedit

I grow increasingly convinced that every new development in computing is bullshit. People don't want to advance anything, they just want to find the exciting new buzzword technology or application that will let them be on the ground floor this time and make the big money. It's been going on since the dotcom bubble. Well, since the invention of money, probably, that being human nature, but that's the point where computer tech development seemed to switch focus from solving existing problems to creating "solutions".

Perhaps I'm just getting old and cynical.

Re: Bullshit

NoneSuch

> But when n is a 1,024-bit or 2,048-bit number, finding two prime factors requires a tremendous amount of computational power.

Which is why the NSA has more computing power than Twitter, Facebook and Amazon combined.

Parkinson's Third Law of Computing: Encryption can only delay access to information.

Re: Bullshit

Wellyboot

Indeed, any electronic based system has an indefinite secure lifetime before mere brute force will rip it apart.

Any sensible 1 bods trying to keep things secret from TLAs will use multiple layers, book code 2 buried in gibberish 3 and then send it via parcel mail as part of the packing for some bit of tat lost in the vast volume of tat flowing from a really big river... Until the TLAs find one part of the chain they can follow and you’re toast.

Physical things require physical intervention, until scanning gets to the resolution needed for reading a closed book via the microscopic differences in signal return between two different visual appearances.

1 Luckily not that common among fanatics.

2 trawl 2 nd hand bookshops for out of print 1950s pulp, & duplicate

3 see 2 preferably in foreign language

did I hear a helecop...

Re: Bullshit

Rich 2

I’m with you. It seems that there has been almost no genuine innovation or fresh ideas in the computing world (well, certainly in the software world) for decades.

And the “innovation” that there has been seems to have the sole purpose of making money (no surprise there), making everything 1000% more complicated than it used to be (I’m thinking about most of the network and user management tools. Oh, and System Dunce), pointless (let’s make a shiny new “skin” for [add application of choice. And System Dunce]) or just plain shit (anything that MS vomit out. Oh and…).

I appreciate I’m being nostalgic here but the days of Early PCs, Apple IIs, and the countless “home” computers around the same time seemed to show massively more invention and clever thinking than anything else over the last 20 years - they were genuinely exciting times. And all without requiring gigabytes of memory to print “hello world”

Re: Bullshit

Ace2

Yeah, well, I can netboot a server I’ve never even seen to install a new OS image, whether or not the current one panics on boot. Some things have improved 10X.

Re: Bullshit

Al fazed

I agree with you all the way to the end, where I'd like to add:

it seems now that they are also creating the problems.

They have perhaps discovered the UK's (secret) Economic Policy. That there's more money to be made fighting ghosts in the future machine, by selling solutions to problems which don't yet exist.

Really clever shit........

ALF

"it seems now that they are also creating the problems."

Jedit

Well, yes. If your entire business model is selling ninja repellent, then your business will collapse as soon as people realise there aren't any ninjas. To avert this, there are three paths:

1) Convince people that the repellent is working really well and that's why they haven't seen any ninjas.

2) Convince people that they haven't seen any ninjas because ninjas are extremely stealthy and that's why you need the repellent.

3) Found and train a ninja clan and kill a few doubters to set an example.

However convincing you are, sooner or later 1 and 2 will both lead to your market realising that they're being fooled. By which time you'd better be far, far away. The typical conman recognises this. But many techbros fall for their own scam and believe they actually can make a working product. And when they can't, they have to try creating a problem that it can fix.

Re: "it seems now that they are also creating the problems."

Guy de Loimbard

Good sir/madam, please have a pint for your japery ==>

I do love a well constructed and artistic piss take of the tech bro stack we're surrounded by now.

As for ninjas, where did I leave my sword, shuriken and sneaky attire????

Re: Bullshit

Tron

The UK doesn't have an economic policy. They are just winging it, week to week.

Add to that list quantum code breaking, nuclear fusion, AI that is really AI, and honest politicians.

And yes, we haven't seen much innovation in computing since the Blu-ray disc and 3D printer. Everything else has just been a speed/bandwidth bump. We could be using 15 year old computers if GAFA didn't force OS upgrades on us, and for anyone daft enough to subscribe to SaaS webware, 25 year old systems or dumb terminals.

When tech companies get to a certain point, the innovators cash in and put their feet up. They are replaced by off-the-shelf corporate types who want a quiet life and a big bonus. Innovation dies. Companies are led by lawyers. Maintaining the status quo is just easier. Hence we haven't made the next gen shift to distributed systems, where we could have had an entirely new computing revolution.

What we are using now is just getting worse. If anyone out there has a few quid, we could go back to the basics of computing and rebuild - a new OS, simpler and more resilient, legacy file format compatibility for uptake, protection from hacking built in by restricting access to system internals from networks, and none of the walled garden crap that GAFA use as a weapon. There would be a world of innovation and new stuff that we could build, including distributed software, distributed DNS, distributed social media and other services, ad hoc networking, and the like. Clustering on retail systems with plug-in processor boards. Self-adaptive processors.

Re: Bullshit

Long John Silver

Wise words, indeed.

Taking forward the theme of corporate sluggishness and complacency after the entrepreneurs have left (e.g. Boeing, Microsoft, Apple, and Intel) suggests rethinking the respective roles of market-capitalism and of public/private ownership.

For instance, Richard Wolff (the prominent Marxist economist), someone to whom I listen with interest, advocates industry and commerce being placed into ownership by workers' co-operatives. I profoundly disagree with that, if it is posed as a blanket solution to many present day ills.

However, I consider it a sensible imposition on companies which have passed their stage of innovative fervour, and now mostly churn out 'widgets' (or software) into well-established markets. The supposed 'democracy' of co-operatives works in that context, despite 'democracy' overall being an agency for maintaining an unimaginative status quo in the interests of powerful, mainly self-seeking, individuals.

Yet, we must look to imaginative entrepreneurs, people willing to place their own skin in the game, to seek new opportunities for productive enterprise; albeit under well-regulated market-economics instead of its debased current ethos enshrined in neoliberalism and the psychotic thinking of the late Ayn Rand.

Taking further the matters raised in your final paragraph, yes, there is the potential for global intellectual/commercial renaissance. A key enabling factor will be the removal of ideas, and the means to further them in practical applications, from walled gardens: abandon so-called 'intellectual property' (IP) and replace it with entitlement to attribution. The collapse of IP and rentier economics is already in progress: it is largely the result of technological advances involving the digital representation of data and their ready access via the Internet. BRICS will hasten the economic revolution. Cottage industries shall replace behemoths, and thrive.

Huh?

Anonymous Coward

But how are we going to convince people to give us gazzillions of dollars to replace all their existing kit and algorithms if there's no burning platform????

- Anonymous Qbit sales manager

Re: Huh?

Caver_Dave

Ask Microsoft - they seem to be managing the replacement requirement very well at the moment.

Re: Huh?

Al fazed

Scare the shit out of them.

That usually works.

It has since Roman times at least.

ALF

Discredited Gutmann

Anonymous Coward

So another paper from the same Peter Gutmann who, back in the day, discredited himself by writing papers full of nonsense about Windows Vista, Microsoft's upcoming Windows version at the time.

And he subsequently admitted he did so without even using the operating system in question himself, despite it being available at least as a public beta at the time.

I'm sure there is much to criticise with all the claims around quantum computing, but I'm not sure a borderline fraudster is the best person to do this.

Re: Discredited Gutmann

Anonymous Coward

Harsh. You may be more familiar with his work than I am, but I found his paper (essay? collection of notes?) on X.509 in practice very useful.

I’m anon because we recently implemented PQC in a product and we’re hoping for a press release from industry partners fairly soon, so I have a dog in this fight. But I don’t disagree with him.

PQC is fairly easy for an engineer; we found ML-DSA a drop-in replacement for RSA or EC, albeit with a much bigger signature size. But it’s not a magic bullet, and a lot of the things I would like to focus on are the deeply unsexy parts of crypto: plugging holes in implementation and process. That’s harder and less rewarding, but vastly more useful than PQC in the short and mid term. But we can’t ignore the long term either, so PQC research needs doing too.

Re: Discredited Gutmann

Al fazed

But if the adversary is already on the machine, software or hardware, any type of encryption is not going to provide the secrecy you are looking for.

Stenography is the only thing that I have found which actually works, as long as you don't give the recipient the info via eMail or text, etc.

ALF

Re: Discredited Gutmann

Androgynous Cupboard

Whatever use case you have for steganography would probably be fun to describe, but not useful in practice. It’s not going to help you secure comms with your bank or digitally sign a contract.

Re: Discredited Gutmann

Throatwarbler Mangrove

Stenography or steganography? While hand coded ciphers using a one time pad would evade electronic decoding, it's rather time consuming for long messages. Conversely, I expect that steganography is bandwidth intensive. I guess you could combine the techniques and employ an animation studio to produce your encrypted messages.

Re: Discredited Gutmann

ChoHag

Hmmm... Published paper, or AC?

--->

Points too often forgotten?

Long John Silver

Encryption is reversible obfuscation.

Given enough time, computing power, and patience, a pathway back to the original message (a sequence of binary digits straightforwardly representable as comprehensible text and/or images) will be found. This may take minutes or millennia. Unless the coding method, e.g. RSA public key encryption, is known, or drawn from a set of documented possibilities, each of which may be tried in sequence or on parallel machines, there is no obvious stopping rule for an attempted decryption other than obtaining intelligible text/images. Therefore, 'brute-force' decryption methods, necessary when the underlying logic of encryption is not pre-specified, require step-by-step scrutiny of results until the output is recognisable.

'Brute-force' decryption entails human or algorithmic (e.g. an 'AI') input at every stage or, equivalently, scanning a long sequence of results. Even should clear text emerge, its import may be obscure because the sender of the message could have assumed the recipient had prior knowledge of context and intention.

Also, regardless of the encryption algorithm - public or bespoke - there remains the time period during which the decrypted message retains practical or evidential use. For instance, on the battlefield, transmitted orders may require secrecy for minutes or hours, nothing more when the 'cat is out of the bag'; the fact of an encrypted, but not quickly decipherable, message being intercepted is the only guide for action by its unintended recipients. For other state and for commercial secrets, the essential timescale for secrecy may be longer but, even so, it's not indefinite; that is, unless the reputations of the people deploying secrecy will be at stake: perhaps a political lifespan or two.

These considerations lead to the not often mentioned application of cost-utility and opportunity-cost for individuals and institutions engaged in eavesdropping.

Re: Points too often forgotten?

Guy de Loimbard

Very well put Long John Silver.

Encryption is useful for a period of time and that is, as you rightly say, variable by application.

Having worked in various scenarios that require cryptographic tools to secure information or data, I've found the hardest thing to really measure is the value of that data, or the expected loss or impact if it were decrypted, which provides the scale for how far up the chosen crypto standard needs to go.

So, as always in government type applications and scenarios, just whack it up to the top setting, even if the info is only valuable for a few hours.

Re: Points too often forgotten?

LVPC

>> unless the reputations of the people deploying secrecy will be at stake: perhaps a political lifespan or two.

Today's politicians don't have any shame, that's just for us little people, because we're not powerful enough to get away with lies.
