July 1st, 2024

Quantum is unimportant to post-quantum

Post-quantum cryptography is gaining attention for safety and flexibility advantages over classical methods. Transitioning to PQ standards proactively addresses risks from potential quantum advances.


Post-quantum (PQ) cryptography is gaining attention despite the absence of practical quantum computers. New PQ standards promise safety, resilience, and flexibility benefits over classical cryptography, so the transition matters even while debate over quantum computing's eventual impact continues. Today's public-key cryptography is exposed to potential quantum advances, and key sizes have already grown over the years in response to improving factoring and discrete-logarithm algorithms. Post-quantum cryptography introduces different underlying mathematical problems, modern design considerations, and use-case flexibility, aiming to address shortcomings observed in current public-key systems. Although the new standards are young and therefore carry uncertainty, the ongoing NIST post-quantum standardization effort works to mitigate the risks of adopting emerging cryptographic technology. The shift toward post-quantum cryptography represents a proactive approach to security in the face of evolving threats.

Related

Reconstructing Public Keys from Signatures

The blog delves into reconstructing public keys from signatures in cryptographic schemes like ECDSA, RSA, Schnorr, and Dilithium. It highlights challenges, design choices, and security considerations, emphasizing the complexity and importance of robust security measures.

DARPA's military-grade 'quantum laser' will use entangled photons to outshine

Researchers are developing a military-grade "quantum laser" funded by DARPA, using entangled photons for a powerful beam penetrating adverse weather. This innovation enhances precision, strength, and performance in challenging environments.

More Memory Safety for Let's Encrypt: Deploying ntpd-rs

Let's Encrypt enhances memory safety with ntpd-rs, a secure NTP implementation, part of the Prossimo project. Transitioning to memory-safe alternatives aligns with broader security goals, supported by community and sponsorships.

Confidentiality in the Face of Pervasive Surveillance

RFC 7624 addresses confidentiality threats post-2013 surveillance revelations. It defines attacker models, vulnerabilities, and encryption's role in protecting against eavesdropping, emphasizing Internet security enhancements against pervasive surveillance.

Latest Breakthrough from String Theory

Researchers introduced a new series from string theory to enhance pi extraction for quantum calculations. Despite media hype, doubts persist about its novelty and practical benefits, urging careful evaluation amid exaggerated coverage.

6 comments
By @adastra22 - 4 months
> But even if a quantum computer is never built, new PQ standards are safer, more resilient, and more flexible than their classical counterparts.

lol wat? Nothing could be further from the truth. Just a few years ago, two of the four finalists of the NIST PQC were broken so badly that you could break production key sizes in hours with a 10 year old Mac mini. Most PQC deployments aren’t broken because they double up with classical crypto systems so that you have to break both, but people who were using these schemes were basically wasting their time.
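The "double up" the commenter describes is the hybrid pattern now common in deployments: derive the session key from both a classical and a post-quantum shared secret, so an attacker must break both. A minimal sketch, with placeholder byte strings standing in for real X25519 and ML-KEM outputs and a plain hash standing in for a proper KDF:

```python
import hashlib

# Placeholder shared secrets -- in practice these would come from an
# X25519 exchange and an ML-KEM encapsulation, respectively.
classical_ss = b"\x01" * 32
pq_ss        = b"\x02" * 32

# Combine both secrets: recovering the session key requires knowing both
# inputs, so it stays safe as long as EITHER underlying scheme holds.
session_key = hashlib.sha256(classical_ss + pq_ss).digest()
print(session_key.hex())
```

Real hybrids feed both secrets through a proper KDF (e.g. HKDF) along with transcript context, but the security argument is the same concatenate-then-derive idea.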

Key sizes and signatures are typically much, much larger than classical cryptosystems.
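The size gap is easy to make concrete. The byte counts below are the published parameter sizes for the NIST-standardized PQ schemes (FIPS 203 for ML-KEM, FIPS 204 for ML-DSA) next to common classical counterparts; the comparison script itself is just illustrative:

```python
# Published parameter sizes, in bytes. Post-quantum values are taken from
# FIPS 203 (ML-KEM) and FIPS 204 (ML-DSA).
sizes = {
    #                public key      signature / ciphertext
    "X25519":      {"pk":   32, "sig_or_ct":   32},
    "Ed25519":     {"pk":   32, "sig_or_ct":   64},
    "ML-KEM-768":  {"pk": 1184, "sig_or_ct": 1088},
    "ML-DSA-65":   {"pk": 1952, "sig_or_ct": 3309},
}

for name, s in sizes.items():
    print(f"{name:11} pk={s['pk']:5d} B  sig/ct={s['sig_or_ct']:5d} B")
```

An ML-DSA-65 signature alone is roughly 50× the size of an Ed25519 signature.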

PQC systems lack the flexibility of classical cryptography. Generally speaking, the linear structure of classical schemes is what quantum computers exploit, and so it is gone from most (all?) PQC systems. That linear structure is what lets you do things like make a 2-of-2 by adding pubkeys and signatures. I'm not sure what is meant by flexibility if not stuff like this.
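The 2-of-2 trick the comment alludes to rests on the homomorphism pubkey(x1 + x2) = pubkey(x1) · pubkey(x2). A toy discrete-log group makes that structure visible (parameters are tiny and insecure, chosen only to illustrate the algebra):

```python
# Toy discrete-log group: a public key is g^x mod p. The map x -> g^x is a
# homomorphism, so the public key for x1 + x2 is the product of the two
# individual public keys -- the "linear structure" that Schnorr-style key
# and signature aggregation relies on. NOT secure at these sizes.
p = 0xFFFFFFFFFFFFFFC5   # prime modulus (2**64 - 59)
g = 5

x1, x2 = 123456789, 987654321   # two parties' secret keys
P1 = pow(g, x1, p)
P2 = pow(g, x2, p)

# Aggregate public key == public key of the summed secret.
print(P1 * P2 % p == pow(g, x1 + x2, p))   # True
```

Real multisignature protocols such as MuSig2 build on exactly this property (with extra machinery against rogue-key attacks); lattice-based schemes like ML-DSA offer no comparably simple aggregation.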

By @RcouF1uZ4gsC - 4 months
> But even if a quantum computer is never built, new PQ standards are safer, more resilient, and more flexible than their classical counterparts.

I disagree.

First of all, they are far more inconvenient. The key and signature sizes are bigger. With Curve25519 ECC we can have 256-bit keys.

Secondly, flexibility is mentioned, but I think the last several years have shown that flexibility is a huge vulnerability in crypto systems.

Finally, I feel pretty confident that the NSA doesn’t know too much more than we do about RSA or ECC which have been studied, studied, implemented, and analyzed for decades in the open. I worry with Post-Quantum algorithms, that there is a good chance that the NSA knows far more about them than does the public cryptography community.

By @tyoma - 4 months
For a long time I wondered why there was such a big push for PQ even though there was no quantum computer and a reasonably working one was always 15 years in the future.

… or was there a quantum computer somewhere and it was just kept hush hush, hence the push for PQ?

The answer turns out to be: it doesn’t matter if there is a quantum computer! The set of PQ algorithms has many other beneficial properties besides quantum resistance.

By @ohxh - 4 months
> These are all special instances of a more general computational problem called the hidden subgroup problem. And quantum computers are good at solving the hidden subgroup problem. They’re really good at it.

I assume they mean the hidden subgroup problem for abelian groups? Later they mention short integer solutions (SIS) and learning with errors (LWE), which by my understanding both rely on the hardness of the shortest vector problem, corresponding to the hidden subgroup problem for some non-abelian groups. I haven't read into this stuff for a while, though.

By @ukdghe - 4 months
Considering that the question of whether classical methods can break the current breed of secure algorithms is still open, I see PQC as a hedge against the possibility that P = NP.

By @beloch - 4 months
"Of course, one big concern is that everybody is trying to standardize cryptosystems that are relatively young. What if the industry (or NIST) picks something that’s not secure? What if they pick something that will break tomorrow?"

If information remains interesting to an adversary long-term, they can always archive classically encrypted ciphertext and apply future hardware and algorithmic advances to cracking it.

This is why "post-quantum" cryptography may well be "quantum cryptography". QC must be broken at the time of transmission for an adversary to obtain any information at all. If you're trying to communicate something that will remain sensitive long-term, with QC you aren't betting against the future producing something surprising.

QC already works; it's getting cheaper, faster, and more network-friendly. It's not ready for the internet yet, but it's getting there. We don't need it for information that changes every few years, like credit card info, but that's not all people use cryptography for, even today.