One year ago, the release of NIST’s post-quantum cryptography (PQC) standards was a milestone in the world of cybersecurity. It was the result of years of collaboration between cryptographers, engineers, and industry leaders — and it gave organizations a clear foundation to start securing digital infrastructure against quantum-era threats.
However, developing robust standards was just the beginning. The implementation phase is now fully underway, and with a clear ten-year transition timeline in place, the focus is shifting to a critical question: what does good PQC actually look like in practice?
Implementation is everything
Axel Poschmann, VP Product, explains: “As robust as these standards are, they are only as strong as their implementation.”
While NIST’s selection of algorithms (such as ML-KEM and ML-DSA) provides a solid foundation, the way these algorithms are integrated into real-world products will ultimately determine their effectiveness.
PQC is not a like-for-like swap for RSA or ECC
There’s no universal blueprint for implementation — PQC needs to be carefully tailored to suit the device, the use case, and the data it is securing.
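One reason the swap isn't like-for-like is that many early deployments run hybrid: a classical key exchange and a post-quantum KEM side by side, with both shared secrets fed into one key-derivation step so the session stays secure if either algorithm holds. The sketch below illustrates that concatenate-then-KDF pattern using only the Python standard library; the function name and the stand-in byte strings are illustrative, and a real deployment would take the inputs from an actual ECDH exchange and an ML-KEM encapsulation.

```python
import hashlib
import hmac

def combine_shared_secrets(ecdh_secret: bytes, mlkem_secret: bytes,
                           context: bytes = b"hybrid-kex-v1") -> bytes:
    """Derive one session key from a classical and a post-quantum
    shared secret: the result is safe as long as either input
    remains secret (the common hybrid-key-exchange pattern)."""
    ikm = ecdh_secret + mlkem_secret
    # HKDF-style extract-then-expand built from HMAC-SHA-256.
    prk = hmac.new(context, ikm, hashlib.sha256).digest()
    return hmac.new(prk, b"session-key" + b"\x01", hashlib.sha256).digest()

# Stand-in values; real code would obtain these from an ECDH
# exchange and an ML-KEM encapsulation/decapsulation.
ecdh_secret = b"\x01" * 32
mlkem_secret = b"\x02" * 32
session_key = combine_shared_secrets(ecdh_secret, mlkem_secret)
```

The point of the pattern is that the derivation step, not the individual algorithms, defines the security of the session key, which is exactly where per-device tailoring (memory budget, hash choice, transcript binding) happens.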
Why edge cases matter
One of the most insightful ways to understand what ‘good’ PQC looks like is to observe extreme edge cases — such as:
- Ultra-small embedded devices
- Ultra-fast networking modules
These use cases present some of the toughest PQC challenges:
- Size constraints: Devices like IoT sensors or smart cards have minimal processing power and memory.
- Speed requirements: Network appliances demand cryptography that doesn’t compromise performance.
- Longevity: Many of these devices have long life cycles (often 10+ years), leaving them vulnerable if PQC isn’t built in from the outset.
Solving PQC implementation for these edge cases isn’t just about tackling the hardest problems; it sets a precedent for efficient, scalable, and secure solutions across the entire digital supply chain.
2035 is not that far away
With NIST’s 2035 deadline for RSA and ECC phase-out firmly in place, organizations can no longer afford to treat PQC as a future problem. In the coming year, we expect to see:
- IT decision-makers shifting priorities, ensuring PQC is foundational for any new product or procurement process.
- A growing demand for tailored, high-performance PQC implementations across sectors like finance, telecoms, automotive, defence, and critical infrastructure.
- A heightened focus on crypto agility to future-proof systems against evolving threats and standards.
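Crypto agility, the last point above, is usually realized in code as a level of indirection: applications name an algorithm in configuration and call it through a common interface, so migrating from a classical scheme to ML-DSA touches a registry entry rather than every call site. A minimal sketch of that registry pattern, with toy hash-based stand-ins in place of real signature schemes (the names `toy-ecdsa` and `toy-ml-dsa` are purely illustrative):

```python
import hashlib
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass(frozen=True)
class SignatureScheme:
    name: str
    sign: Callable[[bytes, bytes], bytes]          # (private_key, msg) -> sig
    verify: Callable[[bytes, bytes, bytes], bool]  # (public_key, msg, sig)

_REGISTRY: Dict[str, SignatureScheme] = {}

def register(scheme: SignatureScheme) -> None:
    _REGISTRY[scheme.name] = scheme

def get_scheme(name: str) -> SignatureScheme:
    return _REGISTRY[name]

# Toy stand-ins: real entries would wrap an ECDSA library today and
# an ML-DSA implementation after migration; only the registry key
# named in configuration changes, not the application code.
def _toy_sign(sk: bytes, msg: bytes) -> bytes:
    return hashlib.sha256(sk + msg).digest()

def _toy_verify(pk: bytes, msg: bytes, sig: bytes) -> bool:
    return sig == hashlib.sha256(pk + msg).digest()

register(SignatureScheme("toy-ecdsa", _toy_sign, _toy_verify))
register(SignatureScheme("toy-ml-dsa", _toy_sign, _toy_verify))

# Application code selects the algorithm by name, not by code path:
scheme = get_scheme("toy-ml-dsa")
sig = scheme.sign(b"device-key", b"firmware-image")
ok = scheme.verify(b"device-key", b"firmware-image", sig)
```

The design choice is that no call site hard-codes an algorithm, which is what lets a system track evolving standards without a rewrite.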
The time to scale is now
PQC is no longer confined to academic discussions or future-facing R&D labs. With parts of our cybersecurity infrastructure already PQC-enabled (for example, browsers, messaging apps, and video conferencing tools), interoperability is fast becoming a pressing issue.
The next challenge is scaling PQC adoption across the complex, multi-layered digital supply chain, from hardware and firmware to cloud services and embedded systems.
Those who lead the way will:
- Ensure compliance and future interoperability
- Minimize long-term costs and integration challenges
- Position themselves as trusted suppliers in a post-quantum world
So what does good PQC look like?
Good PQC is robust, agile, and fit for purpose. It’s embedded in products and systems from the ground up, not bolted on as an afterthought. It considers the unique constraints of each use case and ensures that cryptographic strength doesn’t come at the expense of performance, size, or scalability.
At PQShield, we’re partnering with organizations to deliver exactly that — real-world, standards-compliant PQC implementations that are ready for the demands of the quantum era.
Is your supply chain ready?