“You need to update your code and you need to update your key material.” – Stefan Kölbl
The conversation around post-quantum cryptography often stays at the algorithm layer. Which scheme will be standardized? How large are the keys? What is the performance overhead?
Inside a hyperscale environment, those are necessary questions but they are not the decisive ones.
In this episode of Shielded: The Last Line of Cyber Defense, Stefan Kölbl, Information Security Engineer at Google, explains why migrating to quantum-safe cryptography is fundamentally an operational problem. The real challenge is not selecting an algorithm. It is building an organization capable of changing cryptography safely, repeatedly, and at scale.
Algorithms Are the Easy Part
Stefan breaks migration into two components: updating code and updating key material.
The first problem can often be solved centrally. At Google, developers are not expected to reason about low-level cryptographic primitives. They rely on higher-level libraries and APIs that abstract those decisions away. When designed properly, these abstractions allow cryptographic changes to happen without rewriting large portions of application logic.
“You need to update your code and you need to update your key material… a large part of your company should not have to worry about updating code. This should be managed by one team.”
That architectural discipline reduces algorithm migration to an implementation detail. When PQC standards evolve, the underlying cryptography can change without forcing thousands of engineers to revisit their code.
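The pattern Stefan describes can be pictured as a thin facade: application code asks for "encrypt" and "decrypt", while a single security-owned module decides which primitive backs those calls. The sketch below is illustrative only; the `Aead` interface and `new_aead` factory are invented names (this is not Google's actual library API), and the toy cipher inside is a deliberately insecure placeholder standing in for whatever primitive the central team currently blesses.

```python
import hashlib
import hmac
import os

class Aead:
    """Interface application code programs against; no algorithm names leak out."""
    def encrypt(self, plaintext: bytes, associated_data: bytes = b"") -> bytes: ...
    def decrypt(self, ciphertext: bytes, associated_data: bytes = b"") -> bytes: ...

class _ToyStreamAead(Aead):
    """Placeholder primitive (XOR keystream + HMAC tag). NOT secure; it stands
    in for a real AEAD chosen by the central security team."""
    def __init__(self, key: bytes):
        self._key = key

    def _keystream(self, nonce: bytes, n: int) -> bytes:
        out, counter = b"", 0
        while len(out) < n:
            out += hashlib.sha256(self._key + nonce + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:n]

    def encrypt(self, plaintext, associated_data=b""):
        nonce = os.urandom(16)
        body = bytes(a ^ b for a, b in zip(plaintext, self._keystream(nonce, len(plaintext))))
        tag = hmac.new(self._key, nonce + body + associated_data, hashlib.sha256).digest()
        return nonce + body + tag

    def decrypt(self, ciphertext, associated_data=b""):
        nonce, body, tag = ciphertext[:16], ciphertext[16:-32], ciphertext[-32:]
        expected = hmac.new(self._key, nonce + body + associated_data, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("authentication failed")
        return bytes(a ^ b for a, b in zip(body, self._keystream(nonce, len(body))))

def new_aead(key: bytes) -> Aead:
    """The only place that names an algorithm. Swapping the primitive here
    migrates every caller at once, with no application code changes."""
    return _ToyStreamAead(key)
```

Because callers hold only the `Aead` interface, replacing the body of `new_aead` with a post-quantum or hybrid construction is invisible to them, which is exactly what makes the code half of migration centrally solvable.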
However, this abstraction only solves half the problem.
Key Rotation Is Where Migration Succeeds or Fails
Once algorithm selection is abstracted, key material becomes the true constraint. Post-quantum migration exposes whether an organization can rotate keys reliably and transparently. If key ownership is unclear, if lifecycle tracking is incomplete, or if rotation is manual and brittle, PQC becomes painful very quickly.
“And what we continuously see is also hard cases in PQC often tied to cases which have bad key management.”
In practice, migration friction is rarely about lattice cryptography. It is about governance, automation, and lifecycle discipline. If keys cannot rotate cleanly today, they will not rotate cleanly tomorrow regardless of which algorithm is selected. Post-quantum cryptography becomes a stress test of crypto hygiene.
PQC as a Security Hygiene Upgrade
One of Stefan’s strongest themes is that PQC should not be treated as a one-time event. Organizations often approach it as a compliance milestone: migrate, certify, declare completion. Stefan argues that this framing misses the larger opportunity.
He says, “We really want to get out also that consider this more also as an opportunity to improve your security hygiene… It’s not like you do this once then you’re done.”
The migration effort can instead be used to institutionalize automated rotation, improve key visibility, centralize cryptographic abstractions, and embed crypto agility into development frameworks. If done well, PQC strengthens resilience far beyond quantum threats. It prepares organizations for future algorithm transitions, deprecations, and emerging risks.
Store Now, Decrypt Later Is Treated as Real
For Google, the threat model driving early migration was not theoretical. Adversaries can harvest encrypted traffic today and store it until a cryptographically relevant quantum computer can decrypt it, which means data with long confidentiality lifetimes is already exposed.
This risk assessment justified early hybrid deployments even before final NIST standards were published. Prior Chrome experiments had already tested post-quantum key exchanges at internet scale. Hybrid cryptography allowed Google to layer quantum-safe mechanisms on top of classical security without degrading protections.
The decision was not driven by panic; it was driven by risk management. When credible threat models align with manageable mitigations, waiting for perfect certainty can introduce more exposure than early deployment.
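At its core, a hybrid deployment derives the session secret from both a classical and a post-quantum exchange, so security holds as long as either survives. The combiner below sketches that idea with placeholder shared secrets and an RFC 5869-style HKDF; the label `b"hybrid-kex-demo"` is invented, and real deployments (such as the hybrid key exchanges Chrome experimented with) fix the exact encodings and labels in their specifications.

```python
import hashlib
import hmac

def hkdf(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF extract-and-expand (RFC 5869 shape) over SHA-256."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    okm, block, i = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([i]), hashlib.sha256).digest()
        okm += block
        i += 1
    return okm[:length]

def hybrid_session_key(classical_ss: bytes, pq_ss: bytes, transcript: bytes) -> bytes:
    # Concatenate both shared secrets: an attacker must break BOTH the
    # classical and the post-quantum exchange to recover the keying material.
    return hkdf(b"hybrid-kex-demo", classical_ss + pq_ss, transcript)
```

This is why hybrids allowed Google to layer quantum-safe mechanisms on top of classical security: if the post-quantum scheme later turns out to be weak, the classical input still protects the derived key, and vice versa.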
Encryption in Transit Is Manageable. Signing Is Not.
A critical nuance in the conversation is the difference between encryption and signing. Encryption in transit, especially under TLS 1.3, benefits from built-in agility. Key exchanges are ephemeral. Cipher suites can evolve. Performance overhead has proven manageable in practice. Signing infrastructure, however, presents a different class of problem.
“In some sense we are lucky here that the most highest risk cases, maybe the easiest to patch. While with the signing cases, we see much more problems… firmware signatures where devices have resource constraints… sometimes even keys are burned into hardware so you cannot switch them.”
Firmware signatures, hardware roots of trust, and long-lived embedded devices operate on decade-scale lifecycles. Keys may be physically embedded in silicon. Rotation may require hardware refresh cycles rather than configuration updates.
For many organizations, this long tail of signing infrastructure will define the complexity of their PQC roadmap.
Inventory Is Necessary but Not Sufficient
Most migration journeys begin with inventory. Discover the RSA keys. Identify ECC certificates. Build dashboards.
Stefan acknowledges the value of visibility, but he also highlights its limits: “You start like, I want to know where the RSA ECC signing keys. But very quickly you end up with a long list… you need so much more context.”
A list of keys does not tell you which systems are most critical, which keys can rotate easily, or which are tied to hardware constraints. Without lifecycle context and ownership clarity, inventory becomes static reporting rather than actionable intelligence. Agility requires understanding not only where keys exist, but how they behave over time.
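The difference between a raw list and actionable intelligence is metadata: who owns the key, how it rotates, whether it is bound to hardware, and how critical the system is. A small sketch of what "inventory plus context" might look like; the fields and algorithm names are illustrative, not a standard schema.

```python
from dataclasses import dataclass

@dataclass
class KeyRecord:
    key_id: str
    algorithm: str      # e.g. "RSA-2048", "ECDSA-P256", "AES-256"
    owner: str          # team accountable for rotation
    rotation: str       # "automated" | "manual" | "hardware-bound"
    criticality: str    # "high" | "medium" | "low"

def migration_hard_cases(inventory: list[KeyRecord]) -> list[KeyRecord]:
    """The long tail: quantum-vulnerable keys that cannot simply be re-issued
    by flipping a config, i.e. anything not on an automated rotation path."""
    vulnerable = {"RSA-2048", "ECDSA-P256"}
    return [k for k in inventory
            if k.algorithm in vulnerable and k.rotation != "automated"]
```

A dashboard built on records like these answers planning questions ("which high-criticality, hardware-bound signing keys are quantum-vulnerable?") rather than merely counting certificates.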
Performance Surprises at Scale
Even when planning is thorough, scale introduces unexpected interactions. Stefan shares an example where a post-quantum key exchange implementation caused performance regressions, not because the algorithm was computationally heavy, but because of how it allocated memory.
The issue was subtle. Benchmarks appeared clean. The algorithm itself was fast. But memory allocation patterns interfered with cache behavior in a specific service environment.
This illustrates a broader truth: cryptography does not operate in isolation. It interacts with system architecture, hardware behavior, and workload patterns. Comprehensive regression testing is not optional in large-scale PQC rollouts. It is foundational.
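The specific regression Stefan describes is internal to Google, but the general failure mode, where per-operation allocation looks free in an isolated microbenchmark yet degrades a busy service, can be sketched by comparing a fresh buffer per call against a caller-provided one. Python timings will not reproduce cache effects; the point is the harness shape: measure inside a sustained loop that resembles real traffic, and validate correctness alongside performance.

```python
import timeit

def xor_fresh(data: bytes, key: bytes) -> bytes:
    # Allocates a new output buffer on every call.
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

def xor_reused(data: bytes, key: bytes, out: bytearray) -> None:
    # Writes into a caller-provided buffer, avoiding a fresh allocation per call.
    for i, d in enumerate(data):
        out[i] = d ^ key[i % len(key)]

def bench(fn, *args, loops: int = 1000) -> float:
    # Benchmark under a loop that mimics sustained service traffic,
    # not a single cold call; regressions often only appear under load.
    return timeit.timeit(lambda: fn(*args), number=loops)
```

In a real rollout the equivalent step is running the new key exchange through the service's own load tests and watching end-to-end latency and memory metrics, precisely because a clean standalone benchmark proved insufficient here.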
The Real Goal: Crypto Agility
Stefan’s perspective ultimately converges on a broader strategic objective: crypto agility.
Post-quantum cryptography will not be the last algorithmic transition organizations face. Standards will evolve. New primitives will emerge. Threat models will shift.
Organizations that centralize cryptographic abstractions, automate rotation, and continuously monitor system behavior will be able to adapt without disruption. Those that treat PQC as a narrow upgrade may achieve short-term compliance but remain structurally brittle.
You can hear the full conversation with Stefan Kölbl on Shielded: The Last Line of Cyber Defense, available on Apple Podcasts, Spotify, and YouTube.
About Stefan Kölbl
Stefan Kölbl is an Information Security Engineer at Google, where he has been deeply involved in the company’s internal post-quantum cryptography rollout. His work spans early hybrid deployments, encryption-in-transit migration, key lifecycle management, and performance validation at hyperscale.
Stefan brings an operator-level perspective to quantum-safe migration, focusing on crypto agility, secure-by-default developer frameworks, and scalable key management architecture. His experience includes navigating PQC implementation prior to final NIST standardization and addressing real-world constraints such as signing lifecycles, hardware-bound keys, and system-level performance interactions.

