
2026-03-03
In this #KEYMASTER session, Sven Rajala and Chief PKI Officer Tomas Gustavsson discuss why standards and interoperability are foundational—not just for cryptography, but for the modern world as a whole.
From transferring funds across borders to securing online communications, global systems only function because they follow shared rules. In cryptography, those rules ensure that products, vendors, and protocols can work together securely and reliably.
As organizations accelerate their move toward post-quantum cryptography (PQC), these principles matter more than ever.
As Tomas points out, standards and interoperability are so fundamental that we rarely notice them until they're missing: without them, everyday operations like cross-border payments or secure online communications simply wouldn't function. In cryptography specifically, standards are what allow different vendors, systems, and protocols to work together safely and predictably.
One of the most important concepts discussed is crypto agility. The term is often misunderstood: it doesn't mean you can magically flip a switch and start using a new algorithm overnight. Rather, crypto agility is about designing systems that can evolve.
Protocols like TLS are good examples: during the handshake, clients and servers negotiate which cryptographic algorithms they both support. This allows systems to be upgraded gradually, maintaining backward compatibility and avoiding disruptive, ecosystem-wide changes.
In short, future-ready cryptography is about smooth migration, not sudden replacement.
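The negotiation model described above can be seen directly in Python's standard `ssl` module. This is a minimal sketch: the context below represents the "menu" of cipher suites a client is prepared to offer, and tightening policy is a configuration change rather than a protocol redesign.

```python
import ssl

# During the handshake, a TLS client advertises the cipher suites it
# supports and the server selects one that both sides implement. The
# list a context is prepared to offer can be inspected locally:
ctx = ssl.create_default_context()
offered = [c["name"] for c in ctx.get_ciphers()]
print(f"{len(offered)} cipher suites offered")

# Agility in practice: retiring older options is a one-line policy
# change on the context, not a redesign of the protocol or of the
# applications running above it.
ctx.minimum_version = ssl.TLSVersion.TLSv1_3  # refuse pre-1.3 handshakes
```

The same mechanism is what allows PQC key-exchange options to be rolled out gradually: clients that support them advertise them, servers that don't simply pick something else, and nothing breaks in the meantime.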
One of the biggest early concerns around PQC was performance. Many expected significant slowdowns compared to classical algorithms like RSA.
In practice, testing has shown something different: for most use cases, performance has been very good—and in many cases better than RSA.
Of course, performance depends on the specific environment and use case. But overall, fears of widespread performance degradation have largely proven unfounded.
As organizations transition to post-quantum cryptography, the biggest adjustments are not just technical—they’re architectural.
PQC introduces several foundational shifts that require teams to rethink how cryptographic systems are structured and integrated. Here are a few examples:
Algorithms Are More Specialized: With RSA, a single algorithm could handle encryption, key exchange, and digital signatures. In the PQC world, those responsibilities are split across purpose-built algorithms: ML-KEM is designed for key establishment, while ML-DSA handles digital signatures. Systems built around the assumption that “one algorithm does everything” therefore need to be restructured.
Established Workflows May Not Map Directly: Many existing standards and processes were built around RSA’s flexibility. Certificate signing requests, protocol designs, and internal security architectures may require updates to align with the more specialized PQC model.
Signing Behavior Is Different: Classical approaches typically hash large data first and then sign the hash. Many PQC signature schemes instead operate directly on the input data. This can introduce size considerations and compatibility constraints—particularly for hardware security modules (HSMs)—and may require implementation adjustments or updated standards support.
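The role split described in the first point can be sketched as two deliberately different interfaces. The classes below are toys, not real cryptography (the "crypto" is just hashing, and a real KEM encapsulates against a public key rather than a shared secret); they only illustrate the API shapes that ML-KEM-style and ML-DSA-style algorithms expose, and why code written against RSA's all-purpose key pair must now pick a role.

```python
import hashlib
import os
from dataclasses import dataclass

# Toy sketch only: the point is the *shape* of the two APIs. A KEM
# exposes encapsulate/decapsulate and never signs; a signature scheme
# exposes sign/verify and never transports keys.

@dataclass
class ToyKEM:
    secret: bytes  # toy shortcut: a real KEM encapsulates to a public key
    def encapsulate(self) -> tuple[bytes, bytes]:
        """Return (ciphertext, shared_secret)."""
        ct = os.urandom(16)
        ss = hashlib.sha256(self.secret + ct).digest()
        return ct, ss
    def decapsulate(self, ct: bytes) -> bytes:
        return hashlib.sha256(self.secret + ct).digest()

@dataclass
class ToySig:
    key: bytes
    def sign(self, msg: bytes) -> bytes:
        return hashlib.sha256(self.key + msg).digest()
    def verify(self, msg: bytes, sig: bytes) -> bool:
        return self.sign(msg) == sig

kem, dsa = ToyKEM(b"kem-key"), ToySig(b"sig-key")
ct, ss = kem.encapsulate()
assert kem.decapsulate(ct) == ss                  # key establishment only
assert dsa.verify(b"hello", dsa.sign(b"hello"))   # authentication only
```

In an RSA-centric design, both jobs above would have been served by one key pair; the restructuring work is deciding, for every place a key is used today, which of the two roles it actually plays.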
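The signing difference is easiest to see as a data-flow question: how many bytes must reach the signing operation? In the sketch below, an HMAC stands in for the signature primitive purely to keep the example runnable; the point is the size of the input each flow hands to the signer (and hence, in practice, to an HSM).

```python
import hashlib
import hmac

KEY = b"demo-signing-key"      # stand-in secret; HMAC plays the role of "sign"
message = b"x" * 10_000_000    # a large payload (10 MB)

# Classical hash-then-sign: the caller hashes locally and ships only a
# 32-byte digest to the signing operation.
digest = hashlib.sha256(message).digest()
sig_over_digest = hmac.new(KEY, digest, hashlib.sha256).digest()
print(f"bytes sent to signer (hash-then-sign): {len(digest)}")

# A scheme defined over the message itself (as pure ML-DSA is) needs
# the full payload to reach the signing operation.
sig_over_message = hmac.new(KEY, message, hashlib.sha256).digest()
print(f"bytes sent to signer (direct signing):  {len(message)}")
```

This is why the article flags HSMs specifically: moving the whole payload across the HSM boundary is a different engineering problem than moving a digest. FIPS 204 also defines a pre-hash variant (HashML-DSA) partly to accommodate this kind of constraint.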
The urgency of migrating to quantum-resistant cryptography has accelerated standardization efforts. While this speed is necessary, it also introduces risk. Some PQC mechanisms are already in large-scale production use even though the underlying standards are not fully finalized.
This likely won’t cause major issues—but it’s a risk worth acknowledging. Thorough interoperability testing and transparent documentation become even more critical in this environment.
As cryptography enters a post-quantum era, the real challenge isn’t just adopting new algorithms—it’s ensuring they integrate seamlessly into the systems the world already depends on.

