5 Steps for Implementing the New Post-Quantum Cryptography Standards

Develop new strategies to mitigate quantum risks, map vulnerabilities, prioritize high-value assets, prototype PQC impacts, and enhance cryptographic agility.

Last month, NIST standardized the first three public key algorithms in its post-quantum cryptography (PQC) suite. This marks a turning point in cybersecurity and opens the door for implementation of this next-generation cryptography. Leaders can begin deploying the new PQC standards through five key steps: 

  1. Plan for change. Cryptography exists across multiple network layers, allowing systems to encrypt and decrypt information in different ways to meet different goals, from ensuring confidentiality and integrity to providing mechanisms for authentication and nonrepudiation. Modern cryptography depends on mathematics and the assumption that certain types of problems -- at certain scales -- are infeasible given the limits of today’s computing power. That’s changing with quantum computing. Quantum computers enable fundamentally different approaches to computation, solving certain problems at speeds that were believed to be impossible. Quantum computing may drive exciting breakthroughs that benefit healthcare, climate, and materials science, but it also presents an unprecedented threat to many of the cryptographic algorithms at the heart of modern cybersecurity. It’s difficult to overstate the scope of the changes introduced by quantum technologies. Leaders should plan for change by developing strategies to mitigate the cybersecurity risks these technologies introduce. 

  2. Understand the attack surface. Despite early warnings that all public key cryptography will be at risk in the quantum era, replacing vulnerable cryptography will not be easy. Most organizations do not have a catalog of where these algorithms exist within their networks, systems, and applications. Over the past few years, several mandates have pushed federal agencies to develop these critical inventories, including NSM-10, the Quantum Computing Cybersecurity Preparedness Act, and OMB M-23-02. Agencies should review any existing inventory to ensure it provides a comprehensive foundation for the next steps in the PQC transition. Many may find that more work is needed to strengthen those inventories before moving to the next stage of their PQC journey. A small discovery sketch illustrating one way to begin that inventory appears after this list. 

  3. Prioritize high-value assets. The nature and timeline of the quantum threat to encryption make prioritization paramount in any PQC transition. That prioritization should consider what many have come to call the “Hold Now, Decrypt Later” threat: adversarial behavior driven by the understanding that it is no longer a question of if quantum computers will break public key encryption, but when. That premise creates a new sense of urgency around cybersecurity risk because some information must remain secure for decades. If an adversary penetrates a network today, encryption may prevent immediate access to sensitive information, but agencies can no longer assume that defense will stand the test of time. Adversaries can exfiltrate data and the quantum-vulnerable keys currently protecting it, hold it, and decrypt it later when quantum computers reach sufficient capacity to break public key cryptography. Anything exfiltrated from a network before it is re-encrypted with PQC can be decrypted in the future, and agencies may not see the national and economic security implications of this attack vector for years. A simple triage sketch based on this threat appears after the list. 

  4. Prototype critical applications. NIST has now standardized three public key algorithms in its PQC suite and will soon release draft standards for a fourth. Supported by nearly a decade of testing and validation by the cryptographic community, these four are widely regarded as the best available defense against classical and quantum attacks. But strong algorithms alone are not enough. PQC algorithms must be implemented carefully to avoid opening new attack vectors, such as side-channel attacks, and in ways that are sensitive to security and performance tradeoffs. PQC algorithms rely on different mathematics than current cryptography. Those differences make them resistant to quantum attack, but they introduce network and infrastructure challenges through increased latency, increased bandwidth, lack of interoperability or backward compatibility, and more. Prototyping enables the performance and interoperability testing organizations need to confirm that PQC algorithms have been implemented securely, in ways that do not threaten system, network, or application availability. A small prototyping sketch appears after the list. 

  5. Design for cryptographic agility. NIST’s initial PQC standards are widely regarded as the best available defense against quantum attacks on public key cryptography. Specifying “available” is important for two reasons. First, it is possible that future advances could one day render one of these algorithms insecure. Second, NIST has announced that it plans to standardize additional PQC algorithms in the future. Those algorithms will be designed to provide a broader range of options for different use cases as agencies continue to navigate the performance tradeoffs associated with this new class of cryptography, and to increase the diversity of underlying computational assumptions within NIST’s PQC suite to mitigate the risk of future quantum threats. Cryptographic agility refers to an institution’s capacity to manage its cryptography today and respond to changes that may be introduced in the future. The transition to PQC will be challenging, but it also gives agencies an opportunity to increase the agility with which they can navigate cryptographic changes that may be required down the road. A sketch of one agility design pattern appears after the list. 
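
The sketches that follow are illustrative starting points for steps 2 through 5, not production code. For step 2, automated discovery of quantum-vulnerable public keys is one way to seed an inventory, beginning with what is easiest to observe, such as TLS endpoints. This minimal sketch assumes Python with the pyca/cryptography library installed; the hostnames are placeholders, and a full inventory would also need to cover code signing, VPNs, PKI, stored data, and vendor products.

```python
# Minimal sketch: inventory public-key algorithms on TLS endpoints.
# Assumes the pyca/cryptography package is installed; hostnames are placeholders.
import ssl

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

ENDPOINTS = [("intranet.example.gov", 443), ("mail.example.gov", 443)]  # placeholders

def describe_public_key(cert: x509.Certificate) -> str:
    """Return a human-readable label for the certificate's public-key algorithm."""
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        return f"RSA-{key.key_size} (quantum-vulnerable)"
    if isinstance(key, ec.EllipticCurvePublicKey):
        return f"ECC {key.curve.name} (quantum-vulnerable)"
    return type(key).__name__  # other classical public-key types, e.g. Ed25519

for host, port in ENDPOINTS:
    try:
        pem = ssl.get_server_certificate((host, port))
        cert = x509.load_pem_x509_certificate(pem.encode())
        print(f"{host}:{port} -> {describe_public_key(cert)}")
    except OSError as err:
        print(f"{host}:{port} -> unreachable ({err})")
```

Even this narrow slice of visibility helps validate the inventories called for by NSM-10, the Quantum Computing Cybersecurity Preparedness Act, and OMB M-23-02.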
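
For step 3, the “Hold Now, Decrypt Later” threat can be turned into a simple triage rule: if the years data must remain confidential, plus the years needed to migrate the system protecting it, reach past the point at which a cryptographically relevant quantum computer could plausibly exist, that asset belongs at the front of the queue. The sketch below is purely illustrative; every asset name and timeline figure is a hypothetical assumption, not an estimate.

```python
# Illustrative triage for "Hold Now, Decrypt Later" exposure.
# All assets, shelf lives, and timeline figures below are hypothetical assumptions.
from dataclasses import dataclass

ASSUMED_YEARS_TO_QUANTUM_THREAT = 10  # placeholder planning horizon, not a prediction

@dataclass
class Asset:
    name: str
    confidentiality_years: int  # how long the data must remain secret
    migration_years: int        # how long a PQC migration of this system would take

ASSETS = [
    Asset("benefits-records-db", confidentiality_years=25, migration_years=3),
    Asset("public-web-frontend", confidentiality_years=1, migration_years=1),
    Asset("inter-agency-vpn", confidentiality_years=10, migration_years=2),
]

for asset in ASSETS:
    exposure = asset.confidentiality_years + asset.migration_years
    at_risk = exposure >= ASSUMED_YEARS_TO_QUANTUM_THREAT
    print(f"{asset.name}: exposure window {exposure} years -> "
          f"{'prioritize for PQC now' if at_risk else 'schedule in later waves'}")
```

Real prioritization would also weigh asset criticality and exposure paths, but even this coarse rule separates long-lived secrets from short-lived ones.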
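
For step 4, prototyping can start with a single key establishment: run one of the standardized algorithms end to end and measure the key sizes, ciphertext sizes, and timing that drive the latency and bandwidth concerns above. This sketch assumes the open-source liboqs-python bindings (the oqs package) are installed with ML-KEM enabled; the algorithm identifier is an assumption to verify against your build, since older liboqs versions use names such as "Kyber768".

```python
# Minimal KEM prototype sketch using the open-source liboqs-python bindings.
# Assumes the `oqs` package is installed with ML-KEM enabled; the algorithm
# identifier may differ by liboqs version (older builds use "Kyber768").
import time

import oqs

ALGORITHM = "ML-KEM-768"  # assumption: check oqs.get_enabled_kem_mechanisms()

with oqs.KeyEncapsulation(ALGORITHM) as receiver, oqs.KeyEncapsulation(ALGORITHM) as sender:
    start = time.perf_counter()
    public_key = receiver.generate_keypair()
    ciphertext, sender_secret = sender.encap_secret(public_key)
    receiver_secret = receiver.decap_secret(ciphertext)
    elapsed_ms = (time.perf_counter() - start) * 1000

    assert sender_secret == receiver_secret  # both sides derive the same shared secret
    print(f"{ALGORITHM}: public key {len(public_key)} bytes, "
          f"ciphertext {len(ciphertext)} bytes, round trip {elapsed_ms:.2f} ms")
```

Key and ciphertext sizes on the order of a kilobyte, versus tens of bytes for common elliptic-curve exchanges, are exactly the kind of difference to surface in prototypes before production rollout.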
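
For step 5, cryptographic agility is largely an architecture question: applications should request a capability, such as key establishment or digital signatures, and let policy decide which algorithm fulfills it, so a future algorithm change is a configuration update rather than a rewrite. The sketch below is a non-cryptographic illustration of that pattern; the interface, registry, and provider names are invented for the example, and real providers would wrap vetted classical or PQC library implementations.

```python
# Minimal sketch of a cryptographic-agility pattern: algorithm choice lives in
# configuration and a registry, not in application code. The interface and
# provider names are invented for illustration; real providers would wrap
# vetted library implementations rather than the placeholder shown here.
from abc import ABC, abstractmethod

class KeyEstablishment(ABC):
    """Capability interface the application codes against."""
    @abstractmethod
    def establish(self, peer: str) -> bytes: ...

class PlaceholderKem(KeyEstablishment):
    """Stand-in provider; a real one would call a classical or PQC library."""
    def __init__(self, algorithm: str):
        self.algorithm = algorithm
    def establish(self, peer: str) -> bytes:
        print(f"[demo] establishing key with {peer} using {self.algorithm}")
        return b"\x00" * 32  # placeholder secret, not real cryptography

# Registry of available providers; adding a new algorithm means adding an entry.
PROVIDERS = {
    "classical-ecdh": lambda: PlaceholderKem("ECDH P-256"),
    "pqc-ml-kem": lambda: PlaceholderKem("ML-KEM-768"),
}

# Policy decides the algorithm; swapping it requires no application code change.
POLICY = {"key-establishment": "pqc-ml-kem"}

def get_key_establishment() -> KeyEstablishment:
    return PROVIDERS[POLICY["key-establishment"]]()

get_key_establishment().establish("partner-agency-gateway")
```

With this structure, retiring a weakened algorithm or adopting one of NIST's future additions becomes a policy and testing exercise rather than application surgery.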

PQC offers a critical defense against the impending quantum cyber threat, but its implementation won’t be easy. Cybersecurity leaders must begin working toward the transition objectives above as soon as possible. 

About the Authors

Jordan Kenyon

Chief Scientist, Booz Allen Hamilton

Jordan Kenyon is a chief scientist at Booz Allen Hamilton and a leader within the quantum portfolio. With over 10 years of experience in strategy and analytics, Jordan has led projects focused on advancing analytics, quantum, and cyber capabilities. She brings skills in strategy, project management, research design, statistics, performance management, organizational science, and emerging technologies. Jordan earned her PhD in Criminology and holds project management certifications from the Project Management Institute (PMP) and International Consortium for Agile (Agile Project and Delivery Management). 

Taylor Brady

Quantum Lead Scientist, Booz Allen Hamilton

Taylor Brady is a quantum lead scientist at Booz Allen Hamilton specializing in post-quantum cryptography, with a strong track record in scaling quantum business. She is experienced in presenting complex technology concepts, product and capability development, and solving open-ended problems, with expertise in AI, computer vision, statistical methods, and data visualization. 
