
How NIST Is Securing The Quantum Era

Quantum computers powerful enough to break the strongest classical encryption are at least a decade away, but the time to develop quantum-safe encryption is now. In this opinion piece, Thomas Pöppelmann, a Senior Staff Engineer for Security Architecture and Cryptography Research at Infineon Technologies, talks about the steps NIST and companies like Infineon are taking to make that happen.


In the last decade, quantum computing has moved from the primarily theoretical to its first practical applications, with small, low-qubit-count systems now offered by multiple organizations. As these systems increase in capability, it is only a matter of time before the most commonly used cryptographic algorithms that secure digital systems become vulnerable to attack. In response, industry, academia and standards-setting institutions are developing the concepts for Post-Quantum Cryptography (PQC).

Thomas Pöppelmann is a Senior Staff Engineer at Infineon.

In 2016, the U.S. National Institute of Standards and Technology (NIST) began a multi-year process to create a framework and standards for PQC. This process is expected to enter a third phase of technical evaluation as early as June 2020, moving toward a target of releasing draft standards in the 2022–2024 timeframe. This article summarizes the NIST process to date and provides an overview of Infineon’s efforts related to the development of effective chip-based implementations of PQC techniques.


Quantum Algorithm Development 

It has been recognized since the mid-1990s that the most commonly used asymmetric algorithms for digital security systems will not be able to resist cryptographic attacks by sufficiently powerful quantum computers. A quantum factoring algorithm published by Peter Shor* in 1994 was shown to be capable of breaking both RSA- and ECC-based cryptosystems, given such a machine. The attack recovers private keys from the public keys of the asymmetric cryptography used in digital signatures and the Public Key Infrastructure (PKI) that secures smart cards, smart phones, computers and servers, industrial control systems and the emergent Internet of Things. Separate work by Lov Grover in 1996 identified an algorithm that speeds up brute-force key search, which led to the doubling of key lengths used in symmetric cryptography, from AES-128 to AES-256.
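
To make Shor’s approach concrete, the sketch below runs the classical skeleton of the algorithm: factoring N by finding the period r of a^x mod N, then deriving factors from gcds. This is a toy illustration only, with names of our own choosing; the period finding that a quantum computer performs exponentially faster is done here by brute force, which is feasible only for tiny numbers.

```python
# Classical skeleton of Shor's algorithm. The quantum speedup lies
# entirely in find_period(); everything else is ordinary arithmetic.
from math import gcd

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n), found by brute force."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int):
    assert gcd(a, n) == 1, "a must be coprime to n"
    r = find_period(a, n)
    if r % 2 == 1:
        raise ValueError("odd period; retry with another a")
    y = pow(a, r // 2, n)
    p, q = gcd(y - 1, n), gcd(y + 1, n)
    if p * q != n or 1 in (p, q):
        raise ValueError("trivial factors; retry with another a")
    return p, q

print(shor_classical(15, 7))   # (3, 5): the textbook example
```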

The challenge today is how to improve asymmetric cryptography before quantum computing capability advances to the point where attacks based on Shor’s algorithm or other advanced approaches become practical.

The use of quantum mechanical phenomena to accelerate processing was a fairly new idea when Shor first published his work. While the number of computing units, or qubits, in today’s quantum computers is still below 100, it is already possible for them to execute in minutes certain specialized calculations that a conventional computer might take thousands of years to complete.

Now is the time to develop harder-to-attack encryption schemes that can underlie new standards to secure digital transactions and communications.

Thomas Pöppelmann, Infineon Technologies

It has been estimated that breaking a long-key-length encryption scheme like RSA-2048 will require about 4,098 reliable, fault-tolerant qubits.[1] Reaching that level of stable computing power will likely take at least a decade, which means that now is the time to develop harder-to-attack encryption schemes that can underlie new standards to secure digital transactions and communications.


NIST’s Plan to Secure Quantum

NIST’s original 2016 call for proposals received input from 278 individual submitters from 25 countries spanning six continents. Researchers submitted schemes that addressed key exchange using lattice-, code- and isogeny-based approaches; other submissions, based on multivariate and symmetric cryptography, targeted signatures. In December 2017, NIST published the 69 submissions it considered “complete and proper” and invited comment from the research community. In April 2018, the “First PQC Standardization Conference” was convened in Ft. Lauderdale, Florida. More than 350 participants attended and, by the end of the round 1 process, the PQC forum community had logged more than 1,000 posts, including 300 official comments.

In January 2019, NIST announced the 26 most promising schemes from the original 69 submissions [2], which marked the start of round 2. Authors were invited to revise and/or merge these submissions before the next round of comment and evaluation, which began in April 2019. NIST noted that while its main selection criterion was cryptographic strength, considerations also included likely cost and performance and the complexity (or, more desirably, simplicity) of implementation. A corresponding pqc-hardware forum was opened in the same time frame, with recommendations from NIST that performance evaluations be run on general-purpose CPUs, ARM Cortex-M4-based microcontrollers and Artix-7 FPGAs.

Of the 26 round 2 submissions, nine are signature mechanisms and 17 address key encapsulation, as noted in tables 1 and 2. Re-publication of the submissions was followed in August 2019 by the Second PQC Standardization Conference in Santa Barbara, CA, which drew more than 250 participants. In addition to expert presentations, an industry panel held a focused discussion on the policy and timeline issues involved in introducing PQC into products, barriers to adoption and IP issues.

Significant technical issues discussed at the second conference included ongoing research into the level of confidence in different primitives and mathematical approaches, as well as the best methods to standardize key encapsulation schemes. Key encapsulation schemes generally follow either the Chosen-Plaintext Attack (CPA) or the Chosen-Ciphertext Attack (CCA) security model. The former usually provides security only when key pairs are not reused; CCA-secure schemes are more robust, but with the tradeoff of higher complexity. One area not specifically addressed is how hybrid schemes might be implemented, though there is general agreement that PQC is likely to be rolled out on top of existing cryptographic standards.
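
Whatever the underlying math, every candidate Key Encapsulation Mechanism exposes the same three operations: key generation, encapsulation and decapsulation. The sketch below shows that interface using a deliberately insecure stand-in scheme of our own devising, purely so the data flow is runnable; real candidates replace the function bodies with lattice- or code-based mathematics.

```python
# KEM interface sketch (keygen / encaps / decaps). The "scheme" below is
# a toy placeholder with no security whatsoever; only the shape of the
# API and the data flow reflect what NIST is standardizing.
import hashlib, os

def keygen():
    sk = os.urandom(32)                          # secret key
    pk = hashlib.sha256(b"pk" + sk).digest()     # stand-in public key
    return pk, sk

def encaps(pk):
    m = os.urandom(32)                           # sender's randomness
    ct = bytes(a ^ b for a, b in zip(m, pk))     # stand-in "ciphertext"
    ss = hashlib.sha256(m).digest()              # shared secret
    return ct, ss

def decaps(sk, ct):
    pk = hashlib.sha256(b"pk" + sk).digest()
    m = bytes(a ^ b for a, b in zip(ct, pk))
    return hashlib.sha256(m).digest()

pk, sk = keygen()
ct, ss_sender = encaps(pk)
assert decaps(sk, ct) == ss_sender
# CPA-style usage: generate a fresh key pair per session; never reuse it.
```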

SPHINCS+ and New Hope: Infineon’s Activities

Infineon has been active in contributing to the development of two of the 26 quantum-safe cryptographic schemes considered in round 2: New Hope, a key-exchange protocol, and SPHINCS+, a stateless hash-based signature scheme.

SPHINCS+ development is led by a European university team with additional industry participation. It was originally published as SPHINCS in 2015 and, incorporating feedback and comments, was updated before being submitted in NIST round 1. Improvements over time have reduced signature size to as little as 8 KB in non-optimized form at NIST security level 1 and about 30 KB at the highest NIST security level 5. To provide flexibility across varying parameters, there are three versions of SPHINCS+ that reach NIST security level 5 at different processing speeds by using different hash mechanisms (a toy sketch of the hash-based principle follows the list). These are:

  • SPHINCS+-SHA3 (using SHAKE256)
  • SPHINCS+-SHA2 (using SHA2)
  • SPHINCS+-Haraka (using the Haraka short-input hash function)
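
The principle behind all hash-based signatures is easiest to see in their simplest ancestor, the Lamport one-time signature, sketched below. This is an illustration of the idea only, not SPHINCS+ itself: SPHINCS+ composes many such hash-based one-time keys into a tree structure to obtain a stateless, many-time scheme.

```python
# Minimal Lamport one-time signature: security rests solely on the hash
# function. Each key pair may sign AT MOST one message.
import hashlib, os

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # two random secrets per message bit; public key is their hashes
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
    pk = [[H(x) for x in pair] for pair in sk]
    return sk, pk

def msg_bits(msg: bytes):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # reveal one secret per bit of the message digest
    return [sk[i][b] for i, b in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(msg_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"post-quantum")
assert verify(pk, b"post-quantum", sig)
```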

The New Hope scheme is the result of collaboration between McMaster University, Radboud University, several private grants, European Commission project funding and three corporations: ARM, Infineon and NXP. It includes CPA- and CCA-secure key encapsulation mechanisms that target NIST security levels 1 and 5. Its goal is to replace, or to complement in hybrid approaches, current Diffie-Hellman or Elliptic Curve Diffie-Hellman key-exchange mechanisms. New Hope is a lattice-based cryptosystem, meaning that its security rests on the difficulty of solving problems in n-dimensional structures. Table 3 summarizes how lattice systems differ from the other methodologies under consideration. The mathematical problems New Hope builds on are the Learning With Errors (LWE) and Ring Learning With Errors (RLWE) problems. These problems allow for easier construction of a cryptosystem than geometric lattice problems, resulting in smaller public key, ciphertext and signature lengths.
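
The following toy shows the flavor of the LWE problem New Hope’s security rests on: the public key hides the secret behind small random errors. The parameters here are tiny and deliberately insecure, and New Hope actually works with polynomials (Ring-LWE) for efficiency; this Regev-style single-bit scheme is illustration only.

```python
# Toy LWE encryption of one bit. Recovering s from (A, b) is the
# Learning With Errors problem; the small errors e make it hard.
import random

q, n, m = 257, 8, 32            # toy modulus, dimension, sample count

s = [random.randrange(q) for _ in range(n)]                  # secret
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]            # small errors
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
# (A, b) is the public key.

def encrypt(bit: int):
    subset = [i for i in range(m) if random.random() < 0.5]
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v) -> int:
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0   # round away the error

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
```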

Infineon has implemented New Hope on a commercially available contactless security microcontroller as a proof-of-concept that it can run on smart card systems with little memory and only the power supplied by the card reader. This smart card implementation also demonstrated a migration and hybridization path: the PQC scheme was accelerated using the available RSA/ECC co-processor for fast integer multiplication, with symmetric functions sped up by the AES accelerator (for pseudo-random number generation) and a SHA-256 co-processor (for hashing).
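
How can an RSA co-processor help with lattice arithmetic? One published trick, and a plausible reading of the acceleration described above, is Kronecker substitution: a polynomial multiplication is packed into a single big-integer multiplication, which is exactly the operation RSA hardware performs quickly. The sketch below is illustrative; a real implementation must size the packing gap against coefficient growth and still perform the ring’s modular reduction.

```python
# Kronecker substitution: multiply polynomials via one big-int multiply.
# Valid here because coefficients are nonnegative and each convolution
# sum stays below the 2**bits packing gap.

def poly_mul_kronecker(f, g, bits=32):
    """f, g are coefficient lists (low to high), small nonnegative ints."""
    B = 1 << bits
    F = sum(c << (bits * i) for i, c in enumerate(f))   # pack f
    G = sum(c << (bits * i) for i, c in enumerate(g))   # pack g
    P = F * G                                           # the RSA-sized multiply
    out = []
    for _ in range(len(f) + len(g) - 1):
        out.append(P % B)                               # unpack coefficients
        P >>= bits
    return out

def poly_mul_schoolbook(f, g):
    out = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += a * b
    return out

f, g = [3, 1, 4, 1], [2, 7, 1]
assert poly_mul_kronecker(f, g) == poly_mul_schoolbook(f, g)
```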

Infineon is also part of the research team for two PQC-related projects funded in part by the German government. The Aquorypt[3] consortium includes university and corporate researchers who are primarily investigating security for industrial embedded systems and smart cards. PQC4MED[4] focuses on securing embedded systems in medical products and is examining both the hardware and the associated software requirements to counter threats such as those posed by quantum computers.

The Next Steps

These last two projects are among several government- and EU-funded efforts to prepare for the PQC era. Additionally, the ETSI and ISO standardization bodies are running study groups specifically focused on PQC. For now, NIST is leading the selection of algorithms, reflecting in part the global community’s confidence in the model established by its earlier work in such areas as AES (the Advanced Encryption Standard) and the more recent SHA-3 (Secure Hash Algorithm 3). In both cases, researchers from Belgium were significant contributors to the final standard. The PQC process has activated an even larger group of contributors, and since a variety of approaches are likely to be standardized, international participation also is likely to be broad.

Round 3 of the NIST standardization process is expected to winnow the 26 round 2 proposals for signature mechanisms and key encapsulation down to a smaller set of schemes that are ready for standardization. Some of the algorithms may also be identified as too new or unstable to progress to standardization, though they may be recommended for further study and possible future action. With further refinement, the candidate group will move to the draft standard stage no earlier than 2022, in keeping with the elapsed time for each of the previous stages of the process.

As several decades of history have demonstrated, well-defined and rigorous standards for data security and authentication of computing and communications devices used in networked systems are critical to the digital systems that drive modern economies. In the coming era of quantum computing, achieving security quite literally will require a concurrent quantum leap in the performance and robustness of the next generation of global security standards. 


Table 1: Signatures In Round 2

Name of signature scheme | Mathematical Problem | Sub-Category
CRYSTALS-DILITHIUM | Lattice | Fiat-Shamir
qTESLA | Lattice | Fiat-Shamir
FALCON | Lattice | Hash-then-sign
MQDSS | Multivariate | Fiat-Shamir
LUOV | Multivariate | Unbalanced Oil and Vinegar (UOV)
Rainbow | Multivariate | Unbalanced Oil and Vinegar (UOV)
GeMSS | Multivariate | Hidden Field Equations (HFE)
Picnic | Symmetric | Zero Knowledge Proof (ZKP)
SPHINCS+ | Symmetric | Hash


Table 2: Key Encapsulation Mechanisms (KEM) In Round 2

Name of encryption scheme | Mathematical Problem | Sub-Category
CRYSTALS-KYBER | Lattice | Module Learning with Errors (MLWE)
SABER | Lattice | Module Learning with Rounding (MLWR)
FrodoKEM | Lattice | Learning with Errors (LWE)
Round5 | Lattice | Learning with Errors (LWE) / Ring Learning with Errors (RLWE)
LAC | Lattice | Ring Learning with Errors (RLWE)
New Hope | Lattice | Ring Learning with Errors (RLWE)
Three Bears | Lattice | Integer Module Learning with Errors (IMLWE)
NTRU | Lattice | NTRU
NTRU Prime | Lattice | NTRU
Classic McEliece | Codes | Goppa codes
NTS-KEM | Codes | Goppa codes
BIKE | Codes | Short Hamming codes
HQC | Codes | Short Hamming codes
LEDAcrypt | Codes | Short Hamming codes
ROLLO | Codes | Low-rank codes
RQC | Codes | Low-rank codes
SIKE | Isogeny | Supersingular isogeny


Table 3: Mathematical Problems

Lattice
Security is based on the hardness of certain mathematical problems in regular n-dimensional structures, e.g., finding short vectors in a lattice.
Codes
Security is based on the hardness of decoding certain random linear codes.
Multivariate
Security is based on the hardness of solving systems of multivariate polynomial equations.
Symmetric
Security is based on the hardness of breaking a symmetric cryptographic function (e.g., SHA-2 or AES).
Isogeny
Security is based on the hardness of finding a relation (an isogeny) between two elliptic curves.

(*) Correction: an earlier version of this article misspelled Peter Shor’s last name. The article has been updated to use the correct spelling. PFR 8/4/2020

[1] Martin Roetteler, Michael Naehrig, Krysta M. Svore and Kristin E. Lauter: “Quantum Resource Estimates for Computing Elliptic Curve Discrete Logarithms,” ASIACRYPT (2) 2017.

[2] “[pqc-forum] Announcement of 2nd Round Candidates,” Dustin Moody (NIST), via the pqc-forum mailing list (pqc-forum@list.nist.gov), Jan. 30, 2019.

[3] https://www.tum.de/nc/en/about-tum/news/press-releases/details/35882/

[4] https://www.forschung-it-sicherheit-kommunikationssysteme.de/projekte/pqc4med