Keynote by Brian LaMacchia: “Post-Quantum Cryptography”

by Valerie Lopez of PRLinks for FIRST
Friday, June 16th, 2017

Brian LaMacchia Keynote

The FIRST Conference’s Keynote sessions concluded today with a presentation by Brian LaMacchia, Director of the Security & Cryptography group within Microsoft Research (MSR). In this department, his team conducts basic and applied research and advanced development. LaMacchia is also a founding member of the Microsoft Cryptography Review Board and consults on security and cryptography architectures, protocols and implementations across the company. Before moving into MSR in 2009, LaMacchia was the Architect for Cryptography in Windows Security, Development Lead for .NET Framework Security and Program Manager for core cryptography in Windows 2000. Prior to joining Microsoft, LaMacchia was a member of the Public Policy Research Group at AT&T Labs—Research.

In addition to his responsibilities at Microsoft, LaMacchia is an Adjunct Associate Professor in the School of Informatics and Computing at Indiana University-Bloomington and an Affiliate Faculty member of the Department of Computer Science and Engineering at the University of Washington. He also currently serves as Treasurer of the International Association for Cryptologic Research (IACR) and is Past President of the Board of Directors of the Seattle International Film Festival (SIFF). LaMacchia received S.B., S.M., and Ph.D. degrees in Electrical Engineering and Computer Science from MIT in 1990, 1991, and 1996, respectively.

LaMacchia’s talk centered on upcoming changes in cryptography that will be needed to accommodate quantum computing. In such a world, cryptography will need to be based on a new set of hard mathematical problems to provide a similar level of security.

He explained that in August 2015, the Information Assurance Directorate of the US National Security Agency (NSA) announced plans to begin a transition from the existing “Suite B” cryptography to quantum-resistant algorithms. Since Peter Shor of AT&T Bell Laboratories first published an efficient quantum algorithm for factoring in 1994, researchers have known that once a general-purpose quantum computer of sufficient size is built, commonly used public-key cryptographic algorithms will be broken.

This is of great concern in the research and cryptologic communities even though, as LaMacchia pointed out, no large-scale quantum computer exists today. However, recent progress in the physics and engineering of quantum computation is changing assumptions about the feasibility of building a cryptographically relevant quantum computer. While there are still technical challenges to address, the best estimates today are that such a machine could become feasible in as little as 10 to 15 years, and LaMacchia noted that a well-funded organization could develop one even sooner.

“Given our experience with past cryptographic algorithm transitions, this time horizon means that we need to start today the process of identifying hard problems that are quantum resistant, developing efficient cryptographic algorithms based on those problems, standardizing these algorithms and deploying them broadly, and deprecating our existing public-key cryptosystems,” he explained.

Still, LaMacchia posed some important questions, such as how long it will take to migrate the entire Internet to post-quantum cryptography. “Some of today’s data and communications still need to be secure in 20 or more years. How long will it take to re-encrypt this data with post-quantum schemes?” he asked.

LaMacchia defined post-quantum cryptography as “cryptographic schemes that are believed to be secure even if a large quantum computer exists.” As his team began working in this area, LaMacchia realized that the research community had started just in time to develop a solution. Cryptographic teams are now hard at work on the National Institute of Standards and Technology (NIST) post-quantum “competition,” which began with a call for proposals in May 2016; submissions are due this November. According to the NIST timeline, LaMacchia hopes development and analysis will be completed by December of 2023.

Along with LaMacchia, researchers are focused on designing better post-quantum key-exchange and signature schemes, improving classical and quantum attacks, picking parameter sizes, and developing fast and secure implementations. LaMacchia’s team at Microsoft is working with both lattice-based and isogeny-based systems for post-quantum encryption.
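
To give a flavor of the lattice-based approach mentioned above, the sketch below shows a toy, Regev-style encryption of a single bit built on the Learning With Errors (LWE) problem. It is purely illustrative and not drawn from LaMacchia’s talk or Microsoft’s implementations; the tiny parameters n, m, and q are far too small to be secure and are chosen only so the example runs instantly.

```python
# Toy, insecure Regev-style LWE encryption of one bit. Illustrative only.
import numpy as np

n, m, q = 8, 16, 97          # toy dimensions and modulus (not secure)
rng = np.random.default_rng(0)

def keygen():
    s = rng.integers(0, q, n)          # secret vector
    A = rng.integers(0, q, (m, n))     # public random matrix
    e = rng.integers(-1, 2, m)         # small error terms in {-1, 0, 1}
    b = (A @ s + e) % q                # "noisy" public vector hides s
    return (A, b), s

def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, m)          # random 0/1 selection of LWE samples
    u = (r @ A) % q                    # first ciphertext component
    v = (r @ b + bit * (q // 2)) % q   # second component encodes the bit
    return u, v

def decrypt(sk, ct):
    u, v = ct
    d = (v - u @ sk) % q               # remove the secret's contribution; small noise remains
    return int(abs(d - q // 2) < q // 4)   # near q/2 means 1, near 0 means 0

pk, sk = keygen()
for bit in (0, 1):
    assert decrypt(sk, encrypt(pk, bit)) == bit
print("toy LWE example round-trips both bits")
```

The security of real lattice-based schemes rests on the difficulty of recovering the secret vector from the noisy public data, a problem that is believed to remain hard even for a quantum computer.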

The switch to post-quantum cryptography may bring to mind the Y2K controversy of the late 1990s, but according to LaMacchia, the change will not have that much impact on computer users, or even be particularly noticeable to them. Still, it is an urgent matter that the research community is taking very seriously and working on diligently.

“Quantum computers are coming, maybe not for a decade or more, but within the protection lifetime of data we are generating and encrypting today,” LaMacchia told the audience. “We need to start planning the transition to post-quantum cryptographic algorithms now.”

To prepare for the post-quantum transition, LaMacchia stated that all systems need cryptographic agility. “Hybrid solutions combining classical and post-quantum primitives look promising; they provide both traditional cryptographic guarantees as well as some post-quantum resistance,” he added.
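
As a rough illustration of that hybrid idea, the sketch below (not from the talk) combines a classical key-exchange secret with a post-quantum KEM secret by hashing them into a single session key, so an attacker would need to break both primitives to recover it. The function name derive_hybrid_key is hypothetical, and the two secrets are placeholder random bytes standing in for real ECDH and post-quantum KEM outputs.

```python
# Minimal sketch of hybrid key derivation; placeholder secrets, assumed helper name.
import hashlib
import os

def derive_hybrid_key(classical_secret: bytes, pq_secret: bytes, context: bytes) -> bytes:
    # Hash both shared secrets plus protocol context into one session key.
    return hashlib.sha256(classical_secret + pq_secret + context).digest()

classical_shared_secret = os.urandom(32)   # stand-in for a classical ECDH output
pq_shared_secret = os.urandom(32)          # stand-in for a post-quantum KEM output
session_key = derive_hybrid_key(classical_shared_secret, pq_shared_secret, b"handshake-demo")
print(session_key.hex())
```

Real deployments would use a proper key-derivation function and bind in handshake transcripts, but the principle is the same: the session key stays safe as long as either the classical or the post-quantum component holds up.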

LaMacchia also explained that there are practical engineering options currently available for deploying post-quantum technology, but it’s going to take a long time to update software stacks. “The easiest, quickest response will probably be to wrap everything that is not post-quantum in a post-quantum VPN or a similar tunnel,” he said.

LaMacchia concluded his keynote by sharing some post-quantum open-source references with the audience.