There is a variety of post-quantum public-key cryptosystems. These are based on a few different intractability assumptions: multivariate polynomials, lattices, error-correcting codes, and supersingular isogenies.
Code is available for most of these proposals because NIST is currently running a PQCrypto Standardization CFP[1]. However, I strongly recommend against deploying your own post-quantum cryptosystem, for a few reasons:
1. As I've cited elsewhere in this thread, this paper notwithstanding, most leading researchers in the field are frankly bearish about the prospects of 2048-bit RSA being broken in the next 20 years or so.[2] This can obviously change, which leads me to the next few points of consideration.
2. Most post-quantum cryptosystems are not well-studied, relatively speaking: they simply haven't received the decades of academic scrutiny that classical systems have. It's still premature to say which of them will remain secure under that scrutiny over the next few years; many candidates were already dropped from consideration after Round 1 of the NIST standardization review.
3. Most (perhaps all) of the implementations are immature and not well-supported. The majority of them are proofs of concept for NIST, not production-ready code. Authors themselves will caution you against using them. We don't have a libnacl for post-quantum public-key cryptography right now, which means that you'd be substantially rolling your own interfaces to underlying primitive implementations. It's hard enough maintaining secure cryptography in production when everything has been done to keep you from footgunning yourself - you won't have such guardrails for post-quantum cryptosystems.
4. Unfortunately, all post-quantum cryptosystems are significantly less efficient than classical cryptosystems in time, in space, or both. As a general rule of thumb, lattice- and code-based cryptography tends to be on the faster side but with very large key requirements, while isogeny-based cryptography tends to be on the slower side with smaller keys. None matches classical systems on both dimensions at once.
You should wait until these cryptosystems have been proven out by academic and industrial research. Google began experimenting with lattice-based cryptography for TLS in Google Chrome in 2016[3]. Adam Langley has a nice writeup[4] which also covers a few performance concerns, and he has written a follow-up post about the next round of implementations they'll be experimenting with[5].
_______________________
1. https://csrc.nist.gov/Projects/Post-Quantum-Cryptography/Pos...
2. https://www.nap.edu/catalog/25196/quantum-computing-progress...
3. https://security.googleblog.com/2016/07/experimenting-with-p...
4. https://www.imperialviolet.org/2018/04/11/pqconftls.html
5. https://www.imperialviolet.org/2018/12/12/cecpq2.html