Quantum Computing and Cryptography: 04/21/15

Tuesday, April 21, 2015

Blog Post One - An Introduction to Quantum Computing and Cryptography

Before you read deeply into this blog, it is crucial that you understand the basic concept of quantum computing. The subject is vast, and many of quantum computing's capabilities and applications are still being discovered, but this emerging field will likely reshape our way of life as research progresses.

In order to understand what "quantum" computing is, you must first understand how it differs from regular computing. The difference lies in the physical properties of the bits themselves: quantum computing exploits quantum-mechanical behavior at the atomic level to change how we store and read streams of bits. In traditional computing, these streams of bits translate directly into 1s (the bit is on) or 0s (the bit is off). We represent them by passing high (1) and low (0) voltage levels through ICs (Integrated Circuits). Computers read binary streams of data (sequences of 1s and 0s) and convert them into instructions and operations that perform useful tasks. In traditional computing, each bit holds a state of either 1 or 0, never both and never neither. This is not the case in quantum computing, which uses qubits. For a thorough explanation of this fundamental difference, I have provided a video below which I highly encourage all of you to watch:
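To make the bit/qubit contrast concrete, here is a minimal sketch of the standard textbook model: a qubit is described by two amplitudes whose squared magnitudes give the probability of reading 0 or 1 when it is measured. (This simulation is my own illustration, not something from the video.)

```python
import math
import random

# A classical bit is exactly 0 or 1. A qubit is described by two
# amplitudes (a, b) with |a|^2 + |b|^2 = 1; measuring it yields
# 0 with probability |a|^2 and 1 with probability |b|^2.

def measure(a, b):
    """Collapse a qubit with amplitudes (a, b) to a classical bit."""
    return 0 if random.random() < abs(a) ** 2 else 1

# An equal superposition: until measured, the qubit is "both" 0 and 1,
# and each measurement yields 0 or 1 with 50/50 odds.
a = b = 1 / math.sqrt(2)
samples = [measure(a, b) for _ in range(10000)]
print(sum(samples) / len(samples))  # close to 0.5
```

A classical bit corresponds to the special cases (a, b) = (1, 0) or (0, 1), where the measurement outcome is certain.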


Andrea Morello, the scientist from the University of New South Wales depicted in the video, states explicitly towards the end that quantum computing has very special applications: it is only useful in scenarios that can exploit quantum algorithms. Morello explains that "Quantum Computing is not a replacement for classical computers" and that quantum computers are not universally faster than normal computers. For typical tasks such as watching a video or creating a document, a normal computer would likely be faster. The key point Morello makes is this: quantum computers don't perform individual operations faster than normal computers, but the number of operations required to arrive at a result can be exponentially smaller.
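To give that scaling gap a rough sense of magnitude, here is a back-of-the-envelope sketch using integer factoring, the standard example of a task where a quantum algorithm (Shor's) needs vastly fewer operations than the best known classical one. The specific algorithm and the asymptotic formulas are my illustration, not something named in the video, and the numbers are coarse operation-count estimates, not runtimes.

```python
import math

bits = 2048                     # size of the number being factored
ln_n = bits * math.log(2)       # natural log of a 2048-bit number

# Best known classical algorithm (general number field sieve),
# sub-exponential in the number of bits, roughly:
classical_ops = math.exp(
    (64 / 9 * ln_n) ** (1 / 3) * math.log(ln_n) ** (2 / 3)
)

# Shor's quantum algorithm is polynomial, roughly (log N)^3:
quantum_ops = bits ** 3

print(f"classical ~ 10^{math.log10(classical_ops):.0f} operations")
print(f"quantum   ~ 10^{math.log10(quantum_ops):.0f} operations")
```

The point is not the exact figures but the shape of the curves: one grows sub-exponentially with the key size, the other polynomially.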

One niche where we can exploit this exponential reduction in the number of operations is cryptography. Cryptography is the field in which we use mathematical algorithms to encrypt and decrypt (encode and decode) information, i.e. sequences of bits. The purpose is to secure information so that only the people we wish to have the right/privilege to view it are able to do so, while others are not. An example which easily explains this concept is purchasing something from an online vendor using a credit card. Your credit card information is encrypted, the message is sent through the internet, and it is decrypted once it reaches the vendor's server computer so that they may use your credit card number. We encrypt the information so that if some other party on the network intercepted it, the data would be erroneous and unusable, since it would no longer be represented sensibly.
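The encrypt/send/decrypt round trip can be sketched with the simplest possible symmetric cipher, a one-time-pad style XOR. This is purely illustrative (real e-commerce uses protocols such as TLS, not this), but it shows how the same shared key both scrambles and recovers the bits:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

card_number = b"4111111111111111"             # example test card number
key = secrets.token_bytes(len(card_number))   # shared secret key

ciphertext = xor_bytes(card_number, key)  # what an eavesdropper would see
plaintext = xor_bytes(ciphertext, key)    # vendor decrypts with the key

assert plaintext == card_number
```

Without the key, the intercepted ciphertext is indistinguishable from random bytes; with it, decryption is just a second XOR.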

Now that we understand the basic concepts of quantum computing and the uses of cryptography, we can discuss why the concept of quantum cryptography is so important. I will go into detail in a later post, but here is some food for thought for now. Watch the following video on Heisenberg's uncertainty principle so you can better understand the physical randomness quantum cryptography offers:


Quantum cryptography uses the uncertainty inherent in measuring particles to generate randomness for quantum algorithms, and that randomness helps create encryption schemes which are virtually impossible to break. Heisenberg's uncertainty principle is the basis for this. “The principle bounds the uncertainties about the outcomes of two incompatible measurements, such as position and momentum, on atomic particles” (Berta, 2010). “It implies that one cannot predict the outcomes for both possible choices of measurement to arbitrary precision, even if information about the preparation of the particle is available in a classical memory” (Berta, 2010).
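A toy simulation can show why incompatible measurements matter for cryptography. In quantum key distribution (the BB84-style setup is my illustrative example here, not something from the quoted paper), a receiver who measures in the same basis the sender used recovers the bit exactly, while measuring in the incompatible basis yields a random outcome, exactly the unpredictability the uncertainty principle describes:

```python
import random

def receiver_measures(sent_bit, sender_basis, receiver_basis):
    """Simulate measuring a photon encoded in sender_basis."""
    if receiver_basis == sender_basis:
        return sent_bit            # compatible measurement: exact result
    return random.randint(0, 1)    # incompatible measurement: 50/50 outcome

n = 10000
correct = 0
for _ in range(n):
    bit = random.randint(0, 1)
    sender_basis = random.choice("+x")    # '+' rectilinear, 'x' diagonal
    receiver_basis = random.choice("+x")  # chosen independently at random
    if (receiver_basis == sender_basis
            and receiver_measures(bit, sender_basis, receiver_basis) == bit):
        correct += 1
print(correct / n)  # roughly 0.5: bases agree about half the time
```

An eavesdropper faces the same constraint: without knowing the sender's basis, any interception randomly disturbs about half the bits, which is what makes the scheme detectably tamper-evident.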



Reference:
Berta, M., Christandl, M., Colbeck, R., Renes, J. M., & Renner, R. (2010). The Uncertainty Principle in the Presence of Quantum Memory. Nature Physics, 6, 659–662. Macmillan Publishers.