Scientists recently conducted a seemingly strange experiment that originated on a Chinese quantum-enabled satellite orbiting 300 miles above the earth. The researchers, based in China, shot a laser through a special crystal on the satellite Micius to create a pair of photons, or packets of light. They then beamed the photons to two ground bases, one to each base, 750 miles apart within mainland China.

The experiment achieved not one but two breakthroughs in quantum communications. One was the first time that humans have produced the phenomenon known as quantum entanglement in outer space. The other was the longest transmission of entangled photons ever; the previous distance record was approximately 86 miles.

Jian-Wei Pan, a physicist at the University of Science and Technology of China in Hefei, led the project. His team's results were published this month in the journal *Science*.

The researchers' breakthroughs could prove significant to the future development of unhackable communications.

Quantum entanglement involves creating a pair of particles, usually photons, that become fundamentally correlated. Once entangled, the particles mirror each other's behavior: any change in one occurs equally and simultaneously in the other, no matter how far apart the particles are in the universe. Additionally, for particles entangled in this way, the measured value of a physical property of one is always the exact opposite of the other. For instance, if the spin of one photon along a Y axis is up, then the spin of its entangled partner will be down. If the spin of one photon along an X axis is to the right, then the spin of its entangled partner will be to the left.
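The anti-correlation described above can be illustrated with a toy simulation. To be clear, this is a hypothetical classical sketch, not real quantum mechanics: it simply hard-codes the rule that measurements along the same axis always yield opposite results, and ignores the subtler statistics for mismatched axes (which is precisely where Bell's Theorem, discussed below, comes in).

```python
import random

def measure_singlet_pair(axis_a, axis_b):
    """Toy model of an entangled pair: each outcome is random, but
    measurements along the same axis are perfectly anti-correlated.
    Mismatched axes are modeled as independent, a deliberate
    oversimplification of the real quantum statistics."""
    outcome_a = random.choice(["up", "down"])
    if axis_a == axis_b:
        # Same axis: the partner's result is always the opposite.
        outcome_b = "down" if outcome_a == "up" else "up"
    else:
        # Different axes: treat the result as independent (simplified).
        outcome_b = random.choice(["up", "down"])
    return outcome_a, outcome_b

# Whenever both sides measure along the same axis, results are opposite.
for _ in range(5):
    a, b = measure_singlet_pair("Y", "Y")
    assert a != b
```

The hard-coded rule is what a real entangled pair exhibits; what no classical program like this can reproduce is the full pattern of correlations across all axis combinations.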

Albert Einstein first discussed the phenomenon of quantum entanglement in a 1935 paper coauthored with Boris Podolsky and Nathan Rosen, although they didn't use the term "entanglement" when referring to what eventually became known as the EPR (Einstein-Podolsky-Rosen) Paradox. The term originated later, in a letter, written in German, from Erwin Schrödinger to Einstein, in which Schrödinger used the German word *Verschränkung*, which he himself translated as "entanglement."

Physicists have long understood how to create a pair of entangled particles, although there's still no proven scientific explanation for how one particle "knows" what's happening to its entangled counterpart. Physicists have proposed two theories to explain the phenomenon.

One theory holds that there are embedded, hidden variables created at the pair's inception and shared indefinitely. Whenever a variable in one changes, the same variable changes in the other, resulting in correlated behavior. The problem is that Bell's Theorem, together with the experiments that have tested it, rules out this explanation, at least for local hidden variables.

A second theory holds that there's covert communication between the two particles. But a change in one particle results in an immediate change in the other regardless of the distance separating them, so any information passing between the particles would have to travel instantaneously, faster than the speed of light. This explanation troubled Einstein and Schrödinger, as it troubles contemporary physicists, because it violates the theory of relativity.

Despite the absence of a scientific explanation, Einstein and Schrödinger recognized the significance of quantum entanglement, which Schrödinger summarized by writing, "I would not call [entanglement] *one* but rather *the* characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought."

In a 1947 letter to Max Born, Einstein called the phenomenon "spukhafte Fernwirkung," or "spooky action at a distance."

Although quantum entanglement is still not fully understood, physicists continue to discover new ways to apply the phenomenon, with potentially huge implications for information security in the emerging age of quantum computing.

Security experts view quantum cryptography as a potential solution to the emerging threat that quantum computing poses to traditional methods of asymmetric cryptography, also known as public-key cryptography (PKC). The most common asymmetric cryptosystem is RSA, named after its developers Ronald Rivest, Adi Shamir and Leonard Adleman. PKC has been widely implemented and is presently critical to a safe and secure internet, with applications ranging from email to eCommerce.

Current symmetric encryption algorithms, such as the Advanced Encryption Standard (AES), are thought to be secure against quantum computers for the foreseeable future. While some techniques, such as Grover's algorithm, increase the efficiency of breaking symmetric ciphers, increasing the key length preserves security.
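The arithmetic behind that fix is simple. The sketch below is a back-of-the-envelope illustration, not a formal security analysis: Grover's algorithm searches N possibilities in roughly the square root of N steps, so a k-bit key offers only about k/2 bits of security against a quantum brute-force attack, and doubling the key length restores the original margin.

```python
# Illustrative only: rough effective strength of a symmetric key under
# Grover's algorithm, which searches 2**k keys in ~2**(k/2) quantum steps.
for key_bits in (128, 192, 256):
    quantum_bits = key_bits // 2  # approximate strength vs. a quantum attacker
    print(f"{key_bits}-bit key -> ~{quantum_bits}-bit security under Grover")
```

By this rough measure, a 256-bit AES key retains about 128 bits of security even against a quantum adversary, which is why lengthening symmetric keys is considered an adequate response.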

Lengthening symmetric keys is viable because symmetric systems operate faster than asymmetric systems. For instance, the symmetric algorithm Data Encryption Standard (now obsolete) is 100 times faster than RSA when using software and up to 10,000 times faster when using hardware.

Asymmetric cryptography arose in the 1970s as a solution to a common problem in symmetric cryptography. In symmetric cryptography, the same key (a mathematical value entered into a cryptographic algorithm) encrypts and decrypts communications. The challenge becomes how to exchange the key securely between a sender and a receiver – say, George and Jane – over distance or via insecure channels without a third party – say, Elroy – intercepting the key and reading private communications. (Fifth Domain reached out to Alice, Bob and Eve for this example, but they were presently engaged in examples by every other writer on cryptography.)

Asymmetric cryptography solved this problem with ingenious mathematics. Asymmetric cryptosystems create a set of keys – one public and one private – for each communicator, George and Jane. The keys are different yet mathematically related. George's private key decrypts all his communications, and Jane's private key decrypts all her communications. Jane uses George's public key to encrypt the communications she sends to him, and George uses Jane's public key to encrypt the communications he sends to her. If George and Jane safeguard their respective private keys, then no one else can read communications encrypted via their public keys. In this way, George and Jane are free to widely share and distribute their public keys, which can't be used to decrypt their communications.
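The public/private key relationship described above can be seen in a textbook-RSA sketch. The primes and message here are hypothetical toy values chosen for readability; real RSA uses primes hundreds of digits long, plus padding schemes omitted entirely here.

```python
# Toy textbook RSA (illustration only; never use for real security).
p, q = 61, 53                  # George's secret primes
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # 3120, used to derive the private exponent
e = 17                         # public exponent; (e, n) is George's public key
d = pow(e, -1, phi)            # private exponent; (d, n) is George's private key

message = 65
ciphertext = pow(message, e, n)    # Jane encrypts with George's PUBLIC key
recovered = pow(ciphertext, d, n)  # only George's PRIVATE key decrypts it
assert recovered == message
```

Note the asymmetry: knowing (e, n) lets Jane encrypt, but decrypting requires d, which in turn requires knowing the secret primes behind n.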

Current PKC relies on complex mathematical problems to generate keys. The most frequently used today include the integer factorization problem, the elliptic curve discrete logarithm problem and the discrete logarithm problem. The trouble is that each of these mathematical problems can be solved by powerful quantum computers using methods such as Shor's algorithm and adiabatic quantum computation. Such techniques jeopardize the security of traditional PKC in the future.

For instance, cryptosystems based on the integer factorization problem protect information by deriving keys from the product of two large prime numbers. Cracking the encryption requires factoring that product back into its primes. Traditional computers don't have the processing power to factor such large numbers efficiently. However, specialized algorithms, such as the Quadratic Sieve and the General Number Field Sieve, novel approaches and the increasing power and scalability of quantum computers are making it possible to factor larger and larger numbers, which will eventually render traditional mathematics-based PKC weaker, if not obsolete.
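Why small keys fail is easy to demonstrate. The sketch below brute-forces a toy modulus (3233, the product of the hypothetical primes 61 and 53 used for illustration) by trial division in an instant; the same approach against a real 2048-bit modulus would take far longer than the age of the universe, which is the gap Shor's algorithm threatens to close.

```python
def trial_division(n):
    """Factor n by checking every candidate factor up to sqrt(n).
    Instant for toy moduli, hopeless for real 2048-bit ones."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

p, q = trial_division(3233)
assert p * q == 3233  # the "secret" primes (53 and 61) fall out immediately
```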

A connected, though separate, problem with current PKC relates to methods for generating random numbers. Any mathematically secure cryptosystem applies a high degree of randomness to several tasks, notably key generation, making keys improbable for adversaries to guess. Some in the security community have become wary of algorithms for generating random numbers, particularly several standardized by the National Institute of Standards and Technology (NIST). Experts have speculated that the NSA may have influenced some of NIST's approved random-number generators, essentially allowing for encryption backdoors. NIST has since rescinded some recommendations, including the Dual_EC_DRBG generator at the center of the controversy.

There are several potential solutions to problems with current PKC systems. The first is simply to lengthen keys. The challenge is that increasing asymmetric key length requires significant computational power, which slows processing and drains battery life on mobile devices. At some point, lengthening keys to thwart quantum brute-forcing will inhibit standard computing.

A second potential source of a solution is the field of post-quantum cryptography, which involves developing different mathematics-based systems that can better withstand the brute-force power of quantum computers and the algorithms that increasingly make breaking existing cryptosystems efficient. Post-quantum cryptographic solutions include the McEliece cryptosystem – named after Robert McEliece's work in 1978 – and lattice-based systems, such as NTRUEncrypt.

A third potential solution is the field of quantum cryptography. Quantum cryptography moves away from mathematically derived keys altogether and instead uses quantum mechanical properties as the basis for keys. Quantum mechanical properties include, for instance, various measures of the state of entangled particles.

Quantum cryptography relies on a few principles of physics to ensure security. The first is called the observer effect, which states that mere observation, even passive, changes quantum phenomena. The observer effect is sometimes confused with Heisenberg's Uncertainty Principle, but the two are related yet distinct.

The observer effect is significant to quantum cryptography because the mere act of observing or measuring a quantum phenomenon – say, a property of one of a pair of entangled particles – inevitably disturbs it. Such a disturbance would alert users to eavesdropping.

The second principle is called complementarity. The principle of complementarity reflects the fact that certain properties of quantum objects exist in pairs, and the members of a pair cannot both be observed or measured precisely at the same time. The Heisenberg Uncertainty Principle quantifies the principle of complementarity.

The classic example of complementarity is a photon's wave-particle duality. A photon can be observed and measured as a particle, with a definite position, or as a wave, but not as both a particle and a wave simultaneously. The more precisely one property is observed and measured, the less that can be known about the other, and vice versa. Other complementary pairs include energy and time, spin along different axes, and entanglement and coherence.

The third principle of physics that ensures the security of quantum cryptography is the no-cloning theorem, which states that it's impossible to make a perfect copy of an unknown quantum state.

In summary, if it's impossible to observe a quantum phenomenon without changing it (observer effect), and it's impossible to measure certain physical properties of the phenomenon simultaneously (principle of complementarity), then adversaries cannot know the precise state of a phenomenon – say, entangled photons – which makes copying it impossible (no-cloning theorem).

Perhaps the most well-known application of quantum cryptography is quantum key distribution (QKD), which cryptographers have theorized in various forms for nearly 50 years.

Stephen Wiesner first proposed QKD in the early 1970s based on a method using quantum conjugates. Wiesner's idea relies on the use of observable conjugate variables, which is the term Niels Bohr used to describe the dualities underlying the principle of complementarity.

As a conjugate observable, Wiesner proposed the linear and circular polarization of light. Either the linear or the circular polarization of light, but not both, could be received and decoded by parties exchanging secure communication.

In the mid-1980s, Charles H. Bennett at IBM and Gilles Brassard of the Université de Montréal built on Wiesner's work to develop BB84, which was the first quantum cryptography protocol. Commercial applications of QKD today use BB84.

Then, in 1991, Oxford University quantum physicist Artur Ekert proposed what became known as the E91 protocol, a method of QKD based on the quantum entanglement of photons.

One method for creating entangled photons involves shooting a laser through a special crystal, as Pan did in his experiment. A complementary property such as the polarization of light is then measured and encoded as 0s and 1s to form the basis for a key. (A qubit differs from a digital bit in that, before measurement, it can exist in a superposition of both states; each measurement, however, yields a definite 0 or 1, so for simplicity this example uses a strictly digital representation of the photon's quantum property.) This method of generating keys contrasts with using large prime numbers or other mathematics-based systems, as commonly applied in asymmetric algorithms today.

For example, suppose George and Jane want to use QKD via quantum entanglement to exchange secure communications. One potential application might go as follows. They first create a pair of entangled photons, with each of them possessing one of the photons comprising the pair. (The fact that George and Jane are likely more than 86 miles apart highlights the importance of distance achieved in Pan's experiment.) Jane then randomly measures the polarization (e.g., vertical, horizontal, diagonal) of her photon. George randomly measures the polarization of his photon. They then publicly share how they each measured their respective photons, but not the results of their measurements.

Since the entangled photons' results are correlated whenever George and Jane happen to measure the same way, they keep the results from those matching rounds and discard the rest. They then sacrifice a portion of the kept results, comparing them publicly; any discrepancies would indicate that Elroy had been eavesdropping, because measuring the photons in transit would have disturbed them. If the check passes, George and Jane convert the remaining measurement results into a string of bits that forms their cryptographic key for secure communication.
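The sifting step that George and Jane perform can be sketched in a few lines. This is a hypothetical classical simulation, not a physics model: it hard-codes the rule that measurements made in matching bases yield correlated bits, and discards the rest. The function name and parameters are illustrative inventions.

```python
import random

BASES = ("rectilinear", "diagonal")

def run_sifting(n_pairs=20, seed=1):
    """Toy entanglement-based key sifting (loosely E91-style).
    Each side measures its photon in a randomly chosen basis; only
    the rounds where the bases happen to match contribute key bits."""
    rng = random.Random(seed)
    george_key, jane_key = [], []
    for _ in range(n_pairs):
        basis_g = rng.choice(BASES)   # George's random measurement choice
        basis_j = rng.choice(BASES)   # Jane's random measurement choice
        bit = rng.randint(0, 1)       # correlated outcome for this pair
        if basis_g == basis_j:        # bases are compared over a public channel
            george_key.append(bit)
            jane_key.append(bit)
    return george_key, jane_key

george_key, jane_key = run_sifting()
assert george_key == jane_key  # matching-basis rounds yield a shared key
```

On average only about half the rounds survive sifting, which is one reason far more photon pairs must be generated than key bits produced, and why the low pair-survival rates in the experiments described below matter.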

The laws of physics (the observer effect, the principle of complementarity and the no-cloning theorem) ensure that Elroy cannot copy their newly generated cryptographic key.

Ekert's idea was first successfully applied to secure a money transfer in Austria in 2004. Nine years later, the Ohio nonprofit Battelle Memorial Institute brought QKD to the U.S. for the first time.

Another significant breakthrough came in 2015, when Ronald Hanson and his team at the Delft University of Technology in the Netherlands entangled electrons 1.3 kilometers apart, which was at that time a record distance.

Despite recent successes, QKD via quantum entanglement still faces limitations in governmental and commercial applications. For one, particles can be transmitted only a limited distance on earth. The limitation results from the requirement to transmit particles using optical fiber, which degrades particles the farther they travel.

Pan's team achieved the record-breaking distance in part because space is a vacuum, so there's nothing in space to degrade photons while they travel. Pan's team is already planning the next experiment, which will attempt to entangle photons in outer space and then beam them to ground stations located across a continent.

Another challenge has been the limited quantity of entangled particles that can be generated and then successfully transmitted. Hanson's 2015 experiment, for instance, involved just 245 trials, "which is statistically low and not commercially viable," the researchers noted. Pan's team generated 10 million photon pairs, but only one pair made it to the destination ground stations.

If scientists can overcome these challenges to achieve a viable, enterprise-grade application of QKD based on quantum entanglement, it will usher in a new era of perfect secrecy for data, regardless of the advancements in quantum computing.
