# Get to Know Quantum Computing

## Learn what sets it apart, its future and use cases

8/16/2017 12:30:19 AM | By Patrick Stanard

I have attended a myriad of conferences this year and listened to just as many presentations. Some of these presentations have been “snorefests.” I must say the very best was on the topic of quantum computing. I am no expert on it, but I have learned enough to be “dangerous,” and I feel very strongly we all need a basic understanding of what it is and how it will affect our future in IT. To begin, throw out all you currently know about traditional data recording practices using bits and bytes.

**Quantum Computing Defined**

So, what exactly is quantum computing? According to the Wikipedia definition: “Quantum computing studies theoretical computation systems (quantum computers) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from binary digital electronic computers based on transistors.” Wow. What a mouthful. Let’s try to make some sense of all of this.

One of the most important things to understand is that quantum computing uses something called superposition, a new approach that frees us from traditional binary representation. Normally, we represent data in binary: a zero or a one, depending on whether the bit is off or on. With a quantum computer, those bits are really particles that can be in a state of superposition, which gives them a represented value of zero, one or both simultaneously. Knowing this, you can immediately see how this added state can affect processing speed. A quantum bit that can exist in superposition is known as a qubit. It sounds like the cubits Noah used to measure the ark, but I digress.
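To make superposition a little more concrete, here is a minimal sketch (in Python, which is my choice here, not the article's) that models a single qubit as a pair of amplitudes and simulates the collapse that happens on measurement. Everything in it is illustrative, not a real quantum API:

```python
import random

# Toy model of one qubit: a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. |alpha|^2 is the probability of reading 0,
# |beta|^2 the probability of reading 1 (the Born rule).

def measure(alpha, beta):
    """Simulate measurement: the superposition collapses to 0 or 1."""
    p_zero = abs(alpha) ** 2
    return 0 if random.random() < p_zero else 1

alpha = beta = 2 ** -0.5          # equal superposition: 1/sqrt(2) each
counts = [measure(alpha, beta) for _ in range(10_000)]
print(sum(counts) / len(counts))  # hovers around 0.5
```

Before measurement, the qubit genuinely carries both amplitudes; after measurement, only one outcome survives, which is the "collapse" the next paragraphs describe.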

In quantum computing’s microscopic world, things aren’t as clear cut as what we know in our everyday world. Particles like electrons or photons can take on states simultaneously, giving us the ability to represent data in ways that were previously unimaginable. The idea of mutually exclusive states is gone in favor of particles in superposition. This means the particles can be in many places all at once; additionally, in the case of photons, they can exhibit two different kinds of polarization, which gives even more states.

We never see this superposition of different states in ordinary life because it somehow disappears once a system is observed: When you measure the location of an electron or the polarization of a photon, all but one of the possible alternatives are eliminated and you will see just one. Nobody knows how that happens, but it does. (You can find out more by reading about Schrödinger’s equation.) Ah, quantum physics hard at work.

**The Future**

So now that I have you (and me) completely confused, let’s explore why quantum computing has such promise for our future.

Another concept in quantum computing that needs to be explored is “entanglement.” Think about describing a system of several qubits using classical information, like bits or numbers. It isn’t enough to string together the descriptions of the individual qubits; you also need to capture all the correlations between them. Once you increase the number of qubits, the number of correlations grows exponentially: for n qubits there are 2 to the nth power of them. You can imagine how this number explodes; in describing a system of a few hundred qubits (say, 300), you’d already need more numbers than there are atoms in the observable universe. The main idea is that you can’t hope to record all of the information contained in a system of just a few hundred qubits using normal bits and bytes, but a computer that runs on qubits could perform tasks a traditional computer could only hope to accomplish. And therein lies the true reason why physicists believe quantum computing is the key to the future of IT.
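The exponential growth described above can be checked with plain arithmetic. This short, illustrative Python snippet (not from the article) counts how many classical numbers a full description of n qubits would need:

```python
# A classical description of n entangled qubits needs 2**n amplitudes,
# so the cost grows exponentially with the number of qubits.

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80  # common order-of-magnitude estimate

for n in (10, 100, 300):
    digits = len(str(2 ** n)) - 1
    print(f"{n} qubits -> 2^{n} is about 10^{digits} numbers")

# Around 300 qubits, the description already outstrips the estimated
# number of atoms in the observable universe.
print(2 ** 300 > ATOMS_IN_OBSERVABLE_UNIVERSE)  # True
```

Note that 2^200 is "only" about 10^60, which is why the claim is usually stated for roughly 300 qubits.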

Physicists Richard Jozsa and David Deutsch explored in depth how quantum computing actually works. The key thing to understand is that a quantum computer can operate on all of the superposition states at one time, so all of the qubits are processed simultaneously. Jozsa and Deutsch were able to prove it’s possible to run an extra operation on the quantum state of the qubits to extract the simple piece of information you are after and put it into just the right place for you to be able to read it.

Think of it as a house of cards that will collapse as soon as you look at it. You might never be able to see it in its full glory, but if it was constructed in just the right way, you may at least be able to ascertain some information on what it looked like from the collapsed heap. And that's one reason why quantum computers are more powerful than classical ones. Classical systems have no choice but to evaluate all of the components individually. A quantum computer has the ability to evaluate all of the components at the same time.

**Use Cases**

Now that we know how quantum computing works, the next question is: how will it affect us in the real world?

In practice, quantum computing has the ability to greatly speed up data access. Think about searching for a particular phone number in a phone book (and yes, I used to use a phone book). Traditional systems are forced to read each entry or, with sorted data, use a halving process like binary search. This is time consuming. With classical computing on unsorted data, the system needs n operations, where n is the total number of entries in the phone book. With quantum computing, the system would need only about the square root of n operations. This may not seem like a big deal, but consider what happens when n is large: for a million entries, the quantum system would need only about one thousand operations to perform the same search.
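The square-root speedup described above comes from what is known as Grover’s search algorithm (the article doesn’t name it, so take that attribution as my addition). Here is a rough Python sketch comparing the operation counts; the phone-book data and names are made up for illustration:

```python
import math

def linear_search(entries, target):
    """Classical unstructured search: check entries one by one."""
    ops = 0
    for i, entry in enumerate(entries):
        ops += 1
        if entry == target:
            return i, ops
    return -1, ops

def grover_ops(n):
    """Grover's algorithm needs on the order of sqrt(n) lookups
    (order of magnitude only; constant factors are ignored)."""
    return math.isqrt(n)

phone_book = [f"555-{i:07d}" for i in range(1_000_000)]
idx, classical_ops = linear_search(phone_book, phone_book[-1])
print(classical_ops)                # 1000000 checks in the worst case
print(grover_ops(len(phone_book)))  # 1000
```

The classical worst case touches every one of the million entries, while the quantum estimate lands right at the one thousand operations mentioned above.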

Another solid use case for quantum computing will be in the areas of medicine, chemistry and biology. In producing new drugs, it’s very important to understand the molecular system that makes up the drug and how it behaves. The problem is that molecules are made up of particles, which are all subject to the world of quantum mechanics, so their behavior can be modeled via a quantum system much more easily than via traditional systems. Enter the quantum computer.

The world of cryptography and security will also reap benefits from quantum computers. Due to quantum physics, a quantum state changes when it is observed; understanding this, it’s possible to construct ways of determining whether a message was read. Thinking this through, users could send each other encryption keys, use them to encrypt and decrypt messages, and be notified if a key was intercepted. This process has actually been used in the past, but the massive speed quantum computing can attain would be a game changer in all aspects of security.
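The intercept-detection idea described above is the basis of quantum key distribution schemes such as BB84 (my label; the article doesn’t name a specific protocol). Here is a toy Python simulation, purely illustrative, showing how an eavesdropper’s measurements leave a tell-tale error rate behind:

```python
import random

def measure(value, basis, meas_basis):
    """Measure a qubit: a matching basis reads the bit faithfully; a
    mismatched basis yields a random result and re-prepares the qubit
    in the measuring basis."""
    if meas_basis == basis:
        return value, value, basis
    new_value = random.randint(0, 1)
    return new_value, new_value, meas_basis

def sifted_error_rate(n_bits, eavesdrop):
    """Fraction of errors among bits where Alice's and Bob's bases match."""
    errors = kept = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)          # Alice's key bit
        alice_basis = random.randint(0, 1)  # Alice's encoding basis
        value, basis = bit, alice_basis
        if eavesdrop:                       # Eve measures in a random basis
            _, value, basis = measure(value, basis, random.randint(0, 1))
        bob_basis = random.randint(0, 1)
        result, _, _ = measure(value, basis, bob_basis)
        if bob_basis == alice_basis:        # keep only matching-basis bits
            kept += 1
            errors += result != bit
    return errors / kept

print(sifted_error_rate(20_000, eavesdrop=False))  # 0.0
print(sifted_error_rate(20_000, eavesdrop=True))   # roughly 0.25
```

With no eavesdropper, Alice and Bob agree on every kept bit; an eavesdropper who guesses the wrong basis half the time introduces errors in about a quarter of the kept bits, and that spike is how the interception is detected.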

**Learn More**

IBM is offering the IBM Quantum Experience, an online lab where you can actually experiment with quantum computing technology. It’s fascinating and thought provoking. Additionally, IBM has recently built two new quantum processors that are proving this technology is a game changer for us all in the world of IT. IBM currently has a five-qubit system, will be moving to a 16-qubit system, and has a roadmap to build a 50-qubit system by 2020.

The world of quantum computing is erupting, and I think you can see this will be a disruptive technology we will all be experiencing in the very near future.

*Patrick Stanard is a z Systems Architect Manager for IBM. He’s a 34-year professional in the industry, spanning roles as a systems programmer, developer, manager, adjunct faculty member and director of operations. He has a Bachelor of Science in CIS from Saginaw Valley State University and an MBA from Michigan State University.*
