A funny comic I once read said that if a technology is "about a decade away," it remains "about a decade away" indefinitely. If you search your memory, you might find this to be true for a cure for cancer, practical applications of graphene or the elusive quantum computer. Nevertheless, leaps eventually happen. Diseases are cured. Cars drive themselves. Man lands on the moon. The hard part, from the outside, is differentiating true progress towards a practical application from what amounts to a university press office hyping a proof-of-concept paper.
I will be honest with you: As a computer scientist, my focus is on machine learning and cloud computing. I am far from an expert in quantum algorithms, which is currently a very theoretical branch of computer science. I can understand the concepts of quantum algorithm papers but I wouldn't know the first thing about the concrete mechanics of engineering one, so excuse my superficial explanations of the physics.
The quantum computing promise
I believe many readers will have a relatively vague idea of quantum physics that might include some notions about uncertainty and dead cats. Many of you might also have read about quantum computing here and there, framed as the end of current encryption mechanisms and other seemingly magical promises.
To me, the problem with quantum computing is that there is a truly large gap between a superficial, vague explanation and an understandable one, which requires a decent amount of math. What I think we can do in the length of a Seeking Alpha article is motivate some principles and work through the gist of an example quantum algorithm to get a feeling for the possibilities. So, without further ado: quantum computing relies on various quantum mechanical effects. The key idea is to encode information in qubits instead of binary bits, thus exploiting the first of these effects, quantum superposition.
Imagine a classical bit. If you want to make it sound fancy, its state is a linear combination of the amplitudes of being in binary state 0 or in binary state 1:

x = α·0 + β·1, with (α, β) either (1, 0) or (0, 1)

Alpha and beta are the weights of being in state 0 or 1, which for a classical bit can only be 0 or 1 exclusively. This has a direct connection to electronics (voltage levels). In contrast, a qubit is a two-state quantum system where alpha and beta can be complex-valued amplitudes, so the total state is a superposition of both parts; physically, this can be realized through up- and down-spin in an electron or horizontal and vertical polarisation in a photon (the bra-kets in this equation denote quantum states):

|ψ⟩ = α|0⟩ + β|1⟩, with |α|² + |β|² = 1

Measuring the qubit yields 0 with probability |α|² and 1 with probability |β|².
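To make the amplitude picture concrete, here is a minimal sketch in plain Python (no quantum hardware or library involved, just the bookkeeping): a qubit is a pair of amplitudes, and measurement probabilities are their squared magnitudes.

```python
import math

def measurement_probs(alpha, beta):
    """Return (P(0), P(1)) for the state alpha|0> + beta|1>.
    The probabilities are the squared magnitudes of the amplitudes."""
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "state must be normalized"
    return p0, p1

# A classical bit is the special case where one amplitude is 1:
print(measurement_probs(1, 0))   # -> (1, 0): deterministic outcome 0

# An equal superposition measures 0 or 1 with probability ~0.5 each:
h = 1 / math.sqrt(2)
print(measurement_probs(h, h))
```

This also illustrates the normalization constraint mentioned above: the squared magnitudes must sum to one, because measuring must yield some outcome.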
This can be visualized through the Bloch sphere, where points on the surface represent possible superpositions and the north and south poles represent the two binary states.
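The standard parametrization behind that sphere picture fits in a few lines of Python: a point given by angles (θ, φ) on the sphere maps to amplitudes α = cos(θ/2) and β = e^(iφ)·sin(θ/2). This is a sketch of the math, not of any device.

```python
import cmath
import math

def bloch_state(theta, phi):
    """Amplitudes (alpha, beta) for the point (theta, phi) on the
    Bloch sphere: alpha = cos(theta/2), beta = e^{i*phi} * sin(theta/2)."""
    alpha = math.cos(theta / 2)
    beta = cmath.exp(1j * phi) * math.sin(theta / 2)
    return alpha, beta

# North pole (theta = 0) is the classical 0 state; the south pole
# (theta = pi) is the classical 1 state; the equator holds equal
# superpositions. Any such state is automatically normalized.
print(bloch_state(0, 0))
print(bloch_state(math.pi / 2, 0))
```

Note that two real angles suffice even though alpha and beta are complex, because the global phase of the state is physically irrelevant.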
The other quantum mechanical effect often talked about is entanglement, which describes quantum states in which the individual particles of a system cannot be described independently of one another. I think the puzzle of quantum computing is how we go from this abstract description to massive computing power, and what the obstacles to implementing it are.
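Entanglement can also be illustrated with a toy simulation in plain Python. The Bell state (|00⟩ + |11⟩)/√2 is the textbook example: neither qubit has a definite state of its own, yet the two measurement outcomes are perfectly correlated.

```python
import math
import random

# A two-qubit state has one amplitude per basis state |00>, |01>, |10>, |11>.
# The Bell state puts all its weight on |00> and |11>.
bell = {"00": 1 / math.sqrt(2), "01": 0.0, "10": 0.0, "11": 1 / math.sqrt(2)}

def measure(state):
    """Sample one joint measurement outcome; each outcome's probability
    is the squared magnitude of its amplitude."""
    outcomes = list(state)
    weights = [abs(a) ** 2 for a in state.values()]
    return random.choices(outcomes, weights=weights)[0]

# Every sample comes out "00" or "11" -- the two qubits always agree,
# even though each individual qubit is 50/50 on its own:
print(set(measure(bell) for _ in range(1000)))
```

The point of the sketch: the correlation lives in the joint amplitudes, not in either qubit separately, which is exactly what "not independent" means above.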
First, quantum algorithms do not magically try out all solutions to a problem at the same time, quickly find some result and return it to you, as is sometimes portrayed in news coverage of the topic.
Quantum algorithms leverage structure in certain problems through quantum superposition. There is a growing body of quantum algorithms; a collection can be found here. In a way, these algorithms correspond to classical number theory and analysis before computers existed: they are either purely theoretical proofs or have been tested on prototype quantum computers with few qubits/quantum gates.
Let's look at the often-cited claim that quantum computers will break all encryption. This typically refers to the fact that many modern cryptographic methods rely on the prime factorization of large integers being a hard problem (e.g., 3 * 5 is the prime factorization of 15), meaning it is infeasible even for current supercomputers to break keys in a reasonable amount of time.
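To make "hard problem" concrete, here is the naive classical approach, trial division. It works instantly on 15, but the number of candidate divisors grows exponentially with the number of digits, which is what keeps classically attacked keys safe.

```python
def trial_division(n):
    """Factor n by trying divisors up to sqrt(n). The work grows
    roughly exponentially in the number of digits of n, which is
    why factoring large RSA moduli this way is hopeless."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(15))   # -> [3, 5]
```

Real factoring algorithms (like the general number field sieve) are much cleverer than this, but still superpolynomial; the gap to quantum algorithms is qualitative, not just a constant factor.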
Shor's algorithm shows how quantum states can be used to solve the factorization problem and thus break cryptographic algorithms relying on large primes. Note that the largest number factorized by a quantum computer (that was publicly announced) is many orders of magnitude smaller than the primes used in, for instance, RSA encryption (the basis of many mail encryption mechanisms). The crux of the matter is that a quantum system being in multiple states at the same time does not mean you can measure them all (see the collapse of the wave function for more). You have to somehow extract the information about the most probable state (the one that solves your problem) from the quantum system.
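To get a feeling for what the quantum part actually buys you, here is a purely classical toy sketch of the skeleton of Shor's algorithm. The expensive step, finding the period of a^x mod n, is done by brute force below; that is precisely the step a quantum computer accelerates. The cheap classical post-processing then turns the period into a factor.

```python
from math import gcd

def find_period(a, n):
    """Find the order r of a modulo n, i.e. the smallest r with
    a^r = 1 (mod n). Done naively like this, the cost explodes with
    the size of n -- this is the step Shor's algorithm speeds up via
    the quantum Fourier transform."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Shor's classical post-processing: from an even period r,
    gcd(a^(r/2) - 1, n) often yields a nontrivial factor of n."""
    r = find_period(a, n)
    if r % 2 != 0:
        return None  # unlucky choice of base a; one would retry with another
    f = gcd(pow(a, r // 2) - 1, n)
    return f if 1 < f < n else None

# Toy run on n = 15: the period of 7 mod 15 is 4, and
# gcd(7^2 - 1, 15) = gcd(48, 15) = 3, a factor of 15.
print(shor_classical(15, 7))   # -> 3
```

This is only the scaffolding, of course: the whole difficulty lies in extracting the period from a superposition without being able to read out every state, which is where the quantum Fourier transform comes in.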
Unfortunately, this leads the article to an impasse: it would take several thousand words to describe how this is tackled through the quantum Fourier transform. For a very accessible explanation of Shor's algorithm (requiring maybe undergraduate-level number theory), read this blogpost by a renowned researcher in the area.
IBM's cloud service and practical issues
IBM has now provided the world with a neat web dashboard for a 5-qubit quantum computer. Of course, the resources are quite limited, and the results of your program will be sent to you via email after your job has been scheduled. Nevertheless, this will allow anyone with a basic understanding of quantum computing, e.g. from an undergraduate course, to play around with quantum algorithms. As the hardware evolves, the research and development community will evolve with it and ready itself for a proliferation of quantum computing algorithms.
As I understand it, many practical issues remain. For instance, quantum computers need extremely cold temperatures because random thermal fluctuations disturb quantum states. Further, the sheer engineering of microscopic circuits or ion-trapping devices that map logical qubits to physical quantum states is very hard to scale, as quantum systems are incredibly susceptible to decoherence and thus require separate error-correction apparatus.
Apart from IBM's efforts, Google (NASDAQ:GOOG) (NASDAQ:GOOGL) is also making rapid progress in this area in collaboration with Canadian startup D-Wave (to nobody's surprise). Microsoft is very active behind the scenes and has released a simulator. IBM has now made a leap by giving a wider community access to an actual quantum computer, thus establishing itself as a future service provider in this area, albeit one right now only relevant to researchers.
Quantum computing will likely not bring in any needle-moving revenue for IBM or Google in the next decade. It will not turn around IBM's declining revenue. On the other hand, quantum computing is no longer indefinitely "about a decade away." Make no mistake: Just because we have heard about quantum computing for over a decade without anything happening does not mean it will not come to fruition eventually. There is now a path forward toward building a universal quantum computer. IBM's quantum cloud should be seen as a warm-up phase for the broader research and industry community for when this happens. As of now, this is more of a free call option for IBM shareholders.
If you enjoyed this article, scroll up and click on the "follow" button next to my name to see updates on my future articles on software, machine intelligence and cloud computing in your feed.
Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.
I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.