Last week I read an article in the New York Times about Lockheed Martin's D-Wave quantum computer. The article, "A Strange Computer Promises Great Speed" by Quentin Hardy, is summarized by the paper as: "Lockheed Martin Harnesses Quantum Technology." It left me wondering whether Lockheed Martin was getting into the computer business, and whether quantum computers have the potential to make today's computers look like typewriters.
Of course, today's computers will be outdated in a matter of time. The question remains: could a different form of computing render the current form obsolete? Once there was the floppy disk, then the 3.5-inch diskette, followed by Zip disks and CD-ROMs. Now, consumers can store information on flash drives or in cloud storage. Movies, for instance, have gone from film, to VHS tape, to LaserDisc, to DVD, to streaming video.
I asked Lockheed Martin (LMT), Intel (INTC) and D-Wave a few questions to try to get a better understanding of whether it is possible for quantum computers to replace today's computers. Before I get to the questions and answers, let's look at a few important points for perspective on Lockheed Martin's use of a D-Wave computer. I will also cover some basic concepts to give readers an idea of how quantum computers differ from today's computers.
D-Wave is a very small, private Canadian computer company. The CEO is a former Goldman Sachs (GS) Chief Technology Officer, and one of the founders and the current Chief Technology Officer, Dr. Geordie Rose, holds a PhD in theoretical physics. D-Wave was founded in 1999 and originally located at the University of British Columbia, where Dr. Rose and his colleagues studied.
After a few years the company developed an early form of quantum computer, the Orion, which made waves in the world of technology. Orion led Google's (GOOG) Technical Manager of Image Recognition to study the possible applications of quantum computing and quantum algorithms on "problems such as recognizing an object in an image or learning to make an optimal decision based on example data."
After Orion the company went back to the drawing board; the result was the prototype for the quantum computer purchased by Lockheed Martin. Lockheed Martin installed its D-Wave One computer at the University of Southern California, home to a top quantum computing research scientist, Daniel Lidar. Lockheed furthered its quantum computing ambitions by creating The USC-Lockheed Martin Quantum Computation Center, directed by Dr. Lidar and Dr. Robert Lucas, an electrical engineer.
Since the perceived success of D-Wave One, the company has received major investments from large venture capitalists:
October 4, 2012 - D-Wave Systems, Inc. today announced that it has closed a $30 million round of equity funding. Bezos Expeditions and In-Q-Tel have joined the investment round. Bezos Expeditions is the personal investment company of Jeff Bezos. IQT is the strategic investment firm that delivers innovative technology solutions in support of the missions of the U.S. Intelligence Community...
As it turns out, In-Q-Tel, the science and technology venture capital wing of the Central Intelligence Agency, was founded by former Lockheed Martin CEO Norman Augustine (who was CEO of Martin Marietta before that). In-Q-Tel's current CEO, Christopher Darby, ran a company bought by Intel Corporation, and was a VP at Intel for a year before moving to In-Q-Tel.
Visionaries in the fields of technology, business and the military are clearly interested enough in the possibility of operational quantum computers to make investments. The simple reason is that quantum computers could potentially solve problems that cannot be solved with today's computers, which D-Wave calls "classical computers."
From Classical to Quantum
Though quantum computing is an exciting concept, there are competing views on what a quantum computer actually is, and several possible types of quantum computer.
A challenge for classical computers is cooling, because their components generate a lot of heat; quantum computers go further, requiring temperatures near absolute zero. This traces back to concepts expressed by Albert Einstein and Max Planck, the latter of whom received the 1918 Nobel Prize in Physics (awarded in 1919) for his quantum theory. Just as modern scientists tackle the challenge of maximizing computational power while reducing the energy required, Planck was working on (among many other things) maximizing the light produced by light bulbs with a minimal amount of electricity.
Albert Einstein and Max Planck (1931)
While Planck was doing his work around the turn of the century, a few American inventors were developing companies that recorded and tabulated large quantities of information. Several of these companies merged to form the Computing-Tabulating-Recording Company in 1911, which became IBM (IBM).
By the mid-1930s a British scientist, Alan Turing, had developed a basic concept for computers, known as the "a-machine" (automatic machine) or Turing machine. At the same time a German engineer, Konrad Zuse, was developing an early computer, the Z1. Zuse continued to advance his early computer designs, while Turing developed a machine, the bombe, to break German codes for the British.
Simultaneously, a Harvard professor, Howard Aiken, devised a computer in the 1930s that IBM eventually built in the 1940s: the Harvard Mark I. Also in the early 1940s, researchers at the University of Pennsylvania were designing ENIAC (Electronic Numerical Integrator And Computer), developed by J. Presper Eckert and John Mauchly, for the U.S. Army. John von Neumann, a Hungarian American then working on the hydrogen bomb, consulted on ENIAC and described the stored-program design for its successor, the EDVAC; that design laid the foundation for classical computer architecture.
The line of computers created by Konrad Zuse in Germany advanced to 32-bit words, while IBM's Harvard Mark I read its instructions from 24-channel punched tape:
A bit (binary digit) can be either a 1 or a 0; in this early form of computing, a punched hole equaled a 1 and a space equaled a 0. To this day computers use 1s and 0s to handle information. Today's computers can perform billions, trillions or even quadrillions of computations per second, depending on the processor(s).
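To make the 1s-and-0s idea concrete, here is a tiny illustration in Python (my own toy example, not tied to any machine mentioned above): a number is encoded as a string of binary digits, just like holes and spaces on punched tape, and reading those digits back recovers the number.

```python
# Classical bits: a number written as 1s and 0s, like holes (1) and
# spaces (0) on punched tape.
n = 42
bits = format(n, "08b")  # eight binary digits: "00101010"
back = int(bits, 2)      # interpreting the 1s and 0s recovers 42
```

Every piece of data in a classical computer, from a keystroke to a movie, ultimately reduces to strings like this.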
This leads us to a very basic understanding of what a quantum computer is. In 1985 David Deutsch posed the idea of a quantum Turing machine: instead of using bits, a quantum computer would use quantum bits, or qubits (pronounced "cue-bits"). A qubit exploits one of the fundamentals of quantum mechanics, superposition: it can be partly in two states at the same time (not just up or down, but part up and part down, "in an infinite number of ways," as Paul Dirac put it). So instead of holding a 1 or a 0, a qubit can be part 1 and part 0 simultaneously. Loosely speaking, a quantum computer could then explore many possible solutions to a problem at once, whereas a classical computer must test one solution at a time.
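A minimal sketch in Python can make the "part 1 and part 0" idea concrete. This is my own illustration, not a model of D-Wave's hardware, and it simplifies by using real amplitudes (real qubits use complex ones): a qubit is described by two amplitudes whose squares give the probabilities of reading out a 0 or a 1.

```python
import math
import random

def measure(alpha, beta):
    """Measurement collapses the superposition: the result is 0 with
    probability alpha**2 and 1 with probability beta**2."""
    return 0 if random.random() < alpha ** 2 else 1

# An equal superposition: the qubit is "part 0 and part 1" at once.
alpha = beta = 1 / math.sqrt(2)
assert abs(alpha ** 2 + beta ** 2 - 1) < 1e-9  # probabilities sum to 1

# Repeated measurements come out roughly half 0s and half 1s; no single
# measurement ever shows the "in-between" state directly.
counts = sum(measure(alpha, beta) for _ in range(10000))
```

The catch, and the reason quantum algorithms are hard to design, is that you never see the superposition itself, only one collapsed outcome per measurement.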
Problems that take the fastest supercomputers days, months, or even years could potentially be solved in seconds, or at least in far shorter periods than on their classical counterparts, because in theory a quantum computer could effectively process enormous amounts of data all at once. Since companies like Lockheed Martin work on advanced technology, like the F-35, the potential to solve problems that classical computers cannot, for instance in verifying complex software, is very valuable.
Where We Stand
On Left: D-Wave founder and CTO Dr. Geordie Rose On Right: Interior of D-Wave Quantum Computer (courtesy of D-Wave)
Scientists have been challenged by many complexities in their research into quantum computing. Certain conditions must be maintained or the system fails: the computer loses its delicate superposition, an effect physicists call decoherence. As you can imagine, one of the issues is that these machines work on very small scales, down to a single photon of light.
The implications of Lockheed's usage of the D-Wave One are vast. The company released a YouTube video last month stating:
We're in that same kind of moment in time, we're looking at the first or second transistor. There is a revolution taking place, it is kind of exciting to watch it play out.
My question for Lockheed Martin was simple, I wondered whether the company is getting into the computer business. Lockheed Martin's spokesperson said:
We are not getting into the quantum computing business. We're certainly using and testing the machine.
The company also provided key points:
- Quantum computing will enhance our capabilities to engineer the next generation of increasingly complex systems and technologies for customers while reducing costs.
- We are focused on four key research threads:
  - System Verification and Validation (V&V): To wring out faults and deficiencies in software-intensive systems and develop testing methods to prove that highly complex systems perform as designed
  - Complex Systems Engineering: Evaluate testing methods for the early stages of systems development...
  - Quantum Algorithms and Computer Science: Develop key technologies that will improve our use of the quantum computing resources
  - Quantum Enhanced Machine Learning: Develop machine learning technologies for a wide range of applications in intelligence, pattern recognition, medical diagnostics and other areas
I asked Intel what its quantum computing plans were, and whether quantum computing could render Intel obsolete. Chuck Mulloy, an Intel spokesperson answered:
We do a lot of research a year. We don't happen to be making a quantum computer at this stage.
Mr. Mulloy added that Intel does a lot of research and "all those thousands of ideas we are looking at could render (our current technology) obsolete, unless we invest... Should we look at quantum computers? Absolutely."
I asked him whether he thought Intel would move from bits to qubits and he said "no."
On Left: Intel's Xeon Phi 512-bit SIMD processor On Right: D-Wave's 512 qubit Vesuvius processor
Dr. Colin P. Williams, an expert on quantum computers and D-Wave's new Director of Business Development and Strategic Partnerships, also answered a few questions for this article. D-Wave's Chief Technology Officer, Dr. Rose, first learned about quantum computers through Dr. Williams' research. A former assistant to famed theoretical physicist Stephen Hawking, Dr. Williams holds a PhD in artificial intelligence.
I first asked D-Wave whether quantum computers would render current technology obsolete; the short answer is: not at this point. My sense is that quantum computers have a great deal to add to the world of technology. However, as you've read, quantum computers are in their infancy. What they do have the capacity to do today is help optimize businesses.
I asked Dr. Williams why we might not see quantum computers on desktops:
Dr. Williams: "Conventional computers do some tasks, such as running word processing software, pretty well. For such tasks there would be no compelling advantage from using a quantum computer. But there are other tasks that conventional computers do poorly, such as general purpose artificial intelligence, natural language understanding, and robotics. For example, I cannot yet have a human-quality conversation with the digital assistant on my smart phone. At best the dialog is choppy and in many cases my digital assistant completely misunderstands my true intent. But it turns out that the underlying computations needed to make better artificial intelligences, digital assistants, and robots are exactly the sorts of computations quantum computers can solve faster or better than classical computers."
I also asked, "is it accurate to say, that the application of D-Wave quantum computers is for behind the scenes technology? To clarify, you may not have a quantum computer laptop or phone, however the quantum computers could greatly add to the ability to design consumer devices?"
Dr. Williams: "I foresee quantum computers playing a pivotal role in the development of next generation artificial intelligence and machine learning applications. For example, the D-Wave quantum computer has been used to develop so-called "binary classifiers". These are pieces of conventional software that decide things such as "Is there a tumor in this radiograph?". The key is to use the quantum computer to find the classifier and, once found, deploy it in conventional devices such as computers, tablets, smart phones etc.
The way this works is as follows: one starts with lots and lots of "weak classifiers", which are features in the images that individually might provide some feeble and not especially reliable clue that a form in a radiograph might be a tumor. What the quantum computer can do is figure out what combination of these weak classifiers, when seen together, provide the most reliable indication that the form is or is not a tumor. In an example like this the quantum computer is not being used at run time to do the diagnosis. Rather the quantum computer is being used behind the scenes to find the superior classifier that doctors can then run on conventional computers. So quantum computers and conventional computers are not necessarily in competition, as is often portrayed in the media. Rather they complement one another to enable new capabilities neither would achieve alone."
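Dr. Williams' description of combining "weak classifiers" can be sketched classically. The Python toy below is my own illustration, not D-Wave's code: each weak classifier votes -1 or +1, and the hard part, choosing which subset to combine, is the discrete optimization a quantum annealer targets. Here, for a handful of classifiers, we simply brute-force the search.

```python
from itertools import product

def strong_classify(x, weak_clfs, weights):
    """Combine the selected weak classifiers by majority vote."""
    vote = sum(w * clf(x) for clf, w in zip(weak_clfs, weights))
    return 1 if vote >= 0 else -1

def best_subset(weak_clfs, data):
    """Try every 0/1 weight vector; keep the one with fewest errors.
    This exhaustive search is the step the quantum computer would do."""
    best, best_err = None, float("inf")
    for weights in product([0, 1], repeat=len(weak_clfs)):
        if not any(weights):
            continue  # must use at least one classifier
        err = sum(strong_classify(x, weak_clfs, weights) != y
                  for x, y in data)
        if err < best_err:
            best, best_err = weights, err
    return best, best_err

# Toy labeled data: the true label is +1 exactly when x > 0.
data = [(-3, -1), (-1, -1), (1, 1), (2, 1), (4, 1)]
weak_clfs = [
    lambda x: 1 if x > 0 else -1,       # a reliable clue
    lambda x: 1 if x > -2 else -1,      # a weaker clue
    lambda x: 1 if x % 2 == 0 else -1,  # noise
]
weights, errors = best_subset(weak_clfs, data)
```

As in Dr. Williams' radiograph example, the expensive search happens once, offline; the resulting combination of classifiers is cheap enough to deploy on any conventional device.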
Quantum Physics Partially Applies & Partially Does Not Apply to Investing: What Investors & Scientists Should Know
I did not want to leave any stone unturned in this article; however, as it turns out the stone is partially turned and partially unturned. In the past it may have been unimaginable that a modern technology would be totally replaced. Yet time and time again that is exactly what happens.
Based on my research, it seems fair to conclude that a realistic near-term outcome of the introduction of quantum (or quantum-like) computers is a hybrid: classical computers interacting with clouds powered by quantum computers. It may take years, or decades; however, this seems probable.
Some futurist consumers wonder whether we will ever have quantum-powered laptops and phones. It seems more likely that we will not, because of the major engineering challenges, and for the simple reason that they may not be necessary. However, as Dr. Williams suggested, perhaps we will be able to speak more fluently with our computers if the back-end service that powers a classical computer application is linked to a quantum computer, or to work done by quantum computers.
That said, investors should be focused on companies that are profitable. Larger investors who can afford to take risks, and who seek actual ownership, realize there is potential beyond current profitability or the lack of it. Still, stocks tend to perform well when the company is profitable.
Take Applied Energetics (AERG), for instance: this company sounded like it did great work in the field of lasers. It loses millions of dollars a year, though, and has gone from $6 a share to $0.04 a share; even if it does manage to pull in some net income one quarter in the future, it has not been a good investment. Intel, on the other hand, is an established technology company; it pays a 4% dividend and pulls in over $2.5B in net income a quarter.
It is wise for investors to consider established, profitable companies; and it is wise for those established companies to pursue technologies, and companies, that could one day make breakthroughs that change their respective industries. Technology is an important part of a financial portfolio, though a conservative allocation might amount to anywhere from 2% to 10% of the total. The industry has the potential to be profitable, and also the potential to be volatile.
Investors may also gain exposure to brilliant scientific work through university corporate issues and university municipal bonds. In addition to stocks, investors interested in science and technology can look for 4- and 5-star Morningstar-rated mutual funds that focus on these sectors, as well as technology-company corporate bonds. So, whether you are an investor, scientist, or consumer interested in technology, keep in mind that there are various ways to invest in the future. Don't try to invest all at once; an important concept to understand is the need to sustain investments over time.
If you have any thoughts on quantum computers, particularly as they relate to Lockheed Martin, or Intel please leave a comment below.
Additional disclosure: This article is not a recommendation to buy or sell, please consult a financial adviser to determine proper allocations (if any) to the technology sector.