If you've followed Intel (INTC) for any length of time, you've probably heard an argument along the lines of "we don't need more powerful chips; what we have is good enough". For example, ZDNet asserts that the real reason for PC sales decline is that we've entered the era of "good enough" computing. IDC's Bob O'Donnell agrees: "we are truly in an era of good-enough computing." Analysts everywhere bemoan the death of the PC and point to shiny new messaging apps (that, for all intents and purposes, do exactly the same things as the old ones they replace) as the area where consumers are really excited about progress.
This viewpoint is at radical odds with reality.
The idea that computing is ever "good enough" represents a fundamental misunderstanding of the role of technology - and more specifically, microprocessors and Moore's Law - in our lives. Yes, it is true that computers are good enough for what we use them for now, but that is an inherently tautological statement. There is no incentive for software providers to design tools that require a materially higher level of computing power than is available to their target demographic; doing so would not be profitable. However, they design each successive iteration of their product to take advantage of increased computing power; in fact, they require this increased computing power to continue delivering more powerful solutions.
Lest this sound like hand-waving by an admitted Intel bull to distract investors from the massive contra-revenue/NRE charges the company will take in 2014, rest assured that there's actual quantitative data to back this assertion. For that, let's turn to a report from the smartest guys in the room - no, not Goldman; even though they're looking for a good Q4, they don't get to be called the smartest guys in the room when their PT on Intel implies an absurdly low multiple of 8.4x TTM profits. (For those who aren't familiar with the Intel story, note that massive R&D investments into mobile technology are depressing Intel's earnings to the tune of $2B/yr, which works out to roughly 40 cents per share. Goldman's PT thus values Intel's core business at roughly 7x earnings.)
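For readers who want to check the arithmetic, here's a quick back-of-the-envelope sketch. The share count and TTM EPS figures below are round-number assumptions chosen for illustration, not reported values:

```python
# Illustrative valuation math for the argument above. The share count and
# TTM EPS are assumptions picked for round numbers, not reported figures.

mobile_spend = 2e9            # annual mobile R&D / contra-revenue drag ($)
shares_outstanding = 5e9      # assumed share count, roughly Intel's at the time

# Per-share earnings drag from mobile investments
eps_drag = mobile_spend / shares_outstanding   # = $0.40/share

ttm_eps = 1.90                # assumed trailing-twelve-month EPS ($/share)
price_target = 8.4 * ttm_eps  # a PT implying 8.4x TTM profits

# Add back the mobile drag to see what multiple the PT puts on the core business
core_eps = ttm_eps + eps_drag
implied_core_multiple = price_target / core_eps

print(f"EPS drag: ${eps_drag:.2f}/share")
print(f"Implied core multiple: {implied_core_multiple:.1f}x")
```

Under these assumptions the implied multiple on core earnings works out to roughly 7x, which is the figure cited above; the exact value shifts a bit with the EPS assumption.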
The actual smartest guys in the room on this one are the semiconductor experts at elite consulting firm McKinsey & Company. Their 80-page industry commentary from late 2011 (PDF here) reveals two interesting facts:
- The semiconductor industry contributes disproportionately to growth in labor productivity; it directly drove more than 25% of total US productivity growth from 1995 to 1999. Those gains exceeded the total productivity growth achieved over the previous eight years.
- From 1996 through 2009, Intel created $57 billion in value (positive economic profit). The rest of the semiconductor industry, taken as a whole, destroyed approximately $47 billion in value over the same time period. (Granted, some companies such as TSMC (TSM), Samsung, Qualcomm (QCOM), Texas Instruments (TXN), and Applied Materials (AMAT) were value-creative as well, but the combined profit of all five of those companies just matches Intel's over the same time period.)
The landscape has certainly evolved since 2011, with players like Qualcomm taking on bigger roles thanks to the emergence of mobile computing. However, you'll notice that even in this new arena of computing, reviewers place a premium on speed. And the McKinsey analysis is hardly the only study linking semiconductor performance and productivity gains; as merely one more example, a 2009 report by the American Council for an Energy-Efficient Economy concluded that semiconductor-enabled technologies saved consumers $69 billion on energy expenditures in 2006. According to that research, the bulk of energy efficiency gains since the 1970s have resulted from semiconductors, and through 2030, the cumulative net electricity savings from semiconductors may reach $1.2 trillion. Reports from other industries would likely show similar trends.
Performance gains allow computers to accomplish more complex tasks in a shorter period of time, thus enhancing their users' productivity on a per-hour basis. So who got it into their heads that the trend of increasing productivity is suddenly no longer relevant?
Those who claim we don't need faster, more powerful computers are ignoring the very real improvements that the next few years may bring, as well as the fact that shipments of Intel's high-end chips are actually hitting record highs despite a "meh" overall PC market. Clearly someone out there likes productivity gains. And on that note, haven't you ever wished your computer could rival Tony Stark's JARVIS? A computer that interpreted you perfectly, every time, would save you a whole lot of time. There's a startup working on that, and they're not alone: Intel is making a big push into perceptual computing. Embedded security is another hot topic, as is V2V communication. If the vision for each of these innovations is to be achieved, we'll need a lot of very powerful chips.
In an economy that could use any growth it can get, it's downright silly to say that current levels of computing power are "good enough" and that the entire semiconductor industry will quickly devolve into a commodity business. So long as smart, creative engineers can think up potential applications of computing power that could materially improve our lives, the current level of computing power will never be "good enough." Anyone who uses those words is ignoring economic reality and the lessons of history.