I have been writing lately about the probability of Intel (NASDAQ:INTC) integrating DRAM onto the CPU chip, in the case of PCs, or onto the application processor chip, in the case of mobile devices.
For those of you not steeped in the history of the semiconductor business, which would be most of you readers who are not drooling on yourselves, Intel started life as a memory company. One of the very first Intel devices was a 64-bit static RAM. That's right, 64 bits. We've come quite a way from 64-bit memories to 4 Gigabit DRAM chips. That's about 67 million times the memory of that early 64-bit chip, and the price of the 4 Gb chip is about 1/10 of the inflation-adjusted price of that old 64-bit chip. There, that's the productivity lesson on the semiconductor industry for the day.
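Just to make that growth factor concrete, here is the back-of-envelope arithmetic (using the article's own figures of 64 bits then versus 4 gigabits now):

```python
# Back-of-envelope check of the memory-density claim above.
early_sram_bits = 64           # one of Intel's first devices: a 64-bit static RAM
modern_dram_bits = 4 * 2**30   # a 4 Gb DRAM chip (4 gibibits)

growth_factor = modern_dram_bits // early_sram_bits
print(growth_factor)           # 67108864 -- about 67 million times the capacity
```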
Intel also invented the EPROM (Erasable Programmable Read-Only Memory). Not many think about the implications of that little invention today. Prior to the invention of the EPROM, if you wanted to use a microprocessor, you would write the operating code and send the code listing to Intel, along with a check for $25,000 in NRE and an order for 25,000 units (another $150,000), and Intel would send you back a custom programmed ROM (Read Only Memory) that was unchangeable, to be used with the microprocessor. If there turned out to be an "oops" in the code, you would start the whole $175,000 ($1,000,000+ in today's dollars) process all over again.
The impact of the user-programmed EPROM was that you could screw up in engineering all you wanted at little cost, you didn't have to build 25,000 units of an unproven end product, and time to market was much shorter. The invention of the EPROM drove the usage of microprocessors down to the smallest companies and even individual engineers with a good idea. Suddenly engineering students could learn about and use microprocessors.
Huge deal, this EPROM. Next to the microprocessor itself, I would rank the invention of the floating-gate EPROM as the most significant Intel invention. The EPROM led to the EEPROM, the electrically erasable cousin of the EPROM. The EEPROM led to flash memory, both NOR and NAND, and of course NAND flash based SSDs (Solid State Drives) are replacing HDDs (Hard Disk Drives) as I type this article. Neat progression, huh?
The point of all of this is to establish that integrated circuit memory is in the genes of Intel. Huge static RAMs are now integrated on CPU chips as cache memory, and special Intel-made DRAM chips sit in the package with high-end i7 processors. The Holy Grail of computing is to have main memory (DRAM) reside on the processor chip.
In previous articles I have speculated that the state of the art in semiconductor technology might only be two process nodes away from having this chunk of DRAM integrated on processor chips. I think I was wrong. It appears that we are in the suburbs of that magic transition right now.
This January 2011 article talks about the physical size of a Samsung (OTC:SSNLF) 2Gb DRAM chip on what is thought to be a mid-30s nm process. The chip size is 35 square mm. Just fooling around with numbers, that 2Gb chip made on today's 20nm process would be about 15 square mm. So, a 4Gb chip would be about 30 square mm, and 8Gb of DRAM, whether two chips or a single chip, should be about 60 square mm. Keep that number in mind.
A mobile application processor such as the Apple (NASDAQ:AAPL) A7 chip is 100 square mm on the Samsung 28nm process. Done on an Intel 14nm TriGate process, that chip, perfectly scaled, would be 25 square mm. Perfect scaling doesn't happen anymore, so figure a 14nm A7 chip at about 40 square mm.
Adding the above 8Gb of DRAM, which is exactly the 1 GB that the A7 requires, to the A7 chip would result in a 100-square-mm chip. That chip should sell to Apple for at least $35. The cost to Intel for that chip should be less than $10, including full depreciation.
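The die-size estimates above can be sketched as a quick calculation. Ideal shrink scales area with the square of the process node; the padded figures (15 and 40 square mm) are the article's own allowances for imperfect scaling:

```python
# Rough die-area scaling behind the estimates above.
# Inputs are the article's own figures; real-world scaling is worse than ideal.

def ideal_shrink(area_mm2, old_node_nm, new_node_nm):
    """Ideal post-shrink area: scales with the square of the feature size."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

# Samsung 2 Gb DRAM: 35 mm^2 on a ~35 nm process, taken to 20 nm.
print(round(ideal_shrink(35, 35, 20)))   # ~11 mm^2 ideal; padded to ~15 in the text
dram_8gb_mm2 = 4 * 15                    # 8 Gb (1 GB) at ~15 mm^2 per 2 Gb = ~60 mm^2

# Apple A7: 100 mm^2 on Samsung 28 nm, taken to Intel 14 nm.
print(round(ideal_shrink(100, 28, 14)))  # 25 mm^2 ideal; padded to ~40 in the text
a7_14nm_mm2 = 40

print(a7_14nm_mm2 + dram_8gb_mm2)        # ~100 mm^2 for the combined logic + DRAM die
```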
With RAM on chip, the performance of the above-described chip would be off the chart. In a mobile environment we don't need "off the chart"; we need low power. This chip could run at a much lower clock rate, with a dramatic drop in power, while maintaining computing performance. The performance level, whether tuned for compute performance or for extreme battery life, could now be adjusted in "Settings."
It seems clear that the entry point for memory on-chip is not the PC business, but the mobile sector, where the memory requirements are constant by product and relatively low.
High performance computing (including supercomputers) is already making the change to in-package embedded DRAM, as in the high-end i7s and the new Xeon Phi chips.
PCs with on-chip DRAM might bring up the rear because of the amount of DRAM required and the variability of that amount from system to system.
I expect to hear all the reasons why compact DRAM cannot be produced on the same chip as logic, but this is Intel we are talking about here. This is the company that brought us strained silicon, High K Metal Gate, and TriGate transistors. A combined logic and DRAM process is probably just a little speed bump for Intel.
In summary, DRAM on-chip would be the pry bar that gets Intel into mobile in a big way.
With mobile DRAM approaching half of the DRAM market, the downside of Intel making on-chip mobile DRAM falls to the DRAM manufacturers. That change would be disruptive to Samsung, Micron (NASDAQ:MU), and Hynix until they could convert DRAM fabs to NAND fabs where the demand should be huge for years to come.
Intel is doing the pathfinding on near-chip and on-chip DRAM and is therefore a near and long term "buy."
Disclosure: The author is long INTC. The author wrote this article themselves, and it expresses their own opinions. The author is not receiving compensation for it (other than from Seeking Alpha). The author has no business relationship with any company whose stock is mentioned in this article.
Editor's Note: This article discusses one or more securities that do not trade on a major U.S. exchange. Please be aware of the risks associated with these stocks.