I need to reiterate that on-chip DRAM is only one speculative guess at what could be coming next from Intel. Over the past four years, Intel has built far more capacity than its traditional x86 CPU business could ever fill. The common wisdom is that the overbuilding was simply a mistake. The problem with that view is that spending on new facilities continued long after it was recognized that the PC business was declining significantly.
Clearly, to me at least, this overbuilding was, and is, not a mistake, but some part of a plan for substantial revenue growth for Intel.
With tight secrecy around specific plans and products, a guy like me must rely on speculation and on knitting together the hints the company drops.
Subsequently, another Seeking Alpha writer, Jaret Wilson, has suggested that DRAM-on-processor would produce an expensive, low-yield chip that would not be commercially viable.
Mr. Wilson's article was primarily about Micron (NASDAQ:MU) and the cost of DRAM. Much of that cost analysis was not based on operational factors. Instead, it was based on the low value of the Japanese yen. Now the value of the yen has certainly been a big factor in the success of the Micron story, but to predicate the future success of the company on currency exchange rates doesn't seem like good methodology.
Mr. Wilson has apparently taken my place as Seeking Alpha's Micron cheerleader, and even the slightest possibility of Intel producing chips with monolithic embedded DRAM would be a major blow to the DRAM business, to Micron, and to the two other DRAM producers.
Mr. Wilson points out that Micron and Intel are collaborating on the HMC (Hybrid Memory Cube), as if that would prevent Intel from manufacturing its own DRAM products. According to Micron investor relations, the HMC is not part of the "emerging memory technologies" joint venture, and any HMCs bought by Intel would be purchased at market price.
OK, let's see whether I am crazy or not.
My speculation is predicated on mobile application processors being the first place that on-chip DRAM might appear. One reason is that the DRAM requirements of mobile APs are small and constant. For example, to date all Apple (NASDAQ:AAPL) A-series chips have used 1GB (8Gb) of system DRAM. Also, including a very wide I/O DRAM on chip could be very silicon-efficient and improve performance to the point that the clock rate of the A chip could be lowered, thus reducing power dissipation. We would get a twofer of sorts: better performance at lower power.
So let's see what the magic of Moore's law could bring to this situation.
I will use the Apple A7 chip as the example here simply because the A7 chip size and approximate price from Samsung are well known. Any benefits of DRAM-on-chip would certainly accrue to an x86-based chip as well.
The A7 chip is said to be about 100 square mm in size built on the Samsung (OTC:SSNLF) 28/32nm process.
What happens to this size chip when built on the Intel 14nm TriGate process? From 28/32nm to 14nm is a 50% linear shrink, which would make the chip one-quarter the area: 25 square mm instead of 100.
You don't need to be a genius or an engineer to figure this out; you only need to have not been kicked out of 6th grade arithmetic. If a square of cardboard 20 inches on a side is reduced to a square only 10 inches on a side, the area goes from 400 square inches to 100 square inches. The same thing happens to that little silicon chip.
Now, to head off some of the criticism, I need to point out that not everything on a silicon chip shrinks, and some think the 32nm-to-22nm TriGate node transition was less than a full one-third linear shrink. Given all that, I will correct the 14nm chip size upward from 25 to 40 square mm.
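For anyone who wants to check the arithmetic, here is a minimal back-of-the-envelope sketch. All of the figures are the rough assumptions discussed above, not foundry data.

```python
# Die-shrink estimate from 28/32nm to 14nm (rough assumptions, not
# foundry data).
a7_area_28nm = 100.0   # mm^2, A7 as reportedly built on Samsung 28/32nm
linear_shrink = 0.5    # 28/32nm -> 14nm, roughly a 50% linear shrink

# Area scales with the square of the linear dimension.
ideal_area = a7_area_28nm * linear_shrink ** 2
print(ideal_area)      # 25.0 mm^2 if everything scaled perfectly

# Analog blocks, I/O drivers, and bond pads do not shrink, so the
# estimate is padded upward to the article's conservative figure.
corrected_area = 40.0  # mm^2
```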
Now for the 1GB (8Gb) DRAM section of the chip.
In 2011, Samsung made a 2Gb DRAM chip on a 35nm process. The size of that chip was 35 square mm. We can assume that if that chip were made on today's 20nm process it would be about 17.5 square mm. We can further assume that a 20nm 4Gb chip would be 35 square mm, and that an 8Gb chunk of 20nm DRAM would be about 70 square mm.
Now, here's something to think about: today, there are many memory interface drivers and bonding pads on BOTH the DRAM and AP chips. Integrating the DRAM onto the AP would eliminate all of the silicon used for those structures. The effective size of the DRAM section, without the above-mentioned structures, might be only 60 square mm, or even much less.
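The DRAM side of the estimate can be sketched the same way; the halving from 35nm to 20nm and the 60 square mm pad-free figure are the article's assumptions, not measured die sizes.

```python
# Embedded-DRAM footprint estimate (assumed scaling, not measured data).
dram_2gb_35nm = 35.0               # mm^2, Samsung 2Gb chip from 2011
dram_2gb_20nm = dram_2gb_35nm / 2  # assume the 20nm version halves the area
dram_8gb_20nm = dram_2gb_20nm * 4  # 8Gb = 4 x 2Gb
print(dram_8gb_20nm)               # 70.0 mm^2

# Dropping the duplicated interface drivers and bond pads might trim
# this further; 60 mm^2 is the article's guess, and it could be less.
dram_embedded = 60.0               # mm^2
```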
So, the combined size of an A7-equivalent chip and 8Gb of integrated DRAM could be back to the present A7 chip size of 100 square mm. Performance-improving ultra-wide memory I/O could be added to that chip for virtually no silicon (cost) penalty.
The A7 chip is thought to cost about $17.50 from Samsung and the mobile DRAM is about $10.50. That DRAM-on-chip AP should be worth $28 to Apple… even more since Apple invested in the fab in which the A7 was built. I think $30 would be a fair price for such a chip.
So we have, for Intel, a very modest size chip that can be produced at very high yield. A 300mm silicon wafer would have space for about 580 of these chips, about 500 of which would test out to be good parts.
The cost of that wafer in a new 14nm fab would be about $5,000. The selling price of that wafer would be $15,000. That is a gross margin of 66% and a very compelling chip from a performance and power standpoint that could sidestep any future shortage of discrete mobile DRAM chips.
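As a sanity check on that wafer math, here is a minimal sketch; the die count, yield, wafer cost, and chip price are all the rough estimates used in this article, not Intel figures.

```python
import math

# Wafer economics under the article's assumptions.
wafer_diameter = 300.0   # mm
die_area = 100.0         # mm^2, AP plus embedded DRAM

# Ideal die sites, ignoring edge loss and scribe lines.
wafer_area = math.pi * (wafer_diameter / 2) ** 2
print(round(wafer_area / die_area))  # ~707 ideal; edge loss brings the
                                     # usable count down to roughly 580

good_dies = 500          # article's assumption, ~86% yield on 580 gross
wafer_cost = 5_000.0     # assumed cost per wafer in a new 14nm fab, $
chip_price = 30.0        # assumed fair price per chip, $

revenue = good_dies * chip_price
gross_margin = (revenue - wafer_cost) / revenue
print(revenue)                  # 15000.0
print(f"{gross_margin:.1%}")    # 66.7%
```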
One thing you can take to the bank is that Intel will find a way to fill what amounts to about $30 billion worth of bleeding edge silicon manufacturing capacity.
So, you see, Mr. Wilson, we don't really care if nine-metal-layer logic wafers are being "wasted" building three-metal-layer DRAM on half of each chip. If that is what it takes to sell Intel mobile chips against Qualcomm (NASDAQ:QCOM) or anyone else, it is an inconsequential price for Intel to pay.
To use Mr. Wilson's metaphor, building one Hyundai for every Rolls Royce on a wafer would seem to be equivalent to building all Mercedes Benz wafers.
As I said in the beginning of this article, DRAM-on-processor-chip is only one of the speculative technologies that could be a game changer for Intel and the industry.
I am very long Intel and intend to stay that way.
Disclosure: The author is long INTC. The author wrote this article themselves, and it expresses their own opinions. The author is not receiving compensation for it (other than from Seeking Alpha). The author has no business relationship with any company whose stock is mentioned in this article.
Editor's Note: This article discusses one or more securities that do not trade on a major U.S. exchange. Please be aware of the risks associated with these stocks.