Nvidia: Fortress Turing Still Impregnable

About: NVIDIA Corporation (NVDA), Includes: AMD, INTC, MSFT, TSM
by: Mark Hibben
Summary

AMD's ray tracing no-show at E3.

Impressive new games to feature Nvidia ray tracing.

Nvidia can't rest on its ray tracing laurels.

Investor takeaways.

Rethink Technology business briefs for June 14, 2019.

Last August, I called Nvidia's (NVDA) Turing consumer graphics cards an “impregnable fortress”. Now, after almost a year, rival Advanced Micro Devices (AMD) will release the first of its Navi architecture graphics cards on July 7. Although AMD was eager to point out in its E3 presentation that Navi offers slightly better gaming performance than Nvidia's RTX 2070, ray tracing performance was never discussed. Despite high hopes and numerous rumors that Navi would at last offer an answer to Nvidia's ray tracing, it was not to be.

AMD CEO Lisa Su holds up the Radeon RX 5700 XT at E3. Source: YouTube.

AMD's ray tracing no-show at E3

AMD's supporters, both in its legion of loyal fans and in the tech and business media, have seemed conflicted to me on the subject of ray tracing. On the one hand, it was clear in the immediate aftermath of the Turing release last year that AMD had little recourse but to minimize the importance of ray tracing as much as possible. On the other hand, there has been frank acknowledgment of the significance of ray tracing, along with assurances that AMD would eventually offer it.

This conflict was perfectly captured in an interview that David Wang, SVP of Engineering for the Radeon Technologies Group, gave to the Japanese website 4Gamer.net last November. According to ExtremeTech, Wang said that

AMD will “definitely respond” to DXR (DirectX ray tracing), but stated that “Utilization of ray tracing [in] games will not proceed unless we can offer ray tracing in all product ranges from low end to high end.”

I viewed this as making a virtue of necessity. AMD wasn't going to be able to respond immediately with real-time ray tracing hardware competitive with Turing at any price point. In my article Nvidia's Impregnable Fortress Turing, I summed up AMD's predicament:

It will be difficult, if not impossible, for Nvidia's competition to develop RTX-enabled graphics cards without infringing on Nvidia IP both in the form of hardware and software. I believe that Nvidia's competitors will be forced to make a choice between licensing RTX technology from Nvidia or being relegated to second tier rasterized games. That's assuming that Nvidia is willing to license RTX, which I doubt.

Turing and RTX are about to create an impregnable position in high-end gaming for Nvidia.

In the roughly 10 months since then, nothing has really changed, except that the expectations of AMD supporters have been raised to unrealistic levels. These expectations were given a boost by the release in March of the Neon Noir demo by game maker Crytek. This demo featured very compelling ray traced reflections and ran in real time on a humble Vega 56 GPU. This led many AMD supporters to assume that Nvidia's Turing hardware would be unnecessary for ray traced games.

However, a subsequent blog post by Crytek in May revealed the demo's numerous caveats. Crytek had devised an efficient hybrid ray tracing approach that was used only for reflective surfaces; conventional rendering methods still handled everything else. Also, other ray tracing effects, such as soft shadows and global illumination, could not be reproduced in the demo.
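To make the caveat concrete, here is a minimal C++ sketch of the hybrid idea as I understand Crytek's description. This is my own illustration, not Crytek's code; the Pixel structure and function names are hypothetical:

#include <cstdio>
#include <vector>

// Each pixel carries a flag, set during conventional rasterization, saying
// whether its material needs a ray traced reflection.
struct Pixel {
    bool  reflective = false;
    float color      = 0.0f;  // grayscale stand-in for a shaded color
};

// Stubs for the expensive and cheap paths. A real engine would march a ray
// through an acceleration structure here, or sample the raster output.
float TraceReflectionRay(int /*x*/, int /*y*/) { return 0.9f; }
float RasterizedShade(int /*x*/, int /*y*/)    { return 0.5f; }

// step > 1 mimics computing reflections at reduced (e.g. half-screen)
// resolution, as Crytek describes doing on a GTX 1080.
void HybridPass(std::vector<Pixel>& fb, int width, int height, int step)
{
    for (int y = 0; y < height; y += step)
        for (int x = 0; x < width; x += step) {
            Pixel& p = fb[y * width + x];
            p.color = p.reflective ? TraceReflectionRay(x, y)
                                   : RasterizedShade(x, y);
        }
}

int main()
{
    std::vector<Pixel> fb(64 * 64);
    fb[0].reflective = true;             // pretend one pixel hit a mirror
    HybridPass(fb, 64, 64, /*step=*/2);  // "half resolution" reflections
    std::printf("pixel 0 color: %.1f\n", fb[0].color);
}

The point is that the expensive ray path runs for only a fraction of the pixels, and at reduced resolution, which is why a non-RTX card could keep up in the demo.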

Also very telling was Crytek's take on Nvidia's Turing RTX cards:

However, RTX will allow the effects to run at a higher resolution. At the moment on GTX 1080, we usually compute reflections and refractions at half-screen resolution. RTX will probably allow full-screen 4k resolution. It will also help us to have more dynamic elements in the scene, whereas currently, we have some limitations. Broadly speaking, RTX will not allow new features in CRYENGINE, but it will enable better performance and more details.

Many had expected AMD's Navi to feature ray tracing, as reflected in an April article by Jon Martindale of Digital Trends. Martindale points out that Mark Cerny, lead PlayStation architect, had claimed that the PS5 would support ray tracing. But even as he reported the claim, Martindale saw the problem with it:

There are a few important wrinkles to this story which make it a little less clear cut, though. AMD has previously said it would only introduce ray tracing hardware when the technology could be affordable and viable for a wider audience. The Navi release this year is expected to be Navi 10, the midrange iteration of the technology, potentially culminating in RTX 2070-like performance in the top cards. While adding ray tracing to such cards would make the technology more readily available to gamers on a budget, ray tracing is a notorious performance hog and only performs reasonably well on the most expensive of GPUs with dedicated rendering hardware, like the RTX 2080 Ti.

A more likely candidate for ray tracing would be AMD’s Navi 20, which is expected to be a high-end iteration on the technology, potentially challenging the 2080 Ti in terms of performance. But then the PS5 would be unlikely to use such a GPU, since it would be inordinately expensive and outside the usual cost and thermal limitations of a home console.

Even if it's plausible that Navi 20 will include ray tracing, Martindale is probably correct that Navi 20 wouldn't be incorporated into a console. Yet Microsoft (MSFT), during its E3 presentation this week, also claimed that Project Scarlett, its next-generation Xbox, would feature hardware-accelerated ray tracing.

So, one possible answer to this riddle is that Fortress Turing isn't impregnable, but that it will take over two years to conquer. I base this time frame on the release date of Scarlett, scheduled for “Holiday 2020”, which I assume means calendar Q4 of 2020.

But based on AMD's own statements, this probably isn't the correct interpretation. Although there was no discussion of ray tracing in the context of the E3 launch of the first Navi cards, the Radeon RX 5700 XT and RX 5700, AMD was not entirely silent on the subject. Buried in a mound of slides given to the press was this one on AMD's “Ray Tracing Vision” (courtesy of Tom's Hardware):

Chris Angelini of Tom's had this to say about the “vision”:

No Ray Tracing Today, And A Tepid Outlook Tomorrow

Navi does not incorporate hardware support for ray tracing in any form. Rather, David Wang, SVP of AMD’s Radeon Technologies Group, told us that existing GCN- and RDNA-based GPUs would support ray tracing via shaders in ProRender (for creators) and Radeon Rays (for developers). Then, down the road, a next-gen implementation of RDNA will evolve to accelerate “select lighting effects for real-time gaming.” AMD’s vision culminates in full scene ray tracing through the cloud. Could the company mean that it sees heavy lifting handled remotely as gamers stream content? We can’t imagine the PC audience would be overwhelmingly receptive to such a proposition. Regardless, AMD believes it’ll be a few years before real-time ray tracing takes off.

AMD and its supporters certainly want it to take a few years, because AMD needs to buy as much time as possible. The company has probably spent this past year getting over its shock and awe and trying to figure out how to respond. The chart above is its current answer: not much.

As spelled out in the chart, AMD is essentially abdicating any effort to build a single graphics card solution for real-time ray tracing in games. This goes back to what I said almost a year ago: there's no way for AMD to build such a solution without infringing Nvidia's RTX patents. At least, there's no way unless AMD invests a huge amount of time and money to come up with an independent solution.

And, that's time and money that AMD just doesn't have. Nvidia spent years and billions of dollars developing Turing, and it very carefully and methodically built a wall of patents around its work.

Research and development is a very expensive and time-consuming process. This is something I have difficulty conveying to non-technical readers. The public's perception of R&D has been shaped by movies and television, in which brilliant scientists make discoveries overnight that change the world.

In reality, it just doesn't work that way. It's much less glamorous and more tedious and time-consuming. Much of the R&D that companies undertake will never see the light of day, either because it didn't pan out or because it just didn't lead to a marketable product.

I have, in various articles, asserted that AMD is unlikely ever to get to the point of fielding a product comparable to Turing. AMD's chart above is essentially an admission of that. And, what of the claims of Microsoft and Sony (SNE)? These can be reconciled with AMD's chart by noting that both intend to offer streaming game services. In all likelihood, the ray tracing they have in mind is part of those services.

Impressive new games to feature Nvidia ray tracing support

One of the criticisms I've had about Nvidia's launch of Turing has been the lack of games that support ray tracing. This has held back the adoption of Turing and allowed Nvidia's critics to maintain that ray tracing is still irrelevant to most gamers.

It's hard to disagree with this assessment. Of the eleven ray tracing-enabled games announced last August when RTX was launched, only three are shipping in final release with ray tracing enabled: Battlefield V, Metro Exodus, and Shadow of the Tomb Raider. Two other games, Anthem and Final Fantasy XV, use the RTX Deep Learning Super Sampling (DLSS) feature without ray tracing.

This is an area where Nvidia's management clearly miscalculated, but the company has been hard at work driving adoption. A key first step has been to get the major 3D game engines to adopt Microsoft's DirectX Raytracing (DXR) API and Nvidia's Turing implementation of it. This process was already well underway when Nvidia held its Investor Day in March.

Source: Nvidia Investor Day Presentation.
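For readers who wonder what "supporting DXR" means in practice: DXR is exposed through Direct3D 12, and an application can ask at runtime whether the installed GPU and driver expose it. Below is a minimal sketch using the documented D3D12_FEATURE_D3D12_OPTIONS5 feature check; adapter enumeration and error handling are abbreviated:

// Build against d3d12.lib on Windows 10 (version 1809 or later for DXR).
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

// Returns true if the device's driver exposes DirectX Raytracing.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main()
{
    ComPtr<ID3D12Device> device;
    // nullptr selects the default adapter; real code would enumerate adapters.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;
    std::printf("DXR supported: %s\n", SupportsDXR(device.Get()) ? "yes" : "no");
}

Everything here is the vendor-neutral D3D12 API; what is specific to Nvidia is that, today, only Turing RTX cards pass this check with dedicated ray tracing hardware behind it.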

Since then, the list of games at least planning to support ray tracing has grown with some exciting E3 announcements. These include Cyberpunk 2077, Watch Dogs: Legion, Call of Duty: Modern Warfare, Wolfenstein: Youngblood, Vampire: The Masquerade - Bloodlines 2, and Doom Eternal.

Perhaps, the most anticipated is Cyberpunk 2077 “starring” Keanu Reeves. The E3 trailer below shows ray tracing to great effect.

Unfortunately, Cyberpunk fans still have a long wait ahead of them with the release scheduled for April 16, 2020.

Perhaps, my favorite for sheer cinematic realism is Call of Duty: Modern Warfare. Although it borrows its title from the popular 2007 Activision release, Call of Duty 4: Modern Warfare, this is an all-new game. The 2007 original was a personal favorite of mine, as it set the standard for visual accuracy in its day, but the new COD:MW makes the old one look like a cartoon.

COD:MW will thankfully arrive sooner than Cyberpunk, on October 25, 2019. With the new announcements from E3, as well as many of the remaining 8 games from the original list, there should be about 10 games that support DXR by the end of the year.

With support implemented in the major game engines and increasing developer experience, we should see a steady accumulation of DXR games in the coming years. Increasingly, DXR support will be an expected feature of the very best AAA games.

Nvidia can't rest on its ray tracing laurels

I would agree that Nvidia messed up the Turing launch in a couple of ways. Turing launched without an adequate supply of DXR games, and it launched on hardware that was not quite up to the demands of ray tracing. The result was unacceptable frame rate degradation when DXR was enabled.

Nvidia miscalculated the appeal of ray tracing if it thought that the frame rate hit would be acceptable to most players. And, if Nvidia thought that it was in a race with AMD to be first to a ray tracing solution, this clearly was not the case.

On balance, Nvidia would have been better off waiting until it could introduce Turing on 7 nm and enjoy greater game support. Nvidia was faced with a tough decision, and it chose to plunge ahead despite obvious headwinds.

Victory does not always go to the bold, but at least Nvidia went boldly. While Turing for the PC Gaming segment was not a smashing success, Nvidia managed to salvage Turing by marketing it as an AI inference option for the Datacenter. Undoubtedly, Nvidia has learned from the experience.

The most important thing that Nvidia has learned is that its chief rival has no answer to Turing, and no expectation of an answer. This information might not have been gained in any other way than the Turing release, and this information alone might have made the release worthwhile.

To be sure, Nvidia can't rest on its laurels. Challenges remain in the form of streaming game services and new generation consoles. Nvidia knows that it must continue to drive adoption of DXR by game developers, as well as improve the performance of its RTX platform.

This improved performance may first come from a “super” generation of RTX cards rumored to be released this month. These will probably be incremental improvements that will erase the performance advantage claimed by Navi.

But Nvidia should do more, and this will require investing in Turing production at 7 nm. On Nvidia's fiscal 2020 Q1 conference call in May, CEO Jensen Huang indicated for the first time a willingness to move the product line to 7 nm:

In terms of process nodes we tend to design our own process with TSMC. If you look at our process and you measure its energy efficiency, it's off the charts. And in fact, if you take our Turing and you compare it against a 7-nanometer GPU on energy efficiency, it's incomparable. In fact, the world's 7-nanometer GPU already exists and it's easy to go and pull that and compare the performance and energy efficiency against one of our Turing GPUs. And so that the real focus for our engineering team is to engineer a process that makes sense for us and to create an architecture that is energy efficient. And the combination of those two things allows us to sustain our leadership position. Otherwise, buying off the shelf process is something that we can surely do, but we want to do much more than that.

How soon Nvidia will move to 7 nm probably depends a lot on how Navi performs in terms of energy efficiency, and we won't know that until third-party reviewers publish their findings on July 7. If Navi turns out to be more energy efficient than Turing, we can expect Nvidia to move rapidly to 7 nm. If not, Nvidia will probably stand pat and keep Turing on TSMC's (TSM) 12 nm process at least through the end of this year.

Investor takeaways

I actually think it would be a mistake for Nvidia to stand pat, but that's a judgment call that I won't try too hard to second-guess. Regardless of the performance of Navi, Nvidia will have to move to a more advanced process for Turing or for a Turing successor at some point.

I tend to base my investment decisions in technology companies on, among other things, clear technological advantage. It's clear that Nvidia has a technological advantage in ray tracing that neither AMD nor Intel (INTC) can overcome in the next few years.

The advantage that ray tracing creates for Nvidia is not simply that it makes games look a little better. That benefit has, with some justification, been derided as an insufficient reason for consumers to adopt the first generation of Turing.

The real advantage is conferred by the fact that developers will ultimately adopt ray tracing and forego the current approach of “pre-baking” lighting effects. Developers will do this because it reduces the amount of effort and, therefore, cost associated with bringing out a new game.

Currently, the cost situation is the worst of all worlds, with developers resorting to pre-baking to maintain compatibility with the vast majority of extant graphics cards. Ray tracing, therefore, becomes an extra cost add-on.
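To illustrate the trade-off in the simplest possible terms, here is a toy sketch of the two lighting paths. It is my own illustration, not any engine's actual API, and the function names are hypothetical:

#include <cstdio>
#include <vector>

// Offline path: a build tool "bakes" static lighting into a lookup table
// (a lightmap) that ships with the game. Every scene change forces a
// re-bake, which is where the authoring effort and cost accumulate.
std::vector<float> BakeLightmap(int texels)
{
    std::vector<float> lightmap(texels);
    for (int t = 0; t < texels; ++t)
        lightmap[t] = 0.25f * t;  // stand-in for hours of offline computation
    return lightmap;
}

// Runtime path: lighting is evaluated per frame by tracing rays, so moving
// lights and objects "just work" and the offline bake step disappears.
float TraceLighting(int texel) { return 0.25f * texel; }

int main()
{
    auto lightmap = BakeLightmap(4);                 // paid at build time
    std::printf("baked:  %.2f\n", lightmap[3]);      // cheap lookup per frame
    std::printf("traced: %.2f\n", TraceLighting(3)); // same answer, no bake
}

A studio supporting both paths today pays for the offline bake and for the ray traced variant, which is exactly the "worst of all worlds" cost structure described above.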

But developers understand the ultimate economic advantage of ray tracing and, therefore, will drive adoption. At some point, when there's a critical mass of consumers with RTX cards, developers will forego pre-baking entirely. When this will occur is hard to say. It may take years.

But it's a myth, promulgated by AMD supporters, that AMD is merely waiting until its ray tracing technology is sufficiently mature. AMD has no technology equivalent to Turing and probably never will, as its “ray tracing vision” slide makes clear. Nvidia and the game developers that adopt ray tracing exclusively will be the chief beneficiaries of the transition to DXR.

Market sentiment towards Nvidia has been rather negative lately, and while I don't entirely discount market sentiment, in this case I think the negativity makes Nvidia a buying opportunity. I remain long Nvidia and rate it a buy.

Disclosure: I am/we are long NVDA, TSM. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.