
AI: Nvidia Is Taking All The Money

Summary

  • Nvidia Corporation spent nearly two decades working on hardware and software to be ready for AI’s big moment in 2023. It's no accident they are at the center of it all.
  • But this all puts a huge target on their backs because they are taking all the money. They are stealing margin from customers, and customers’ customers.
  • The resistance to Nvidia is proceeding on 3 tracks: non-GPU hardware, smaller models, and new software.
  • Nvidia’s moat is that they have the only complete hardware-software suite, and have been the default for AI research since 2012. They will be hard to knock off their perch.
  • This idea was discussed in more depth with members of my private investing community, Long View Capital.

[Image: Chipmaker NVIDIA's valuation passes $1 trillion in market cap. Photo: Justin Sullivan/Getty Images News]

Nvidia Is Taking All The Money

The main thing Nvidia Corporation (NASDAQ:NVDA) makes is graphics processing units, or GPUs. But the “graphics” part is a bit of a misnomer. What GPUs do so very well is computationally expensive parallel math, and that turns out to be exactly what AI needs.
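To make “computationally expensive parallel math” concrete, here is a minimal sketch in PyTorch (an illustrative toy of my choosing, assuming a CUDA-capable GPU; the matrix size and timings are arbitrary, not a benchmark):

```python
import time
import torch

# A large matrix multiply is the canonical GPU workload: the same
# multiply-accumulate operation runs across thousands of cores at once.
n = 4096
a = torch.randn(n, n)
b = torch.randn(n, n)

t0 = time.time()
c_cpu = a @ b                        # CPU baseline
print(f"CPU: {time.time() - t0:.3f}s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()         # finish the host-to-device copies
    t0 = time.time()
    c_gpu = a_gpu @ b_gpu            # same math, massively parallel
    torch.cuda.synchronize()         # wait for the kernel to complete
    print(f"GPU: {time.time() - t0:.3f}s")
```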

At Long View Capital we follow the trends that are forging the future of business and society, and how investors can take advantage of those trends. Long View Capital provides deep dives written in plain English, looking into the most important issues in tech, regulation, and macroeconomics, with targeted portfolios to inform investor decision-making.

Risk is a fact of life, but not here. You can try Long View Capital free for two weeks. It’s like Costco free samples, except with deep dives and targeted portfolios instead of frozen pizza.




This article was written by

Trading Places Research is a macroeconomics specialist with decades of experience identifying geopolitical factors that lead to market trends. With a focus on technology, he looks at where the sector is headed, as opposed to where investments are currently.

Trading Places is the leader of the investing group Long View Capital.

Analyst’s Disclosure: I/we have a beneficial long position in the shares of AAPL, MSFT, QCOM either through stock ownership, options, or other derivatives. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

Seeking Alpha's Disclosure: Past performance is no guarantee of future results. No recommendation or advice is being given as to whether any investment is suitable for a particular investor. Any views or opinions expressed above may not reflect those of Seeking Alpha as a whole. Seeking Alpha is not a licensed securities dealer, broker or US investment adviser or investment bank. Our analysts are third party authors that include both professional investors and individual investors who may not be licensed or certified by any institute or regulatory body.


Comments (106)

Niente
You are a bear in disguise. What I know is that they are doubling EPS q/q in Q2 and they will be trading at a 50 P/E. The point here is that even if they substantially reduce EPS growth to 25% q/q, that would still be 100%+ EPS growth y/y, which by rule of thumb would justify a P/E of 100, i.e. double the price we have now.
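A quick back-of-the-envelope check of that compounding (Python; the 25% q/q growth figure and the PEG = 1 rule of thumb are the comment's assumptions, not forecasts):

```python
# Four quarters of 25% q/q growth compound to well over 100% y/y.
qoq_growth = 0.25
yoy_growth = (1 + qoq_growth) ** 4 - 1
print(f"25% q/q compounds to {yoy_growth:.0%} y/y")    # ~144%

# PEG = 1 heuristic: a "justified" P/E roughly equal to % growth.
current_pe = 50
justified_pe = yoy_growth * 100
print(f"Implied upside vs. a 50 P/E: {justified_pe / current_pe:.1f}x")
```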
c
@Niente Bear in disguise hoping that share price crashes so that they can be bulls… happens to the best of us!
Ruger4444
I am confused, you say P/E 204 last Friday? I thought it was 64 before earnings and 50 after? Can you help me understand? 😊
R
@Ruger4444 is a DGX H100 the same as a DGX GH100?
Trading Places Research profile picture
@Ruger4444 Forward vs. trailing. I talk about forward later. I was using trailing because I didn’t have fwd data for CSCO in 2000
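For anyone else confused by the 204 vs. 64 vs. 50 numbers, a minimal sketch of the trailing vs. forward distinction (Python; the price and EPS figures are made up for illustration, not actual NVDA or CSCO data):

```python
# The same stock carries very different P/E ratios depending on
# which earnings go in the denominator.
price = 400.0
trailing_eps = 2.0     # sum of the last four reported quarters
forward_eps = 8.0      # consensus estimate for the next four quarters

print(f"Trailing P/E: {price / trailing_eps:.0f}")   # 200
print(f"Forward P/E:  {price / forward_eps:.0f}")    # 50
```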
R
@Trading Places Research I missed that part, I will read closer next time thanks for clearing that up for me!😊
Sunil Shah profile picture
@Trading Places Research
In your table of the cost of a DGX H100 setup (an AI server like the Nvidia DGX H100), you show DRAM/NAND input costs of $12,000, a tiny fraction of the total bill of $180,000, or barely 7%.
This seems incredibly small, especially with Micron saying the DRAM usage on a GPU is 6X that of a CPU data center offering.
I just want to confirm: are these figures correct?
I was under the view that AI was incredibly memory-intensive, and your schedule shows a paltry contribution from DRAM/NAND combined.

Thanks in advance.
Trading Places Research profile picture
@Sunil Shah That’s 2 TB of DDR5 RAM and 30 TB NVMe storage. There’s also the 640 GB of very expensive VRAM on the cards. Not positive but I think that’s SK Hynix.

All that is a lot. The point is that even though it is, Nvidia is taking all the money.
Trading Places Research profile picture
@Sunil Shah Another example: NVDA is putting 2x Intel’s most expensive CPU in there and look how little that is. In CPU servers, the CPU and RAM dominate the BOM.
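The BOM-share arithmetic being discussed, as a small sketch (Python; the $12,000 and $180,000 figures are the ones quoted from the article's table above, and the takeaway is the author's point, not new data):

```python
# Share of the quoted DGX H100 bill of materials going to DRAM/NAND.
total_system = 180_000    # quoted total bill for the DGX H100 setup
dram_nand = 12_000        # quoted DRAM/NAND input cost

print(f"DRAM/NAND share: {dram_nand / total_system:.1%}")   # ~6.7%
# Even 2 TB of DDR5, 30 TB of NVMe and two top-end server CPUs are
# small line items next to the GPUs, so most of the system's price,
# and margin, accrues to Nvidia.
```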
Sunil Shah profile picture
@Trading Places Research Yup, it's Hynix in NVDA. The digital memory content of the final AI bill is then minuscule.
What a pity, as I thought this AI mania would reverse Micron's malaise.
So Micron will have to wait for the hangover in the PC and mobile phone markets to wear off... I'm guessing that's a 2024 phenomenon, though I harboured some hope that near-term weakness in DRAM ASPs would be offset by the AI surge.

Appears NOT.

Thanks for insight!
F
Nvidia to me is more like Apple than Cisco. Nvidia stopped being a chip company a few years ago. Nvidia software created demand for higher-end GPUs with technology like ray tracing and DLSS. Nvidia created CUDA, which is the preferred language of AI, and that created demand for AI accelerators. Nvidia software is what will keep people coming back for more. This is not just about software features but, more importantly, software quality. Software quality is key, and Intel and AMD cannot compete.

My best guess is Nvidia will come out with multiple flavors of AI accelerators, the same as they do for the gaming and pro-viz markets. This will reduce margins, probably to the 65% level. Nvidia has a 2-year lead in HW tech. My best guess is that their market share will fall from 95% to around 80%, similar to the share seen in gaming and pro-viz. This assumes Nvidia continues executing with excellence and does not become complacent.
R
@FreedomForLiberty Trying to think about what to call NVDA now. You are right, they are not just a chip company any more; not sure what to call them. Maybe someone will think of a name using the letters from Nvidia, Microsoft and Partnerships!
J
@FreedomForLiberty
Everyone is focusing on Nvidia and their valuation. Everyone says competition will come and Nvidia can't keep 95%.
I actually agree with all of that but what nobody looks at is the possible TAM.

What if Nvidia's market share will be 50% or less in 10 years but revenue 10-15x of today?

Not possible? Look at Apple and iPhones. Who could have predicted 15 years ago that Apple would have $200B of revenue from the iPhone alone in 2022?

If Nvidia's revenue rises 10-15x in a decade isn't the valuation then justified if not even cheap?

And yes, that's the risk. High chances have high risks but maybe people buying now are considering the TAM despite competition?
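A scenario sketch of that TAM argument (Python; the 50% share and 10-15x revenue figures are the comment's hypotheticals, and the ~$27B trailing revenue is a rough figure, not an exact one):

```python
# What the comment's 10-year scenario implies about market size.
revenue_today = 27.0          # ~$27B trailing revenue, rough
share_today = 0.95            # market share figure used in this thread

revenue_10y = revenue_today * 12.5     # midpoint of "10-15x"
share_10y = 0.50

tam_today = revenue_today / share_today
tam_10y = revenue_10y / share_10y
print(f"Implied market today: ~${tam_today:.0f}B")     # ~$28B
print(f"Implied market in 10y: ~${tam_10y:.0f}B")      # ~$675B

# Even while losing half its share, revenue compounds at ~29%/yr:
print(f"Implied CAGR: {12.5 ** (1 / 10) - 1:.0%}")
```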
prudent 576 profile picture
Saying Nvidia is taking all the money is akin to saying the rich aren't paying their fair share.
T
@prudent 576 hardly so! Don’t you have B. Sanders of chip companies?
Sunil Shah profile picture
Great Article and insights. Thanks.
You write
"So what is everyone doing about it? So far, three things:
Hardware: “AI accelerators” are alternative hardware that can do the same work at much lower cost."

So how complete is the AMD solution with their MI300 accelerator offering?
Is the potential cost saving large enough for an AI project to be motivated in favour of AMD?

Related question: I agree with you that Nvidia has cornered the AI niche with their complete offering, both hardware and CUDA software.
But is the AMD solution via ROCm (software) getting close to being a realistic contender?
It's obviously NOT in the interest of the large CSPs to be entirely dependent on Nvidia, as they are now.
So one would think the CSPs would foster and nurture AMD as the best hope for an alternative.
Are there signs that AMD is making headway here?
If so, what are they?
Thanks in advance.
Trading Places Research profile picture
@Sunil Shah No headway that we can see above water, but there is something happening beneath the surface. I’m pretty sure MSFT and AMD are getting cozy, and we’re going to see something out of that soon. FWIW, the people I talk to still prefer CUDA.

Keep in mind, these are all recent developments since large models began moving into production, and the pieces on the board are shifting. Like I said at the end, the pace of developments is staggering.
Sunil Shah profile picture
@Trading Places Research ok thanks for that, great article again - well done!
t
If AI is going to be that powerful, no technological moat is safe. AI makes the future extremely uncertain, so pricing in such massive NVDA earnings growth is insane.
H
The analogy to Cisco makes way more sense to me when you look at the period of 1990-2000. That was the decade in which Cisco ruled the tech landscape with no competition in the networking sector, and their stock split 9 times along the way. This is what NVDA now enjoys, but for how long is, of course, the question.
d
@Happy Jack
Exactly, NVDA will have its run in the sun before losing share. The question is how long that will last. At this point, ride it up and watch the landscape below for competitors to advance.
That timeframe is measured in years.
Josh Borenstein profile picture
@deeminimus Echoes my own investing thesis on $NVDA. Once competition finally does catch up to them, it will be time to start selling down my position. But that's years away.
productivityLeaps profile picture
I'm quite frankly surprised by two things. First, the bullet points provide a good, not great, synopsis of NVDA's LONG-TERM investments in accelerated compute toward AI. I'm told that for at least a couple, and probably more than a couple, quarters at the outset of Jensen's commitment to CUDA, he was chastised by Wall St. for "not making the quarter."

You're the first author on SA to identify the significance of the duration of this long-term investment, which runs against the short-termism that guys like Milton Friedman and Jack Welch made canon on Wall St. At least Jack tried to rebuke himself, albeit waaaay too late, when he was nearing death, it was reported.

Second, why would you characterize NVDA as a thief? Isn't it expected that the company should earn a return, even one that appears to you and others to be outsized margins? After all, they spent ~20 years and who knows how many dollars and engineer-hours nurturing their investment.
J
IF YOU BUILD IT, they will come. Is that really so hard to grasp?
Usta Stockman profile picture
At 35x sales, they have already taken all the money.
This stock has been bid up for the next two generations.

Asinine valuation (bid up on AI hype).

Would strongly advise all to take profit.
productivityLeaps profile picture
@Usta Stockman How many shares of NVDA did you purchase and when?
H
@productivityLeaps He's short of course.
M
The harsh truth is they simply do not make enough money for their current valuation.

The common misconception is that AMD and Intel are not also working on AI super-compute hardware... They are! Remember when they said AMD would never overtake Intel in server compute? Well, their MCM packages are going to be a thorn in Nvidia's side for a very long time (just as they were for Intel). Nvidia is still doing monolithic architectures that just won't scale in compute via Moore's law. Software is currently their biggest saving grace.

Nvidia will do well as a company, but the current valuation is not justifiable by any means.

Sales should be a minimum of $100B (~4x the current amount) to even get close to the $1T market cap.
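The price-to-sales arithmetic behind that (Python; the $1T market cap is from the article's headline, the ~$27B sales figure is rough, and the 10x multiple is inferred from the commenter's $100B threshold):

```python
# What "$100B minimum sales" implies about the multiple being paid.
market_cap = 1_000e9          # ~$1T, per the headline
sales_now = 27e9              # rough trailing sales
print(f"P/S today: ~{market_cap / sales_now:.0f}x")    # ~37x

sales_needed = 100e9          # the commenter's threshold
print(f"P/S at $100B sales: {market_cap / sales_needed:.0f}x")  # 10x
```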
J
@Marco A Have you tried oil companies?
G
Maybe I'm just old, but I'm hesitant to put my money in tech stocks nowadays... I feel more comfortable investing in healthcare.
J
@Gregmax Too bad GOVERNMENT is involved. Of course, it always sticks its finger into profits.
S
I was hired out of graduate school by IBM, so I know their downfall. Actually, a good part of it was the FTC.
The FTC came after IBM with force for their dominance in mainframes and the office products printer division. The FTC was going to split IBM up. The FTC forced IBM into a secret agreement to slow down innovation for 10 years. Very few people are aware.
During those 10 years, DEC and DG were innovating in minicomputers, disrupting IBM's mainframe model. Then it was Apollo workstations disrupting the minicomputer model. You could add many other innovative technology companies into the mix.
There are a lot of bright people in the world of computing.
A little more than 6 1/2 years ago I began looking for the next breakthrough in technology.
I did research for over a month.
You're very correct: in 2005 Nvidia began research in AI for GPUs and AI CUDA software.
Nvidia was providing funding for AI research and development to scientists, and funding AI research in universities.
AlexNet received their funding from Nvidia $$$
Nvidia also provided free GPUs and CUDA software to the scientists and to the universities.
Nvidia provided hardware and software engineers for co-development at no charge.
In 2017 Nvidia released the industry's first AI tensor core GPU. Nvidia is now on its 4th generation of AI tensor core GPUs.
In 2016 Nvidia built their first AI supercomputer on an assembly line. It is still the only AI supercomputer available today in 90 days that is built on an assembly line.
In 2016 Jensen Huang personally delivered that first AI supercomputer to OpenAI, and personally signed it.
You can Google it.
It was a gift, free of charge, to OpenAI for the development of large language models / generative AI.
OpenAI developed ChatGPT on an Nvidia AI supercomputer with CUDA AI software.
Nvidia also provided hardware and software engineers to OpenAI for co-development.
Nvidia also provided funding to OpenAI for development $$
Nvidia started building these relationships with scientists and universities in AI research and development around the world in 2005.
No other computer company, no other software company, was doing this at that time.
That's why they're so far ahead. Nvidia is in the DNA of LLMs, ML and generative AI.
Co-developed all the way.
Nvidia helped create this groundbreaking research and development with funding $$$
Their relationships are very deep with everyone in the AI community around the world.
Neither Intel nor AMD has an artificial intelligence tensor core GPU. Their GPUs are plain-Jane accelerator GPUs, not based on artificial intelligence.
Neither Intel nor AMD has any artificial intelligence software or AI software platforms.
They just started developing software stacks.
They just started developing vertical market applications.
That takes years of development and a plethora of engineers.
As I said, Nvidia already has a plethora of engineers, and has had them for years. They have both software engineers and hardware engineers.
At Computex 2023 Jensen announced they have 3 million CUDA developers.
60% of the largest companies in the world are already using Nvidia's Omniverse.
Nvidia has built it all into a giant AI moat, intertwining AI hardware, AI software and AI systems with CUDA software.
So your statement that Intel's or AMD's accelerators are equal to Nvidia's is inaccurate.
As I said, they're plain-Jane dumb GPUs.
Nvidia's AI tensor cores split the work across separate tensor cores on their 4th generation GPU.
It's controlled by AI CUDA software.
Competitors are about 18 years behind.
Nvidia formed a separate AI division in 2006 for AI GPUs and CUDA software.
Nvidia today has more software engineers than hardware engineers.
The new H100 GPUs cost $36,000 each.
Not an inflated price; that's the list price.
The older A100 GPUs are $15,000 each.
Nvidia is now shipping their new revolutionary Grace Hopper combo CPU/GPU based on Arm IP. Nvidia announced patented low-power technology developed from the smartphone industry. Nvidia introduced a new revolutionary Ethernet switch. Broadcom's Ethernet switch, called Tomahawk, just got scalped by Nvidia.
If you'd like to learn about Nvidia's announcements at Computex 2023: Jensen Huang was honored to give the keynote speech.
Jensen's presentation was 2 hours of breathtaking, revolutionary introductions by Nvidia. Make sure you change the YouTube settings to 4K video.
https://youtu.be/i-wpzS9ZsCs

I'll end with this: after doing my research 6 1/2 years ago,
I purchased a boatload of Nvidia stock. It was my largest holding at that purchase and still is in my IRA. I haven't sold a single share.
My career was in the OEM computer hardware, software and systems business.
That makes it a little easier for me to see over the horizon to the future of technology.
I was fortunate.
My average price today, after the 4-for-1 split of July 2021, is $29.
I'm up 1,250%.
I have more conviction today than I did when I purchased 6 1/2 years ago.
Yes, someday Nvidia will get taken out.
It won't be Intel and it won't be AMD. They're 18 years behind in all areas. Intel introduced the x86 around 1978, and there have been modifications by AMD and Intel since.
But it's based on old technology.
It will be a new technology, a new breakthrough; then the torch will pass. AMD has been attacking Intel with superior CPUs for over 5 years. They're only at 30% market share in data center, while Intel still maintains leadership in PCs.
Next year AMD may increase its data center market share to 40%; they've done a great job.
My point: it doesn't happen overnight, particularly when you're intertwined with hardware, software, semiconductors and...
Protecting your moat.
Good luck in your investing
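A quick consistency check on the return math a few lines up (Python; the $29 average cost and "up 1,250%" are the comment's own figures):

```python
# Does a $29 split-adjusted cost basis square with "up 1,250%"?
avg_cost = 29.0
pct_gain = 12.50              # 1,250% expressed as a multiple

implied_price = avg_cost * (1 + pct_gain)
print(f"Implied share price: ~${implied_price:.0f}")   # ~$392
# NVDA traded near that level in mid-2023, so the comment's figures
# are internally consistent.
```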
p
@SirLiberte Are u a bot?
J
@SirLiberte I bought the old IBM OS. I could still slap their DOS onto one of my computers, for the very reason you mourn it. It simply didn't win. Yes, through government contracts. My God, it was about 1990! Let's do a George Harrison album: "All Things Must Pass."
The reason our air traffic control systems and OTHER things break down is that no one wants to advance coding. You get PERMISSION to freeze the old ways in place from GOVERNMENT AGENCIES. Welcome to the STONE AGE!
S
@Jeff_in_NM I totally agree. Back in the day I was hired out of grad school by IBM as a GEM specialist:
Government, Education, Medical, in Boston. All the top federal agencies, all the top universities and all the top hospitals in Boston, including the Massachusetts State House and the Massachusetts Department of Education.
Once you're in, you're in.
You almost need an atomic b... to move on to new technology. I totally agree with your assessment.
Good luck in your investing
Joe Psotka profile picture
Where is non-von Neumann computing in all this? Nvidia GPUs are great for parallel computing, but they don't do SNNs like memristors and Intel's Loihi chips do. Intel may well come back when SNNs start to model large language systems.
PleaseJustNo profile picture
@Joe Psotka The only other forms I have heard of are optical and analog, both of which can do neural nets, but neither of them has as of yet progressed out of the lab. Quantum computing too might assist with faster model development at some point.
S
@Joe Psotka Not only are Nvidia's GPU tensor cores now in their 4th generation, specifically designed for AI, machine learning and generative AI;
they are parallel processing GPUs with AI technology.
Nvidia's main CUDA software, and Nvidia's AI vertical market operating systems and AI applications, have all been designed and built for parallel processing. No other competitor has this.
Their new revolutionary Grace Hopper GPU/CPU is based on Arm IP.
Arm IP is parallel processing technology.
Fujitsu was the first to string thousands of Arm CPUs together, in 2020.
It became the fastest supercomputer in the world.
Three years later, the Fujitsu supercomputer is still the 2nd fastest supercomputer in the world. You can Google it.
Back in the day I worked for Fujitsu in Silicon Valley.
Nvidia's new Ethernet switch, Spectrum-X, is a parallel processing Ethernet superswitch.
Net net, whether it's GPUs, CPUs, the combo Grace Hopper CPU/GPU, or CUDA software of any kind, it was all designed for parallel processing.
What does that mean? You can line them up practically to infinity, in a thousand, 10,000, 100,000, in combination with CUDA.
The CUDA OS platform, the system operating software and the AI application software will look at it all as one CPU, one GPU, one DPU, one NIC card and one Ethernet superswitch, not as individual GPUs, CPUs, DPUs and superswitches. They can be upgraded in a drawer in a sliding rack, and the system will automatically reconfigure.
Intel and AMD are built on x86, which dates from the late 1970s, with improvements.
It's a two-lane highway of X's and O's. They can't do parallel processing at this scale.
The system and software will not look at them packaged together as one. Software will look at them individually; they're not based on parallel processing.
Broadcom's Tomahawk series 5 supercomputer switches cannot do parallel processing. They are basic dumb switches.
The new Spectrum-X superswitch by Nvidia is revolutionary in its design, in combination with CUDA software, for parallel processing, managed load balancing and high-speed accelerated throughput.
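For readers wondering what "the software sees one GPU" can look like in practice, here is a loose sketch using PyTorch's DataParallel wrapper (a deliberately simplified toy: real clusters use NCCL, InfiniBand and DistributedDataParallel, and none of this is Nvidia's actual system software):

```python
import torch
import torch.nn as nn

# One logical model; if several GPUs are visible, the wrapper splits
# each batch across all of them and gathers the results transparently.
model = nn.Linear(1024, 1024)
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
model = model.to("cuda" if torch.cuda.is_available() else "cpu")

device = next(model.parameters()).device
x = torch.randn(256, 1024, device=device)
y = model(x)      # the caller addresses a single device, not N GPUs
print(y.shape)    # torch.Size([256, 1024])
```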
AjitMD profile picture
NVDA has been developing AI hardware and software solutions for 15+ years. They have these solutions ready. Competition will take time to develop… years. Meanwhile, end users need solutions now or they're left out. NVDA guided to $11B in sales for Q2. H100 solutions are sold out for a year. We will see significant sequential growth going forward.
J
@AjitMD
Yes, and H100 pricing is like 2-3x that of the A100. So Nvidia's DC revenue will already grow 2-3x over its 2020-2021 level if they only sell the same number of units. But if the unit count also grows, then you can only guess what DC revenue to expect...
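The unit-price arithmetic, using the list prices quoted earlier in this thread (Python):

```python
# H100 vs. A100 price step-up, per the list prices quoted above.
a100_price = 15_000
h100_price = 36_000
ratio = h100_price / a100_price
print(f"H100/A100 price ratio: {ratio:.1f}x")   # 2.4x
# At flat unit volumes, data center revenue scales by that same ~2.4x;
# any growth in units sold compounds on top of it.
```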
S
Great article! I would find it hard for Azure or AWS to skimp on computational ability for accelerators. That would be a hard pitch to customers. However, as you point out, not everyone needs that power. We'll see. What do you think NVDA will climb to?
Trading Places Research profile picture
@StateStreetBoston Thanks! I have no opinion on the price until it stops being a meme stock. I stay away from things I don’t fully understand.
S
@Trading Places Research Nvda a meme stock? That’s a first
Doctor_ECE_Prof profile picture
@Trading Places Research
"I think Nvidia has to make an AI accelerator to protect that flank eventually, even if it undercuts their margin and growth. If they don’t do it, someone else may do it for them."
Amen. I can add another example of what greed, lock-in to one lucrative path, complacency and a change in the general direction can do: "The once great DEC, their sought-after VAX computers and their own proprietary VMS OS." When computers built on microprocessors, along with free open-source UNIX (incidentally developed on DEC hardware), arrived on the scene, DEC half-heartedly made the MicroVAX, but the bean counters tried to preserve the cash-cow VAX by stalling on the MicroVAX. Result: the demise of DEC.
Trading Places Research profile picture
@Doctor_ECE_Prof The first computer I ever used was a PDP-11, which my public school system had bought, and no one knew how to use except me and a few others, so we had it to ourselves. DEC really blew that.
Doctor_ECE_Prof profile picture
@Trading Places Research
I was working on my Ph.D. in signal and image processing, and my thesis advisor got one PDP-11 and gifted it to me! It was great to move away from the main computer (KRONOS OS?) with its decks of cards, learning to program highly computational problems on that 64 KB (yes, K, not M or G) of memory, swapping in and out.
In my career as a professor and researcher, I saw fewer and fewer kids like you or your friends. Most have become fearful of, or hate, hands-on geeky work.
Trading Places Research profile picture
@Doctor_ECE_Prof Well, language models are going to write all the code anyway ;)
d
Excellent article! Thank you very much for helping us non-techies understand the progression of AI. Interesting Hype Cycle chart.
From the Oxford Dictionary, ‘hype’ as a verb, “promote or publicize (a product or idea) intensively, often exaggerating its importance or benefits.”
With so many applications, the most important to business enterprises being reductions of headcount, I think the 'hype' angle is overplayed and the progression of AI is more of a paradigm change. JMO
Although I’m holding some shares of NVDA, I’m actually more interested in the builders of the machines that make the chips work.
For now, I’m holding more of SMCI with a lower float and the possibility of them splitting, following the same path as Dell in the early 1990’s. www.reuters.com/...
We’ll see.
Trading Places Research profile picture
@deeminimus Thanks!

1. I view AI as like the internet — a 1-time rolling shift in everyone’s margin profile. Mostly better, but worse for some.

2. On SMCI, I am undecided. Scroll back up to the DGX-H100 cost breakdown and look at “case/mobo/etc”. That’s their piece.
d
@Trading Places Research Yes, I get that.
SMCI has been partnering with other AI chip makers in the space also, leading to a wider customer base with machines customized for them and spreading out risk.
Trading Places Research profile picture
@deeminimus I only started thinking seriously about them recently, so I don’t have an opinion yet.
S
Glad I bought it 6 years ago.