NVIDIA Corp. (NVDA) Presents at 4th Annual Needham Automotive Tech Day Conference (Transcript)

NVIDIA Corp. (NASDAQ:NVDA) 4th Annual Needham Automotive Tech Day Conference Call June 3, 2020 11:00 AM ET
Company Participants
Danny Shapiro - Senior Director, Automotive
Conference Call Participants
Raji Gill - Needham & Company
Raji Gill
So, welcome everybody. My name is Raji Gill from Needham & Company. We're very pleased to have NVIDIA presenting with us at our 4th Annual Automotive Tech Conference. With us is Danny Shapiro, Senior Director of the Automotive Division at NVIDIA, as well as Stewart Stecker, Director of Investor Relations. Danny is a 25-year veteran of the computer graphics and semiconductor industries and has been with NVIDIA since 2009. So, it's great to get his expertise and insight. The format of the presentation will be about 20 to 30 minutes on NVIDIA's side, and then we'll open it up to questions for 10 to 15 minutes. I'll kick it off. If there are any questions that investors have, there's a chat box at the bottom of your screen; you can write a message and I'll ask the questions on your behalf.
So, with that, let me hand over to Danny.
Danny Shapiro
Great. Thanks so much. It's really a pleasure to be with you today.
What I want to do is spend a little bit of time and just give you some insight into the work that NVIDIA is doing in the automotive industry. And again, everything here is subject to the usual financial disclosures about forward-looking statements.
So, NVIDIA of course started out as a gaming company, and that's still a very big part of what we do, and there's great growth there. But, as you know, AI is transforming every industry and NVIDIA is at the heart of that AI revolution. There are really two aspects to AI. First, there is the development of the AI, the training, and we play a very active role in that; our data center business is innovating in that space. We just had new announcements with our new Ampere architecture, and data center growth is enormous. But then, there's also the aspect of what we call edge AI, or inference, where sensors are taking data in and the AI has to understand the environment. And it's a continual cycle of reasoning and acting and planning.
In the automotive space, we have solutions that are designed to enable autonomous vehicles, creating the brain of an AI system. Our DGX systems are used in the data center, and it's the exact same architecture that goes in the vehicle on a platform we call NVIDIA DRIVE AGX, our autonomous processing platform for vehicles and robots and even healthcare. So, again, we have a full end-to-end system that we've developed to enable everything to ultimately be autonomous. We believe this will be the case. Varying levels of automation and autonomy will be across cars and trucks and all kinds of robots and delivery vehicles and mass transportation, as well as specialty vehicles in agriculture and mining.
The key thing here is that what we've developed is not just a part that goes in the car, but a full end-to-end platform. It starts with data collection. So, whether it's NVIDIA or customers out there, there are human-driven vehicles as well as automated vehicles with sensors on them that are collecting data. It's used for mapping, it's used for training. There's a massive amount of information that needs to come in for us to then be able to train on how to recognize lane lines, recognize pedestrians, recognize street signs. It all comes down to many different neural nets that are running in the vehicle. And each of those neural nets is very compute-intensive and needs to be trained on massive amounts of data and continually refined and updated.
Before we actually put it on the road, though, we want to test and validate. And this is where simulation plays a very large role. So, we’ve developed our Constellation platform that allows us to do full hardware in a loop and software in a loop testing and validation before it actually goes on the road.
And then, we have our platform for inside the vehicle. And again, this is the DRIVE platform that is running different software stacks. We have our DRIVE AV, which is the autonomous vehicle stack that's doing all the sensing, perception, mapping, localization and task planning. And then, our DRIVE IX software stack, which is our intelligent experience, will leverage sensors inside the vehicle, perhaps driver- or passenger-facing cameras, to do driver monitoring, to detect if somebody's drowsy or distracted, and that will integrate with the outside sensors to enable new safety or convenience features.
So, again, this is the industry's only platform that goes from end-to-end, a single architecture that the software that's been developed and trained on NVIDIA is then running on NVIDIA inside the vehicle.
As I mentioned, there's a variety of different deep neural networks that need to be running inside the cars. And so, we've developed dozens of these that we make available to our customers, and our platform is open so they can build their software on top. These different neural networks are taking the camera, the radar, the LiDAR, the ultrasonic data, and are trained to detect objects, or to also understand what's free space, which is basically the opposite of detecting objects: where is it safe to drive? We can then run algorithms on these to understand distance, to detect different weather conditions or levels of road congestion. We can do map localization, fusing cameras with LiDAR; we can plan paths; we can anticipate trajectories of other vehicles. The list goes on and on. And this is really where the crux of the challenge is. We're developing software, our partners are developing software, because we want to be able to ensure that these vehicles are safe on the road. So, again, a diversity of different kinds of algorithms and redundancy is key to ensure safety.
To that end, again, we want to be able to test and validate everything. And so, we've developed our DRIVE Constellation. This is a two-server unit. One server is generating the environment in virtual reality, generating the synthetic sensor data based on real-world experience. So, we're simulating all those sensors. And then, we're feeding that into the second server, which is the DRIVE AGX computer. It's operating the full software stack as if it was driving on the road, but in a virtual or simulated environment. And then, the driving commands for steering, acceleration and braking are sent back to the simulator, and we're able to do this 30 times a second.
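To make that closed loop concrete, here is a minimal sketch, entirely hypothetical and not NVIDIA's actual Constellation API, of an environment server and a driving stack exchanging synthetic sensor data and control commands at a fixed 30 Hz tick. All class and field names are invented for illustration.

```python
# Hypothetical hardware-in-the-loop style closed loop: a simulator
# produces synthetic sensor readings, a driving stack returns control
# commands, and the two are stepped together at 30 Hz.
from dataclasses import dataclass

TICK_HZ = 30  # the talk cites 30 simulation steps per second

@dataclass
class Controls:
    steering: float   # positive steers back toward lane center
    throttle: float   # 0..1
    brake: float      # 0..1

class Simulator:
    """Stands in for the environment server generating sensor data."""
    def __init__(self):
        self.lane_offset = 1.0  # meters from lane center

    def sense(self):
        # In reality this would be camera/radar/LiDAR frames.
        return {"lane_offset": self.lane_offset}

    def apply(self, c: Controls):
        # Toy vehicle model: steering reduces lane offset each tick.
        self.lane_offset -= c.steering * 0.5

class DrivingStack:
    """Stands in for the AV computer running the full software stack."""
    def decide(self, sensors) -> Controls:
        # Simple proportional controller back toward lane center.
        steer = max(-0.3, min(0.3, sensors["lane_offset"] * 0.4))
        return Controls(steering=steer, throttle=0.2, brake=0.0)

sim, stack = Simulator(), DrivingStack()
for _ in range(TICK_HZ * 2):           # simulate two seconds of driving
    controls = stack.decide(sim.sense())
    sim.apply(controls)

print(abs(sim.lane_offset) < 0.05)     # vehicle converged to lane center
```

The point of the sketch is only the loop shape: because the stack never knows whether its sensor frames came from real hardware or a simulator, the same software can be exercised either way.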
So, this is a full hardware-in-the-loop simulation that lets us take a vehicle that might only be operating a couple of hours a day and now operate it in the cloud 24/7. This is a way that we can rack up all of these servers, and this becomes a whole fleet of vehicles that are not just driving on the roads. Testing on the roads is essential, of course, but usually those are boring tests where nothing happens, a good thing of course. But, really, what we want to be able to do is test those really dangerous, challenging, hazardous scenarios. And they often won't occur even after days, months or years of testing.
And so, in simulation, we can have these edge cases, or corner cases as they're called: kids running out in front of cars, people running red lights, blinding light at sunrise or sunset. The kinds of things you don't encounter very often, we can put in simulation and test over and over. And so, the ability to do this enables our partner companies to test and validate, and basically recreate and run regression over time to ensure that the software is handling all these corner cases.
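The regression idea can be sketched as a tiny scenario test harness. The scenario names, distances, and pass criterion below are invented purely for illustration; a real harness would replay full sensor-level simulations rather than a one-line physics check.

```python
# Hypothetical corner-case regression harness: each rare scenario is
# encoded as a parameterized test the stack must pass on every software
# update, instead of waiting for it to occur on real roads.
scenarios = [
    {"name": "child_runs_into_road", "obstacle_distance_m": 8.0},
    {"name": "red_light_runner",     "obstacle_distance_m": 15.0},
    {"name": "sun_glare_low_angle",  "obstacle_distance_m": 25.0},
]

def stopping_distance_m(speed_mps: float, decel_mps2: float = 6.0,
                        reaction_s: float = 0.1) -> float:
    """Reaction travel plus kinematic braking distance v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

def run_scenario(sc: dict, speed_mps: float = 10.0) -> bool:
    """A scenario passes if the vehicle can stop before the obstacle."""
    return stopping_distance_m(speed_mps) < sc["obstacle_distance_m"]

results = {sc["name"]: run_scenario(sc) for sc in scenarios}
print(results)
```

Note that with these made-up numbers the closest scenario fails, which is exactly the kind of result you want to surface in simulation, not on the road.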
The other thing, of course, as part of our end-to-end solution is the software stack. So, we've developed a massive amount of software: the base platform that goes in the car, software applications on top, tools, libraries, deep neural networks, so that our customers are able to build their applications. At the core layer, we have our DRIVE AV, autonomous vehicle, and DRIVE IX, intelligent experience, software stacks. And then there are all kinds of libraries in our DriveWorks for sensor processing, for mapping and localization and path planning. So, the full software stack is open to our customers to build their applications on top.
Of course, then there is the hardware that it runs on. Our DRIVE platform is being used by hundreds of companies, from car makers to truck makers to shuttle companies and robotaxi companies. And it's a single architecture that scales. Each new generation of our technology is backwards compatible, so it's a seamless transition for the software. We recently introduced NVIDIA Orin. It's our newest SoC, or system on a chip. It's a 17 billion transistor chip. Inside, it has many different processors: there are ARM CPUs, a new Ampere GPU, programmable vision accelerators and deep learning accelerators. Again, this is a whole computer system on a chip designed for autonomous vehicles. So, many different types of algorithms are running simultaneously, providing the diversity and redundancy required for safe operation. It's a seven-times performance boost over our previous generation. So again, we've been able to make it more energy efficient and increase the performance dramatically.
Now, with this new architecture, with our Ampere architecture on the GPU side, it's much more energy efficient, and we're able to broaden our offering. Again, the key thing here is a single architecture. Our customers that we've been working with on developing higher levels of automation, Level 2 plus to Level 4 and Level 5 robotaxis, are asking us for help at that entry-level segment.
Traditionally, they've been working on a totally different architecture with different vendors developing front-facing cameras. And it really is too expensive, in terms of dollars, in terms of manpower and engineering. So now, with our new Orin and Ampere architecture, we can span the entire range, from the entry level with just a simple ADAS solution, a single front-facing camera. The new entry-level Orin will operate at just 5 watts, so it's energy sipping, delivering 10 TOPS of performance, 10 trillion operations per second, to be able to do a front-facing ADAS solution, basically a five-star NCAP type of offering. But, based on the same infrastructure, AI training and software stack, that will span all the way up to our new 2,000-TOPS performance at the high end for robotaxis.
So, again, for the first time, an automaker has a single architecture, single software development effort, they can now put this solution in every single vehicle in their lineup and leverage that unified engineering approach. And again, this is software updatable. So over time, they can add new features, new capabilities. This opens up new business models for these automakers as well. This is something that's really key and a huge inflection point for the industry.
So, at that entry level, again, it's a single scalable architecture, but a small front-facing camera unit mounted in the windshield with our entry Orin, our ADAS solution. It has the Ampere GPU, and it will deliver an incredible 10 trillion operations per second at just 5 watts of power.
At the other extreme, we're now combining two of the high-end Orin SoCs, each able to deliver up to 200 TOPS, plus two new Ampere GPUs, for a grand total of 2,000 TOPS, able to run many different applications simultaneously. It delivers 6 times the performance of our current Pegasus offering, which is what is in development for many of the robotaxi companies, and again greater performance per watt. So, energy consumption has gone down dramatically.
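As a back-of-the-envelope check, the figures quoted in the talk (10 TOPS at 5 watts at the entry level, 2,000 TOPS at the high end, 6 times Pegasus) imply the following. The power draw of the high-end configuration is not stated, so efficiency is only computed where both numbers were given.

```python
# Arithmetic on the performance figures quoted in the talk; no numbers
# here beyond those are claimed.
entry_tops, entry_watts = 10, 5        # windshield-mounted ADAS unit
robotaxi_tops = 2000                   # dual Orin + dual Ampere GPU config
pegasus_tops = robotaxi_tops / 6       # implied by "6 times Pegasus"

print(entry_tops / entry_watts)        # 2.0 TOPS per watt at entry level
print(robotaxi_tops / entry_tops)      # 200x span on one architecture
print(round(pegasus_tops))             # ~333 TOPS implied for Pegasus
```

The 200x compute span on a single, software-compatible architecture is the scalability claim the rest of the section builds on.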
What I want to do is show you a little video clip now. Over the bandwidth here, you might not see the full frame rate. But, this is a recent drive in one of our test vehicles. We have a fleet -- a small fleet compared to our customers, but we need to understand the complexity and the full challenge ahead of our customers, and then we're able to develop our hardware and software by understanding the full complexity of the problem. So, you can see this is our new headquarters, called Endeavor. And I'm going to go ahead and play this video. You can see one of our test vehicles on the road.
[Audio/Video Presentation]
Again, there is obviously a safety driver inside the vehicle, but it's doing all of the sensor processing. There is a neural network for detecting lights, for detecting lanes, for detecting other vehicles. So, the driver doesn't touch the steering wheel; he's there to take over if he needs to, and the system is able to detect everything on the road.
So, I'm not going to play the full video; it goes on for several minutes. But we're now able to drive all around the roads of Silicon Valley as we do tests. Tests are currently on hold due to the COVID-19 situation, but the benefit that we have, again, with our end-to-end solution is that we shifted to doing all of our testing in simulation in the cloud.
And so, I want to advance to the next clip, and this will show the same kinds of drives, but now fully in the simulator. So, leveraging the technology we have from a graphic standpoint, as well as modeling the physics, physical dynamics. And now, all the sensor processing, we’re taking simulated inputs into our software we're running. And we're doing all the same kinds of detections on the simulated data and can test and validate that the complete system actually works, before it goes on the road. And so, our engineers are at home working and being extremely productive, even while in stay-at-home orders.
So, our ecosystem that's leveraging our technology is vast. There are hundreds of companies that are building and testing and deploying their vehicles on NVIDIA DRIVE. And we've made announcements, of course, with big OEMs like Mercedes-Benz, Volvo and Toyota. In China, we have a number of automakers as well. The trucking industry is doing a lot of development work. In fact, during recent pandemic months, we've seen a shift: there are fewer people out on the road, so ride-sharing has gone down, and the robotaxi business may look at risk. The reality is it's still very healthy long term, but we've also seen a pull-in of delivery and trucking needs. So, a lot of long-haul trucking and a lot of last-mile delivery activity will be going autonomous.
We work with hundreds of startups and software companies, as well as the mapping companies, the sensor companies, the simulation companies. It's a very complex ecosystem, with a lot of different players, and they're all developing on the NVIDIA DRIVE platform.
So, why do people come to NVIDIA DRIVE platform? I think, there's several key things I talked about. I just want to wrap up here before we go to questions, but some of the key differentiators that we have. The first again, it's an end-to-end platform. I think that term is used a lot. But really there's nobody else in the industry that has the complete breadth that NVIDIA has that starts in the data center with the data coming in the collection, the archiving, the curating of that data, it's all on NVIDIA. Even our competition is using NVIDIA in their data centers.
So, the algorithms are developed on our architecture. There is the testing and validating in simulation on the same architecture, and then it goes into the vehicle. So, it's seamless. There's no porting of code. It's really a huge advantage. The other thing that is pretty unique to NVIDIA is, again, a single architecture in the vehicles, not different chips. It's not some CPUs or some different GPUs or some different deep learning accelerators. It's a single architecture that spans, again, from ADAS, a front-facing camera at 5 watts, all the way up to our robotaxi solution. Another key differentiator is that it's open. We do not have a closed, black-box sort of system; our software platform is open for developers. They can take our DRIVE OS, they can use our libraries, they can use our algorithms, our DriveWorks or DRIVE IX stacks, and they can customize and build their own application. This is key in that no automaker wants to outsource their future, which is AV. So, again, this open, software-defined, software-updatable architecture is key to the future. And again, we have a vast ecosystem. So, regardless of the kind of sensor they want to use, we work with the Tier 1 suppliers, of course. We're not competing with them; it's a partnership between NVIDIA, the Tier 1 and the OEM to build these systems, and to integrate different maps, because of course we want to have a global vehicle that can operate on any maps. The China maps, of course, are different than Europe's maps, which are different from United States maps. But having a vast ecosystem of sensors, maps and software, we can deliver a global solution to our OEM customers.
So, again: end-to-end, a single scalable architecture, and our vast ecosystem. We're seeing great promise. It's a long-term play for us. There are ups and downs, and it can be lumpy in terms of the revenue stream, but NVIDIA is committed, as Jensen has said. We're really excited about the opportunity. We have many recent design wins that we've announced and many more to come.
With that, I'll open it up for questions. Thanks, Raji.
Question-and-Answer Session
Q - Raji Gill
Yes. So, thank you, Danny, for that. That was very informative.
Before we get into the technology, I just want to get your thoughts on the market size and your design pipeline. So, you have a single architecture. You have very aggressive solutions for various products. Can you describe your engagements? Have they been on Level 2 plus or Level 3, Level 4? We know that Level 4, Level 5 has been pushed out further than what we initially thought because of regulation, liability and uncertainty around the market. So, I wanted to dig deeper in terms of your engagements for Level 2, Level 2 plus, which seems to be what the market will be over the next 10 years. And how is your solution maybe more cost effective, or lower power, than the competition?
Danny Shapiro
So, you're right. We have a lot of different engagements, a lot of different programs in play right now. Some are going into production. We just announced with Xpeng, a Chinese new energy vehicle maker. They've been taking orders on their P7 sedan, and it actually starts deliveries this month. That's based on the NVIDIA DRIVE AGX Xavier, our in-production part. That's what Toyota is using, Volvo is using. And so, those, like you said, will be kind of Level 2 plus and in production starting now, over the next several years. The benefit, of course, is that it's all part of the same software-updatable architecture, so these cars get better and better over time.
We also, of course, have a lot of robotaxi-type engagements, with companies like Zoox and Aurora. Those, potentially in large volumes, are still years out. But it's really a great development effort and it's still ongoing; we're not seeing any slowdown in that. What we'll see now, of course, with our new ADAS offering, is that's where the volume is. We're going to be able to start announcing those contracts soon, and then those will take several years to get into production. And so, what you see today, from a revenue split, is that we're roughly 50-50 in terms of the legacy infotainment business and our AV development. We've announced, of course, that the infotainment business will go down a bit in the next quarter due to the overall state of the auto industry, with production slowed and new consumer sales also taking a hit. And again, in a few years, we'll see, I think, a sizable ramp for AV.
Raji Gill
Would you agree that the market available to you, or the industry, is really Level 2 to Level 3 for the next 10 years or so, or longer, before we get to a more autonomous vehicle kind of market? And if that is the case, then there's existing competition there. I'm just wondering how your solution will compete in Level 2, Level 3, where the requirements are not as advanced?
Danny Shapiro
I think, what we will see is the deployment of higher end system in more and more vehicles, even if the software isn't activated. And so, again, there's one particular company out there in the market that's taken this approach has a very high valuation because they're putting an AI computer in the car. They're putting sensors all around the car. And even though it's not a Level 4 car today, it’s selling extremely well because it has a potential to get there as soon as the software and the regulation is in place. So, I think other car companies need to adopt this strategy and they are. And so, even if it's not shipping as a Level 4, it has the capability to be upgraded to a Level 4, and that's really the strategy I see happening, sooner rather than later.
Raji Gill
Can you discuss how your automotive SoCs, your DRIVE platform, interact with other technologies in the vehicle, such as ADAS or the infotainment or the HMI, particularly around sensor fusion? Because I think there's a lot of differentiation there, in that vehicles are including more advanced sensors, a good example being LiDAR. The processing is going to be offloaded to your solution, and that's where it can really shine, as it always does in that area.
Danny Shapiro
It's a great question. Our platform, again, is open; it's software defined. And so, we're not locked into a specific camera or specific algorithm, because we haven't tuned the hardware to basically bake something in. It's an open platform. So, we have a vast array of different cameras, different sensors. We're going from 1 megapixel cameras to 2 to 8 megapixels, and it's just a software update to allow that flexibility. So, whether it's different sensors -- again, we work closely with the Tier 1s to integrate their sensors and build the solution for their customers -- whether it's different maps, whether it's different algorithms, that's the beauty of the NVIDIA platform and the base we have on CUDA and now our deep learning stack. We're able to change it over time, right? And with software updates, new algorithms, there are going to be apps that come out a year or two from now that haven't even been thought of today. But because of the software-defined nature of our platform, it's not a problem.
The thing is, today everyone has an iPhone or an Android phone. Could you imagine buying a phone that doesn't get software updates? You wouldn't do it, right? You're not going to have this fixed-function device. And virtually every car on the road today is like that old phone that you can't update. So, once consumers start to see what's available and get used to that, there's no way that anyone is going to buy a non-software-defined vehicle in the future.
Raji Gill
In the past, at your analyst days, you've quantified the TAM for your opportunity at roughly $25 billion for driving, Level 2, Level 3 through robotaxis, with roughly a $3 billion opportunity for training and development and about $2 billion for validation. Drilling in on testing and validation, to me, that would seem the low-hanging fruit, as these vehicles need to simulate autonomous scenarios versus driving a million miles around physically, manually. So, I'm curious what the uptake has been for your simulation products and your training products on DGX. Are you seeing more from the ride-sharing companies or from the OEMs? Describe that for your Constellation platform.
Danny Shapiro
So, again, you're right. Constellation is a simulation product. It's still early days; we've been shipping it for maybe not even a year. And it's a large development effort. It's going well, and we see that again as a great growth opportunity for us. It's part of this end-to-end flow and gives us the ability to, like you said, test and validate, which is key, and it scales very nicely. We see it distributed: some people are doing it on-prem, right? They're buying it and maintaining it themselves. Others are working with cloud service providers, so they can buy time on those simulators, which is a good way for them to start and scale up. We see a lot shifting to on-prem. And we've already announced Toyota and Volvo Trucks as some of our big customers using us in the data center for training, for sim, and then deploying on the road. But, you're absolutely right. If you look at it from a revenue perspective, data center is very big for automotive and simulation is growing. That gets reported in other parts of the Company; it doesn't flow into the auto number, it's reported under data center.
Raji Gill
Right. And that's an interesting point. So, data center for automotive is kind of in data center in general. And so, you're seeing kind of an uptick, based on what you're seeing for simulation versus kind of physical…
Danny Shapiro
It’s interesting. And again, even in other -- there are some companies that aren't using NVIDIA in their vehicles, they’re still very large customers of ours for data center.
Raji Gill
Yes, absolutely. So, are there any regulation hurdles? Have standards been defined yet for doing simulation versus actually manually driving and collecting data miles -- raw miles, physically encountering pedestrians, bikes and whatever objects? Is there a regulation standard that's been emerging, or is it kind of…
Danny Shapiro
We're very active in that, with standards bodies, with SAE, with others; we're focused on ISO 26262. The key thing here has been the engagement that we have with the industry, with our partners. We're working with NHTSA, we're working with the federal highway safety boards and the DoT. And then also in other parts of the world we're active. I guess it's less about logging miles and disengagements, and more about being able to look at scenarios. And so, what we're trying to do is come up with potentially an autonomous vehicle driver's license: the vehicle has to pass all these different tests in all these different scenarios. You can run these exhaustively in simulation. There's also obviously going to be live on-road testing. But, I think we'll see a shift to a lot more virtual testing, because again, you're not going to be able to effectively test those really dangerous scenarios with a real vehicle.
Another reason why this testing is so important is that even when we have safety drivers in our cars, or our partners have safety drivers, those safety drivers are there to make sure nothing bad happens. So, if there is a potential collision impending, that safety driver is going to take over as soon as they feel uncomfortable. And that prevents the AV system from really being put through its paces and tested: would the AV system actually have prevented this accident or not? So, again, in simulation, we can let it all play out. And if something happens, we just restart the simulation, and then we update the software to learn from that. You don't have that luxury in the real world if something bad happens.
Raji Gill
So, just kind of shifting to the technology discussion, we have about 9 or 10 minutes. So, you have migrated your DRIVE platform to your Ampere architecture, 7-nanometer, and for all your upcoming kind of Orin chips. What was the thinking behind that? And is it to really provide this kind of end-to-end single architecture? Everything is on 7-nanometer now. And I'm just curious, if you could elaborate further on the migration to the Ampere architecture.
Danny Shapiro
As a company, we're constantly innovating, constantly moving forward, right? And that's just the nature of the computing industry. We're always pushing and developing and innovating. Having this single architecture is key for us. It's not like the traditional auto industry, where something was developed and kind of locked and loaded for five to seven years, then something brand new was developed, and you had to wait. We have a continuous flow. And so, starting with our early DRIVE PX and PX2 systems, people were developing software, and they could migrate it to our current architecture and then forward to Orin. Orin will start sampling next year, and all the development work today is still happening on the current generation, but we basically have this time machine to accelerate development: when those chips go into production, you don't have to wait for software development to start, because they're all developing today. And again, the industry in general has underestimated the complexity of this challenge, underestimated the amount of compute required. So, we're constantly trying to deliver higher performance and greater energy efficiency to the industry, and it was a natural move, of course, to shift to Orin and Ampere. And again, there is a huge performance change, much more so than we've ever seen before in the computing industry.
Raji Gill
So, you talked about kind of 4x the performance and power efficiency over your Xavier solutions. I was wondering if you can go into some details and explain how does Orin do this from a technological point of view. Is it primarily on a process node migration? Is it more on the algorithm side?
Danny Shapiro
It's a combination of both, really. Yes, there's the process node. But again -- I mean, these are incredibly complex chips, 17 billion transistors. We essentially built in different parts of the chip to do different types of processing. We have different types of data coming in, and we have different types of algorithms: some are serial, some are parallel, some are deep learning-based. So, there are a lot of different parts of the chip that are now tuned, and that gives us the greater performance while consuming less energy.
Raji Gill
Got it.
Danny Shapiro
Now, as a company, we still call it a GPU, right? We've had all these GPUs, but these GPUs are doing different kinds of processing than our early GPUs from a decade ago. It still has that same name, but they are really custom ASICs, and Xavier and Orin are designed as SoCs for AI challenges and specifically autonomous vehicles.
Raji Gill
And when we look at robotics, kind of shifting to that -- and not robotaxis, let's say trucking, which seems to be the more immediate application for AVs. In robotics, you stated that BMW has selected your Isaac robotics platform to automate a lot of their factories, using logistics robots built on AI computing and visualization. What percentage of your automotive business do you think will be on the robotics side? Is robotics going to be a meaningful growth driver, as well as trucking and transportation?
Danny Shapiro
Yes. I think robotics is a different segment from automotive; it's part of our OEM business. But, I think there's great potential, right? I mean, everything that moves will be automated in some fashion. So, that's passenger vehicles, that's moving of goods, and I think we're going to see a lot more, like we said, from trucking and last-mile delivery before we're moving people around. But then, there are also fixed robots on assembly lines -- more cobots, instead of robots that have to shut down whenever a human gets near, or that require big cages around them. We're going to see a lot more really smart robots that are able to sense their environment and work alongside humans, as well as logistics robots, whether in factories, warehouses, even in healthcare. The market opportunity is very large there. We see great growth potential.
Raji Gill
Great. And then, just a last question in terms of your product portfolio. You talked about a windshield-mounted solution that's 10 trillion operations per second in less than 5 watts, all the way to robotaxis. What has been the feedback from the OEMs? I understand the advantage of having a single architecture, but are you getting any pushback in terms of being a sole supplier -- they want diversification -- or is it really more ease of use and the ability to update the software as the vehicle life cycle changes from 5 to 7 years to more of a mobile phone life cycle, where it's updated every 6 to 9 months? There seems to be an underlying shift to that type of manufacturing cycle.
Danny Shapiro
Yes. This product expansion is a direct result of customer requests. I think you've probably heard me or others say that we're not going to go into that commodity smart camera business. That's not where our engineering expertise is going to be leveraged; we're trying to solve new problems, the challenges that haven't been solved before. The reality is that the complexity of cars is enormous, and the amount of R&D investment that carmakers need to make is just astronomical. To maintain the development engineering of one front-facing camera architecture, and then have something separate -- a totally different code base, totally different hardware platform -- for other aspects, it doesn't make sense. And so, they've come to us and said, hey, we want to leverage what we're doing at the high end and just be able to bring it down into that front-facing camera solution. So, again, we see carmakers adopting our platform across the whole spectrum and seeing huge savings from that, as opposed to just picking one part for one segment. And again, it's not just about what goes in the car, but training that AI. They don't want to have two totally different systems where they're collecting data in two different initiatives, doing different development, different training, different testing, different validation, when they can do all this across a single architecture. It's an enormous cost savings.
Raji Gill
So your ability to kind of train and import that learning from the data center directly into the car, so the car is able to make inference decisions, driving software policies based on the data that you trained on…
Danny Shapiro
That’s right.
Raji Gill
…is a fairly big differentiator.
Danny Shapiro
It's enormous. As you talk to automakers, they -- and this is why sometimes you see these partnerships going on because it is an enormous investment. And again, we can take two different investments for the same car company and combine them.
Raji Gill
Okay. All right. Wonderful. I think we're up against time; it's about 11:55. So, we'll leave it there. Thank you so much, Danny. Thank you, Stewart. This was very insightful. And thanks again.
Danny Shapiro
Thank you.