Autonomous Driving: Why NVIDIA Plays a Central Role

An interesting article on NVIDIA's central role in autonomous driving.

Some are aware that Alphabet (GOOGL) (GOOG) has a self-driving segment, Waymo, but few know it has been operating in Phoenix for a year, providing 400 rides a day. Others know that General Motors (GM) is in the mix and that Ford Motor Co (F) is working on a smart-city concept. Still others may have heard of some of the startups, or that Daimler (DDAIF) is teaming up with Bosch and Nvidia (NVDA) for its autonomous-taxi ambitions.

That brings us to Nvidia, a company that, short of those who follow its stock or the AV segment closely, few seem to realize is the likely winner from autonomous driving. Why is Nvidia a likely winner? It doesn't program the systems that pilot AVs, nor does it make all the hardware components that may be commoditized in the future. But it develops the equipment necessary to make autonomous driving efficient, safe and a reality. Nvidia is also quietly building an ecosystem that will thrive as autonomous driving gains traction over the years. How? Let's break it down with a little help from Nvidia's senior director of automotive, Danny Shapiro. I was able to catch up with Shapiro after it was announced that Daimler and Bosch had selected the company to power their mobility-as-a-service ambitions.

To start, Nvidia's DRIVE Pegasus is the onboard computer that makes self-driving cars a reality. It's roughly the size of a license plate, and Shapiro calls it the most energy-efficient solution on the market right now. The system can deliver 320 trillion deep-learning operations per second. Put simply, it's the car's self-driving brain, capable of providing up to Level 4 and Level 5 autonomous driving.

Beyond that, the company recently introduced DRIVE Constellation. This product is not yet available to everyone, but when it is, it could drastically change the autonomous driving landscape. In essence, Constellation operates on two separate servers: server No. 1 hosts the car's sensors and self-driving platform, while server No. 2 runs a near-endless number of different scenarios. To the car's system, these synthetic tests will seem real, and running them artificially will drastically increase the rate at which the vehicle learns what to do.

Constellation means we no longer have to wait for those few precise minutes near dusk when the sun shines directly into the car's sensors to see how the vehicle reacts on an actual road, potentially endangering the test driver and others. Instead, we can dial up that scenario on server No. 2 -- and countless other scenarios -- run it over and over again, then tweak how the car interprets what to do. This will not only exponentially cut down the learning curve but also drastically increase the safety of autonomous vehicles. Instead of manually testing 1 million miles in the real world, it will allow a tester to go through 1 billion miles. It will provide a "path to development, test and validate in the datacenter where we can virtually drive autonomous vehicles," Shapiro said.

That brings up an interesting point: the datacenter. All of this requires a vast amount of power and involves a vast amount of data. If only there were a company out there that had bountiful datacenter solutions. Oh wait. There's Nvidia.
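The two-server idea described above is essentially a closed simulation loop: one side synthesizes sensor data for a chosen edge case, the other runs the driving software against it as if the data were real, and the same scenario can be replayed deterministically until the behavior is right. A minimal sketch of that loop, with entirely illustrative class names (`ScenarioServer`, `DrivingStack`) that are assumptions and not Nvidia's actual Constellation API:

```python
import random

class ScenarioServer:
    """Illustrative stand-in for the scenario server: generates synthetic
    sensor frames for a chosen scenario, e.g. low-sun glare at dusk."""
    def __init__(self, scenario, frames, seed=0):
        self.scenario = scenario
        self.frames = frames
        self.rng = random.Random(seed)  # fixed seed -> replayable run

    def sensor_frames(self):
        for t in range(self.frames):
            # A real system would render camera/lidar/radar data;
            # here a "frame" is just a dict of toy measurements.
            yield {"t": t,
                   "glare": self.scenario == "dusk_glare",
                   "obstacle_dist_m": self.rng.uniform(5.0, 100.0)}

class DrivingStack:
    """Illustrative stand-in for the self-driving software under test.
    It consumes frames exactly as if they came from real sensors."""
    def decide(self, frame):
        # Toy policy: brake when an obstacle is close or vision is degraded.
        if frame["obstacle_dist_m"] < 10.0 or frame["glare"]:
            return "brake"
        return "cruise"

def run_scenario(scenario, frames=1000):
    """Closed loop: replay one synthetic scenario and log every decision,
    so the same edge case can be rerun and tweaked over and over."""
    sim, stack = ScenarioServer(scenario, frames), DrivingStack()
    return [stack.decide(f) for f in sim.sensor_frames()]

# Rerun the dusk-glare edge case on demand instead of waiting for real dusk.
decisions = run_scenario("dusk_glare", frames=100)
print(decisions.count("brake"))  # glare is present in every frame: prints 100
```

Because the seed is fixed, a failing scenario reproduces identically after each tweak to the driving logic, which is the property that lets simulated miles substitute for road miles at scale.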
