Very interesting insights into the development of autonomous driving at NVIDIA.
Autonomous vehicles must be able to operate in thousands of conditions around the world to be truly driverless. The key to reaching this level of capability is mountains of data. To put that in perspective, a fleet of just 50 vehicles driving six hours a day generates about 1.6 petabytes of sensor data a day. If all that data were stored on standard 1 GB flash drives, they'd cover more than 100 football fields.

This data must then be curated and labeled to train the deep neural networks (DNNs) that will run in the car, each performing a dedicated function such as object detection or localization.

The infrastructure to train and test this software must include high-performance supercomputers to handle these enormous data needs. To run efficiently, the system must be able to intelligently curate and organize this data. Finally, it must be traceable, making it easy to find and fix bugs in the process, and repeatable, able to run the same scenario over and over to verify a DNN's proficiency.
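The fleet numbers above can be sanity-checked with a quick back-of-envelope calculation. This sketch (my own, not from the article) assumes decimal units (1 PB = 10^15 bytes) and a constant recording rate over the full six hours:

```python
# Back-of-envelope check of the fleet data volume quoted above.
# Assumptions (not stated in the article): decimal byte units and a
# constant per-vehicle recording rate across all driving hours.

FLEET_SIZE = 50        # vehicles in the fleet
HOURS_PER_DAY = 6      # driving hours per vehicle per day
DAILY_DATA_PB = 1.6    # petabytes generated by the whole fleet per day

# Total vehicle-seconds of recording per day
seconds = FLEET_SIZE * HOURS_PER_DAY * 3600

# Implied per-vehicle sensor data rate in GB/s
rate_gb_s = DAILY_DATA_PB * 1e15 / seconds / 1e9

# Number of 1 GB flash drives needed to hold one day of data
drives_1gb = DAILY_DATA_PB * 1e15 / 1e9

print(f"Per-vehicle sensor rate: {rate_gb_s:.2f} GB/s")   # ~1.48 GB/s
print(f"1 GB flash drives per day: {drives_1gb:,.0f}")    # 1,600,000
```

That implied rate of roughly 1.5 GB/s per vehicle is plausible for a full sensor suite (multiple cameras, radar, and lidar streams), and the 1.6 million drives per day makes clear why this data cannot simply be archived raw but must be curated down to the scenarios worth labeling.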