Autonomous Driving: The Technologies Behind It at Mercedes-Benz

An interesting blog post about the technologies behind autonomous driving at Mercedes-Benz.

Of red pedestrians and blue cars
The camera is one of the three important components that are essential to automated driving. It constitutes the “eyes” of the car. In order to really understand the camera’s function and importance, I was privileged to sit in a test vehicle that is actually more technology than car.

Markus Braun, a specialist for pattern recognition/camera and driver of the test vehicle, explains to me that the camera system mainly focuses on the more vulnerable road users: pedestrians, bike riders, and motorcyclists. Of course the self-driving car must also keep an eye on all the other objects, such as cars, streetlights, and traffic lights.
In order to train the system as effectively as possible, it was tested in the road traffic of two different countries, Germany and Italy. The images captured there serve as the data from which the car’s neural network learns. This is a typical application of artificial intelligence.
On the display, all of the objects moving in front of us have been assigned different colors. The cars are blue, and the pedestrians are red. Almost 20 images are projected per second, slightly fewer than the 25 images per second that are perceived by the human eye.
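To make this a bit more concrete, here is a minimal Python sketch of how such class-coded detections could be turned into display colors. The Detection structure, the class names, and the color values are illustrative assumptions for this post, not Mercedes-Benz code; the actual classification comes from the trained neural network described above.

```python
# Minimal sketch (illustrative only): color-coding camera detections by class
# for a display, with cars in blue and pedestrians in red as described above.
from dataclasses import dataclass

# Assumed class-to-color mapping (RGB); the real system's palette may differ.
CLASS_COLORS = {
    "car": (0, 0, 255),         # blue
    "pedestrian": (255, 0, 0),  # red
    "cyclist": (255, 165, 0),   # orange, assumed for other vulnerable road users
}

@dataclass
class Detection:
    label: str         # object class predicted by the neural network
    box: tuple         # (x, y, width, height) in image pixels
    confidence: float  # classifier score between 0 and 1

def colorize(detections, min_confidence=0.5):
    """Attach a display color to every sufficiently confident detection."""
    return [
        (d.box, d.label, CLASS_COLORS.get(d.label, (128, 128, 128)))
        for d in detections
        if d.confidence >= min_confidence
    ]

# One hypothetical frame of network output (roughly 20 such frames per second):
frame = [Detection("car", (120, 80, 60, 40), 0.93),
         Detection("pedestrian", (300, 95, 20, 50), 0.88)]
print(colorize(frame))
```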

The predictive camera
The vehicle has learned quite a lot in recent years. Its camera can now detect people behind parked cars, trash cans, or streetlights and distinguish individual pedestrians from one another. A few years ago, the camera could only perceive groups of people. This shows that it’s worthwhile to train the system. The algorithm is constantly improving, in the same way that a soccer player improves if he gets the right training and thus develops better game tactics.
But it gets even better: Today the car, with the help of the camera, can even anticipate what a pedestrian intends to do next. By registering the pedestrian’s posture and head movements, the car can calculate in which direction he or she will move and whether he or she will soon cross the street. It’s quite a challenge to determine whether a pedestrian will stay put or take advantage of the last opportunity to cross the street.
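As a rough illustration of the idea, here is a toy Python heuristic that turns posture-style cues into a crossing probability. The feature names, weights, and bias are invented for this sketch; the production system learns such relationships from data rather than using hand-tuned rules.

```python
# Toy sketch (assumptions throughout): estimating whether a pedestrian is
# about to cross, from simple posture and movement cues like those mentioned
# in the text. Weights and bias are made up for illustration.
import math

def crossing_probability(lateral_speed_mps, head_yaw_toward_road_deg, body_lean_deg):
    """Combine a few cues into a probability with a logistic function."""
    score = (2.0 * lateral_speed_mps            # already stepping toward the road
             + 0.03 * head_yaw_toward_road_deg  # looking toward the traffic
             + 0.05 * body_lean_deg             # leaning toward the curb
             - 1.5)                             # bias: the default is "stays put"
    return 1.0 / (1.0 + math.exp(-score))

# Walking briskly toward the curb while looking at the road -> high probability:
print(round(crossing_probability(1.2, 30.0, 10.0), 2))
# Standing still and facing away from the street -> low probability:
print(round(crossing_probability(0.0, 0.0, 0.0), 2))
```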

Nonetheless, our systems can already precisely identify pedestrians and analyze their intentions at a distance of up to 50 meters. In some cases they can even do so at greater distances — and this performance does not decrease in conditions of poor visibility due to rain or darkness.
If the camera is still prevented from conducting its analysis — for example, by glare from an oncoming car — it is supported by a second type of sensor: radar.
Radar’s unobstructed view
For automated driving, radar is just as important as the camera. It’s located in the car’s bumpers, and it emits electromagnetic waves that supply information about the objects in front of it. Today the test vehicles are equipped with as many as eight radar sensors in order to cover the car’s entire surroundings. What human driver can claim to have eight eyes?

Radar can distinguish between static and dynamic objects, in other words between parked vehicles and moving ones. Today it can also precisely register the shape of objects. Because radar can “see through” objects such as cars by looking under them, the system can estimate the length and width of cars driving ahead and quickly detect people behind cars. Our human eyesight can’t compete with this kind of “vision.”
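One way to picture the static/dynamic distinction is via the radial (Doppler) velocity that radar measures for each return: a stationary object appears to approach at the car’s own speed projected onto the line of sight, while anything that deviates from that is moving. The following Python sketch assumes exactly that kind of per-target velocity and a known ego speed; the variable names and tolerance are illustrative, not the actual sensor interface.

```python
# Sketch under stated assumptions: classifying a radar return as static or
# dynamic by comparing its measured radial velocity with what a stationary
# object would show at the current ego speed.
import math

def is_static(radial_velocity_mps, ego_speed_mps, azimuth_deg, tolerance_mps=0.5):
    """Radial velocity is negative when the range to the target is shrinking."""
    # A stationary object closes in at the ego speed projected onto the line of sight.
    expected_if_static = -ego_speed_mps * math.cos(math.radians(azimuth_deg))
    return abs(radial_velocity_mps - expected_if_static) <= tolerance_mps

# Driving at 15 m/s: a parked car straight ahead closes in at about 15 m/s ...
print(is_static(-15.0, 15.0, 0.0))  # static, e.g. a parked vehicle
# ... while a car ahead traveling at nearly the same speed barely closes at all.
print(is_static(-1.0, 15.0, 0.0))   # dynamic, e.g. a moving vehicle
```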
I can hardly imagine how such a tiny component can do something so important and thus make the driving of the future so much safer.

On the road to accident-free driving with lidar
In order to perceive the car’s surroundings optimally and plan ahead even more precisely, the third type of sensor, lidar, measures all distances, as well as the reflectivity of objects, with centimeter precision. Instead of the radio waves used by radar, lidar uses laser pulses, which have a shorter range but produce more precise images.
Various scanners are used to measure several different distances. Because this process uses infrared light, it cannot be perceived by the human eye. There’s only one problem: Lidar cannot perceive colors. Nonetheless, the approximately 10,000 measurements it makes per second give the automated vehicle precise information. As a result, lidar plays an important role on the road to accident-free driving.
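The distance measurement itself boils down to timing: each laser pulse travels to an object and back, so the range follows from the round-trip time and the speed of light, d = c · t / 2. Here is a back-of-the-envelope Python sketch of that relationship; the timing value is illustrative, not sensor data.

```python
# Back-of-the-envelope sketch of lidar ranging via pulse time of flight.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds):
    """The pulse travels out and back, so halve the total path length."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 333 nanoseconds corresponds to about 50 meters:
print(round(distance_from_round_trip(333e-9), 1))
```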


Source: blog.daimler.com