Autonomous vehicles depend heavily on good sensors. Now there is a new sensor that promises human-like vision. Exciting technology, we think!
What you see is being warped by the inner workings of your brain, prioritizing detail at the center of the scene while keeping attention on the peripheries to spot danger. Luis Dussan thinks that autonomous cars should have that ability, too.
His startup, AEye, has built a new kind of hybrid sensor that seeks to make that possible. The device contains a solid-state lidar, a low-light camera, and chips to run embedded artificial-intelligence algorithms that can reprogram on the fly how the hardware is being used. That allows the system to prioritize where it’s looking in order to give vehicles a more refined view of the world.
AEye wants to use solid-state devices a little differently, programming them to spit out laser beams in focused areas instead of a regular grid. The firm isn’t revealing detailed specifications on how accurately it can steer the beam yet, but it does say it should be able to see as far as 300 meters with an angular resolution as small as 0.1 degrees. That’s as good as market-leading mechanical devices.
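To put those claimed figures in perspective, a quick back-of-the-envelope calculation (taking the article's 300 m range and 0.1° angular resolution at face value) shows how far apart adjacent beam samples would land at maximum range:

```python
import math

# Claimed figures from the article (taken at face value)
range_m = 300.0
angular_res_deg = 0.1

# Linear separation between adjacent beam directions at maximum range:
# two rays 0.1 degrees apart diverge by about 2 * R * tan(theta / 2).
separation_m = 2 * range_m * math.tan(math.radians(angular_res_deg) / 2)
print(f"{separation_m:.2f} m")  # ≈ 0.52 m between samples at 300 m
```

In other words, at its stated limits the sensor would resolve roughly half-meter detail at 300 meters, which is why that figure is competitive with mechanical spinning lidars.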
AEye’s setup doesn’t scan the whole scene at that level of detail all the time, though: it scans some areas at lower resolution and others at higher resolution, depending on what the car’s control software deems most important.
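The idea of mixing resolutions across a scene can be sketched in a few lines of code. This is a minimal illustration, not AEye's actual software: the function names, field-of-view values, and region layout below are all assumptions chosen for the example. The point is simply that a dense "foveal" region costs far fewer beam directions than scanning the whole field of view at full resolution would:

```python
def scan_grid(az_range, el_range, step_deg):
    """Generate (azimuth, elevation) beam directions on a uniform angular grid."""
    az_lo, az_hi = az_range
    el_lo, el_hi = el_range
    n_az = int(round((az_hi - az_lo) / step_deg)) + 1
    n_el = int(round((el_hi - el_lo) / step_deg)) + 1
    return [(az_lo + i * step_deg, el_lo + j * step_deg)
            for i in range(n_az) for j in range(n_el)]

def foveated_scan(regions):
    """Build one scan pattern from (az_range, el_range, step_deg) regions,
    listed in priority order by the (hypothetical) control software."""
    pattern = []
    for az_range, el_range, step in regions:
        pattern.extend(scan_grid(az_range, el_range, step))
    return pattern

# Coarse 1-degree sweep of a wide field of view, plus a dense 0.1-degree
# patch around a region of interest (say, a pedestrian ahead).
pattern = foveated_scan([
    ((-60.0, 60.0), (-10.0, 10.0), 1.0),   # background: 121 x 21 = 2541 beams
    ((-5.0, 5.0), (-2.0, 2.0), 0.1),       # fovea: 101 x 41 = 4141 beams
])
print(len(pattern))  # 6682 beams total
```

Scanning the entire 120° x 20° field at the 0.1° foveal step would instead take 1201 x 201, over 240,000 beam directions, which is the trade-off an adaptive scan pattern avoids.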