A very interesting piece on how Waymo uses Google's design language to build user trust in autonomous driving.
We’ve had to rethink all the micro-interactions that happen between riders and human drivers, starting with hailing the ride itself. With pickups, there’s a twist that’s unique to us: unlike humans, our cars are programmed to adhere to traffic laws, so they won’t just stop anywhere, and you can’t just signal to them. We tried a bunch of concepts, and we came to appreciate more and more that map-based UIs get really complex, because there’s so much other information. We leverage Google Maps in our app, and some of our early ideas were misunderstood as signifying traffic instead of where the car can pull up. In the app, we use blue-shaded regions to indicate where our cars can safely pull over to get you; it’s the solution that early riders understood most clearly.
That said, this all works a little differently if you’re vision-impaired: we’re experimenting with a feature that lets you honk the horn from the app, as a wayfinding mechanism for locating the car. More advanced audible messaging is also available: when you set up the app for the first time, you have the option of turning on certain features, one of which is expanded voice feedback. So when you get into the car, for example, it can let you know where the ‘start ride’ button is, or when you’re crossing through a certain intersection, so that you can get a sense of where you are.