Google's Traffic Stop: A Crash Course In Self-Driving Concerns

Google's self-driving prototype car can be found cruising the streets near the Internet company's Silicon Valley headquarters to test programming responses to a variety of situations. (Tony Avelar / AP)

Google's self-driving car seems more and more human. And like the rest of us, it's subject to traffic stops.

The head of Google's rapid rollout lab, David Weekly, tweeted a photo Thursday of the prototype car stopped by a motorcycle officer. Apparently, the vehicle was going too slowly in a 35 mph zone, causing traffic to snarl.

That's because the cars' speed is capped at 25 mph, a move by Google to make the cars a little less of a mystery and more "friendly and approachable, rather than zooming scarily through neighborhood streets," the company says on the project's blog.

The company was quick to point out that the cars have never been ticketed, despite logging 1.2 million miles of autonomous driving. That's the human equivalent of 90 years of driving — and you thought logging driver's ed miles with your teenager was a lot.

Google's little run-in with the police comes amid a host of questions surrounding self-driving cars.

Obeying the law can actually be the wrong choice when it comes to defensive driving. As The Atlantic points out, good judgment sometimes means acting illegally.

Imagine, for example, a deer is standing in your lane. No cars are approaching. A defensive driver would probably slow down and go around it, encroaching into the other lane. But an automated car, following the law to a T, might come to a full stop — and avoid crossing a double-yellow line. Envision the pileup that might ensue.

Instead of programming millions of directions for specific instances, some as wacky as a woman in an electric wheelchair chasing a duck, Google is teaching the cars to recognize the more fundamental aspects of unpredictable driving and respond to a wide variety of situations accordingly.
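A minimal sketch of that idea in Python, with invented names throughout (the Obstacle type, its fields and the respond() function are illustrative assumptions, not Google's actual code): one generic rule, slow down and yield to anything occupying the roadway, covers the wheelchair-and-duck case without anyone having programmed for it.

# Hypothetical sketch: rather than enumerating every oddity,
# classify anything in the roadway as a generic hazard and
# apply one of a small set of fundamental responses.
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str        # e.g. "pedestrian", "animal", "unknown"
    in_roadway: bool
    moving: bool

def respond(obstacle: Obstacle) -> str:
    """Map any obstacle to a small set of fundamental behaviors."""
    if obstacle.in_roadway:
        # One generic rule covers millions of specific instances:
        # slow down and yield to whatever is occupying the lane.
        return "slow_and_yield"
    if obstacle.moving:
        # Keep tracking it in case it enters the road.
        return "monitor"
    return "proceed"

# The same rule handles the wacky case without special-casing it.
print(respond(Obstacle(kind="wheelchair_chasing_duck", in_roadway=True, moving=True)))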

Even worse: What about the case of an unavoidable accident? Cars could be programmed to either a) minimize the loss of life, even if that means sacrificing the driver, or b) protect riders at all costs. Would the answer be the same every time, or would it be random?
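Here is one hypothetical way those two options could look in code. The policy names, the choose_action() helper and the numbers are made up to illustrate the dilemma, not drawn from any real system.

# Hypothetical illustration of the two crash policies described above.
def choose_action(options, policy):
    """Pick a maneuver according to the chosen policy.

    options: list of (maneuver, total_deaths, rider_deaths) tuples.
    """
    if policy == "minimize_loss_of_life":
        # a) fewest deaths overall, even if riders are among them
        return min(options, key=lambda o: o[1])[0]
    if policy == "protect_riders":
        # b) fewest rider deaths, regardless of the overall toll
        return min(options, key=lambda o: o[2])[0]
    raise ValueError(policy)

options = [("swerve_into_wall", 1, 1), ("stay_in_lane", 3, 0)]
print(choose_action(options, "minimize_loss_of_life"))  # swerve_into_wall
print(choose_action(options, "protect_riders"))         # stay_in_lane

The same inputs yield opposite answers depending on the policy, which is exactly why the question of which policy to ship is so fraught.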

Information about the car's surroundings comes from a laser-scanning system called Lidar. Using illumination invisible to the human eye, it measures the scene and converts those measurements into a 3-D model. Despite Lidar's extreme accuracy, there will always be sensory limitations.
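At its core, that conversion is simple geometry: each laser return is a distance measured along a known direction, which becomes one point in a 3-D "point cloud." A rough Python sketch, where return_to_point() and the sample readings are illustrative assumptions rather than any vendor's API:

# Core lidar computation: each laser return is a
# (range, azimuth, elevation) measurement, converted to an
# (x, y, z) point; together the points form the 3-D scene model.
import math

def return_to_point(distance_m, azimuth_deg, elevation_deg):
    """Convert one laser return to 3-D Cartesian coordinates."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

# A spinning lidar emits thousands of such returns per revolution;
# the resulting "point cloud" is what the car reasons over.
returns = [(12.4, 0.0, -1.5), (12.6, 0.5, -1.5), (3.1, 45.0, 2.0)]
point_cloud = [return_to_point(*r) for r in returns]
print(point_cloud)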

Back to the traffic stop. How do you program cars to respond appropriately to the all-too-familiar flashing lights in the rearview mirror? It makes us wonder: Could self-driving cars be programmed, in some way, to avoid traffic infractions — or even the police? What if self-driving cars were pulled over ... by self-driving police cars?

In the meantime, the matter of ticketing itself is still pretty fuzzy. Who, or what, gets the ticket? Google has said that if one of its cars breaks a law it'll foot the bill, but sometimes state law isn't quite as clear — especially if there isn't anyone in the driver's seat.

Given the ethical repercussions and open questions surrounding self-driving cars, there's still a lot of thinking, and programming, to be done. And with the Institute of Electrical and Electronics Engineers predicting that these vehicles will make up 75 percent of traffic by 2040, we'll have to make some of these decisions soon.

Copyright 2020 NPR. To see more, visit https://www.npr.org.

Kylie Mohr