Google’s self-driving cars are designed to exceed the speed limit by up to 10 miles per hour because stubbornly obeying the law can be a danger to everyone on the road. The legal and philosophical consequences of this fact are fascinating.
In a recent Reuters review of the Google car, reporter Paul Ingrassia was told:
Google’s driverless car is programmed to stay within the speed limit, mostly. Research shows that sticking to the speed limit when other cars are going much faster actually can be dangerous … so its autonomous car can go up to 10 mph (16 kph) above the speed limit when traffic conditions warrant.
This is a fascinating quandary for Google’s engineers. No one knows for sure who is responsible when an automated vehicle breaks the law. If a flawed algorithm sends a car the wrong way down a one-way street because of a missing road sign, who is responsible? Is it the Google engineer, or perhaps the construction crew that screwed up the road sign?
As states slowly allow for automated car testing on their roads, they’re piecing together the legality one situation at a time. The old standards of determining guilt may not apply to robots.
“Criminal law is going to be looking for a guilty mind, a particular mental state — should this person have known better?” University of Washington’s Ryan Calo, an expert in tech law, told the New York Times. “If you’re not driving the car, it’s going to be difficult.”
But the idea of programming a car to break the law ahead of time is perhaps the most fascinating legal question. Sometimes the spirit of the law contradicts the written law. Laws are designed to save lives, but they aren’t flexible enough to deal with every single situation.
Thanks to the big data gathered on car accidents, Google will know when speeding is actually safe and when it isn’t. Right now, Google engineers may suspect that exceeding the speed limit when other cars are also speeding is the safest thing to do. And a law could be designed to allow Google’s self-driving cars to legally speed under those general circumstances.
But as Google collects more data, its decisions will become increasingly unpredictable.
Perhaps it is safest to speed in excess of 20 miles per hour on St. Patrick’s Day, because going near the speed limit around drunk drivers agitates them and makes them more reckless. We doubt that a law could ever really be flexible enough to know when it should be permissible for Google to speed.
And even if a robot is allowed to break the law, who decides the conditions? Is it permissible to speed when doing so would reduce the chance of a traffic fatality by 10 percent? What about 5 percent?
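The kind of rule a regulator might end up writing could be as blunt as a threshold check. Here is a purely hypothetical sketch: the function name and the 10 percent figure come from the question above, not from anything Google has published, and real risk estimates would be far messier than two numbers.

```python
# Hypothetical sketch of a regulator-set rule for when an autonomous car
# may exceed the speed limit. This does NOT reflect Google's software;
# it only illustrates the policy question in the text.

FATALITY_RISK_REDUCTION_THRESHOLD = 0.10  # 10 percent -- but why not 5?

def may_exceed_limit(risk_at_limit: float, risk_when_speeding: float) -> bool:
    """Allow speeding only if it cuts the estimated fatality risk by at
    least the regulator's threshold (measured as a relative reduction)."""
    if risk_at_limit == 0:
        return False  # no risk to reduce, so obey the limit
    reduction = (risk_at_limit - risk_when_speeding) / risk_at_limit
    return reduction >= FATALITY_RISK_REDUCTION_THRESHOLD

# A 12 percent estimated reduction clears a 10 percent threshold...
print(may_exceed_limit(0.005, 0.0044))  # True
# ...but an 8 percent reduction does not.
print(may_exceed_limit(0.005, 0.0046))  # False
```

Even in this toy form, the hard part is obvious: someone has to pick the threshold, and someone has to vouch for the risk estimates fed into it.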
These are questions that will ultimately have to be answered by a regulatory body. For now, it’s all in Google’s hands.