Google Driverless Cars Get Into More Accidents Following Road Rules
Google's driverless cars are getting into more accidents because they follow the rules of the road and other drivers do not.
Google's driverless cars are an accident waiting to happen. The vehicles have a crash rate roughly double that of cars driven by humans, according to a University of Michigan study. But it's for a reason you might not suspect.
The cars follow ALL traffic rules, ALL the time, without exception. Human drivers obviously do not. Luckily, all the accidents to this point have been minor ones, and the Google cars have never been at fault. But that hasn't stopped engineers from debating whether they should program the vehicles to break the law once in a while.
"It's a constant debate inside our group," Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh, told MSN [http://www.msn.com/en-us/money/technologyinvesting/humans-are-slamming-into-driverless-cars-exposing-key-flaw/ar-BBnH1CJ]. "And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people."
In one of the accidents, a Google car came to a complete stop at a red light, intending to turn right on red. It slowly inched forward so its sensors could get a better "look" at the cross traffic. The car behind it inched forward as well, right into the driverless car's bumper.
The Google cars have also been pulled over on occasion, usually for going too slow, such as one incident in Mountain View, CA, where the car was doing 24 in a 35-mph zone. The engineers in the car were let off with a warning, but police noted that the cars tend to be too cautious.
Google doesn't dispute that. "We err on the conservative side," said Dmitri Dolgov, principal engineer of the program. "(The cars are) a little bit like a cautious student driver or a grandma."
Google is looking at more "aggressive" programming that better mimics how law-abiding human drivers actually behave, so the cars can fit into the flow of traffic more easily, but that work is ongoing.
"These vehicles are either stopping in a situation or slowing down when a human driver might not," said Brandon Schoettle, co-author of the Michigan study. "They're a little faster to react, taking drivers behind them off guard."
But programming them to break the law? Nothing egregious like speeding or running a stop sign, but more like deciding when to cross a double yellow line to get around construction or a cyclist.
"It's a sticky area," Schoettle said. "If you program them to not follow the law, how much do you let them break the law?"
Source: MSN [http://www.msn.com/en-us/money/technologyinvesting/humans-are-slamming-into-driverless-cars-exposing-key-flaw/ar-BBnH1CJ]