Google’s Autonomous Car Learns at Halloween Demo
Google used Halloween as an opportunity to demonstrate one of the ways its self-driving car has learned to be aware of pedestrians, including small children in Halloween costumes. An article by Kirsten Korosec published Monday in Fortune described how children helped Google with its work Saturday: they were invited to Google’s Mountain View, California, headquarters to stand and walk around its parked cars, giving the sensors and software extra practice.
Google wrote in a blog post Monday that it teaches its autonomous cars to be especially cautious when children are in the area, adding:
When our sensors detect children — costumed or not — in the vicinity, our software understands that they may behave differently. Children’s movements can be more unpredictable — suddenly darting across the road or running down a sidewalk — and they’re easily obscured behind parked cars. So even if our cars can’t quite appreciate the effort the kids put in dressing as their favorite character from Frozen, they’re still paying full attention!
Google has said its software is continuously learning to spot dangerous road situations and find ways to avoid them, wrote Max Lewontin for The Christian Science Monitor. Its self-driving cars have logged about 1.7 million miles of self-driving and manual driving combined. Lewontin quotes Chris Urmson, director of Google’s self-driving project, as saying in July 2014: “As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer.” Although Google’s autonomous cars have been in a number of accidents, the company says those were caused by human drivers in other vehicles.
Humans Blamed for Accidents
According to a recent study by the University of Michigan’s Transportation Research Institute, self-driving cars were five times as likely to be involved in accidents as cars driven by humans — though in those accidents, the self-driving vehicles were not the ones at fault.
In an article by Timothy B. Lee for Vox, Brandon Schoettle, one of the researchers who conducted the study, suggested that accidents may occur because self-driving cars “may be behaving in ways that surprise human drivers who are used to interacting with other human beings.” For example, Lee said it’s possible the autonomous cars sometimes stop more quickly than a human driver would. Most of the accidents involving self-driving vehicles were rear-end collisions in which a human-driven car struck the autonomous vehicle.
Lee noted one caveat to the study’s findings: minor human-caused accidents often go unreported, especially when there are no injuries or property damage. By contrast, companies testing self-driving vehicles are required to report every accident.
There are still matters to be worked out. Because autonomous cars have so far been tested only in states with mild winters, they have not yet learned to navigate weather conditions like snow, ice, and sleet. It’s also possible the self-driving cars have been tested in less demanding traffic than that typically faced by conventional cars.
As this blog reported, Colorado Gov. John Hickenlooper tried out a semi-autonomous Corvette last week. He said driving the hands-free 2014 C7 Corvette Stingray — modified by Arrow Electronics so that disabled people can drive it using only head movements — was one of the most amazing things he has ever done.