Google self-driving car

Google self-driving car in Mountain View, California

In what one journalist is calling a “little blemish” on the driving record of Google’s self-driving cars, a Google autonomous Lexus SUV recently got into a minor accident for which it was partly to blame. The car, which was traveling at only 2 mph, ran into the side of a bus that was going 15 mph, Nick Lavars reported for Gizmag. No one was injured.

This is the first time Google has admitted fault in seven years and more than 1 million miles of testing of its autonomous vehicles. In previous accidents, vehicles driven by humans rear-ended Google cars.

Mark Harris first reported, on Twitter, that Google had said it was at least partly at fault for an accident involving one of its self-driving cars. Harris posted a California Department of Motor Vehicles report on the accident, which took place February 14 at the intersection of El Camino Real and Castro Street in Mountain View, California.

Details of Accident

In an article for The Verge, Chris Ziegler quotes Google’s account of the accident from the company’s monthly self-driving car report. Google wrote that El Camino Real is a wide boulevard with three lanes in each direction, a busy road with hundreds of sets of traffic lights and intersections.

Google goes on to explain:

El Camino has quite a few right-hand lanes wide enough to allow two lines of traffic. Most of the time it makes sense to drive in the middle of a lane. But when you’re teeing up a right-hand turn in a lane wide enough to handle two streams of traffic, annoyed traffic stacks up behind you. So several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do: hug the rightmost side of the lane. This is the social norm because a turning vehicle often has to pause and wait for pedestrians; hugging the curb allows other drivers to continue on their way by passing on the left. It’s vital for us to develop advanced skills that respect not just the letter of the traffic code but the spirit of the road.

Making Assumptions

The accident occurred when the Google vehicle, which had pulled to the right-hand curb to get ready to make a right turn, detected sandbags near a storm drain and had to stop. It waited for several vehicles to pass and then moved toward the center of the lane at about 2 mph, when it made contact with the bus, which was passing at 15 mph. Although Google’s car had detected the bus, it predicted that the bus would yield because the car was ahead of it. The human test driver in the Google car also expected the bus to slow or stop to allow the Google car to merge, and thought that perhaps the bus driver assumed the Google car would not move. These assumptions led both vehicles to move into the same part of the lane at the same time; Google noted that human drivers often have these kinds of misunderstandings on the road. “In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision,” the company wrote.

Refining Self-Driving Software

As a result of that incident, along with “thousands of variations on it,” Google has refined the car’s software. The company says the car will now more deeply understand that large vehicles such as buses are less likely to yield than smaller vehicles, and that its cars should handle such situations more gracefully from now on.
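Google has not published details of this change, but the general idea can be illustrated with a toy sketch: a merge decision based on a yield-probability estimate that is discounted for larger vehicles. All names, classes, and numbers below are invented for illustration and are not Google’s actual model.

```python
# Toy illustration (all values hypothetical): estimate how likely another
# vehicle is to yield, discounting the estimate for larger vehicle classes.

BASE_YIELD_PROBABILITY = 0.8  # assumed prior for a typical passenger car

# Hypothetical discount factors by vehicle class: larger vehicles
# (e.g., buses) are modeled as less likely to yield.
SIZE_DISCOUNT = {
    "car": 1.0,
    "truck": 0.7,
    "bus": 0.5,
}

def yield_probability(vehicle_class: str) -> float:
    """Return a toy estimate of the chance the other vehicle yields."""
    discount = SIZE_DISCOUNT.get(vehicle_class, 1.0)
    return BASE_YIELD_PROBABILITY * discount

def should_merge(vehicle_class: str, threshold: float = 0.6) -> bool:
    """Merge only if the other vehicle is judged likely enough to yield."""
    return yield_probability(vehicle_class) >= threshold

print(should_merge("car"))  # a car is assumed likely to yield -> merge
print(should_merge("bus"))  # a bus is modeled as less likely to yield -> wait
```

Under these made-up numbers, the car would merge in front of a passenger car (0.8 ≥ 0.6) but wait for a bus (0.4 < 0.6), which matches the behavior Google describes.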

Lack of Intuition

In an opinion piece on Slate, Samuel English Anthony, who studies the intersection of human and computer vision at Harvard University’s Vision Sciences Lab, writes that self-driving cars lack the ability to intuit what pedestrians and other drivers may do, and it may be years before autonomous cars have that ability. Even the best artificial intelligence systems can be fooled: “State-of-the-art object recognition systems can be tricked into thinking a picture of an orange is really an ostrich.”

You can see the damage to the Google car in this witness video:

Image by Mark Doliner, used under its Creative Commons license.