A new software system developed by a University of Michigan doctoral student could eventually help reduce the cost of self-driving vehicles by letting them see their surroundings and determine their location with a single video camera instead of several laser scanners, Willie Jones writes for IEEE Spectrum. Green Car Congress reports that graduate student Ryan W. Wolcott’s paper on his invention was recently named best student paper at the Conference on Intelligent Robots and Systems in Chicago.
Wolcott’s system takes the same navigation approach used in self-driving cars now in development — including Google’s — but goes beyond it. The existing systems build a real-time map of a vehicle’s surroundings with three-dimensional laser scanners and compare it with a stored, pre-drawn map, GCC writes. By making that comparison thousands of times per second, those systems can pinpoint a vehicle’s location to within a few centimeters. Wolcott’s improvement converts the stored map data into a three-dimensional image, somewhat like a video game scene, so the system can compare the real-world pictures streaming from a video camera against those 3-D renderings.
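The idea described above — render synthetic views of a stored 3-D map from candidate poses, then pick the pose whose rendering best matches the live camera frame — can be illustrated with a minimal sketch. This is not Wolcott’s actual implementation; the scoring function (sum of squared differences), the `render` callback, and the toy map below are all illustrative assumptions.

```python
# Hypothetical sketch of render-and-compare localization: score each
# candidate pose by how well a synthetic rendering of the prior map
# from that pose matches the live camera frame.
import numpy as np

def score(live, synthetic):
    """Negative sum of squared pixel differences: higher is a better match."""
    return -float(np.sum((live.astype(float) - synthetic.astype(float)) ** 2))

def localize(live_frame, candidate_poses, render):
    """Return the candidate pose whose synthetic rendering best matches
    the live frame. `render(pose)` is assumed to produce a synthetic
    image of the stored 3-D map as seen from that pose."""
    best_pose, best_score = None, -np.inf
    for pose in candidate_poses:
        s = score(live_frame, render(pose))
        if s > best_score:
            best_pose, best_score = pose, s
    return best_pose

# Toy demo: a "map" rendered as a bright square whose column position
# shifts with the pose's lateral offset; the live frame was taken at x = 2.
def toy_render(pose):
    img = np.zeros((8, 8))
    img[2:5, 2 + pose:5 + pose] = 1.0
    return img

live = toy_render(2)
print(localize(live, [0, 1, 2, 3], toy_render))  # prints 2
```

A real system would render views on a GPU and compare thousands of pose hypotheses per second, which is what makes inexpensive graphics hardware a natural fit for this approach.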
GCC quotes Wolcott as saying he began work on his system hoping to find a “cheaper sensor that could do the same job” as the laser scanners used in most self-driving cars under development, which cost tens of thousands of dollars. Wolcott said cameras were “an obvious choice” because they cost so much less.
Ryan M. Eustice, a U-M associate professor of naval architecture and marine engineering, has been working with Wolcott on the technology, GCC writes. He said an important challenge for the new design was to come up with a system that could process a huge amount of video data in real time.
Video games once again inspired the team, which built a system using the inexpensive graphics processing technology familiar to gamers, GCC writes. Wolcott and Eustice have tested the system in downtown Ann Arbor and found that it provided accurate location information. U-M’s new Mobility Transformation Facility testing center, which features a private, enclosed city grid, will give the team another place to conduct further testing when it opens this summer, Jones writes for IEEE Spectrum.
GCC writes that although Wolcott’s system will not completely replace laser scanners, which are still needed for functions such as long-range obstacle detection, it is a big step toward lower-cost navigation systems:
‘Map-based navigation is going to be an important part of the first wave of driverless vehicles, but it does have limitations—you can’t drive anywhere that’s not on the map. Putting cameras in cars and exploring what we can do with them is an early step toward cars that have human-level perception,’ [said Ryan Eustice].
Here is a video about the team’s work: