Autonomous vehicle

An illustration shows how a vehicle equipped with the company's technology will communicate with humans on or near the road.

A desire for safer roads and fewer car accidents is the motivation behind the autonomous driving startup's deep learning approach to self-driving technology, Darrell Etherington writes for TechCrunch.

Carol Reiley

Carol Reiley, the company's president and co-founder, is an engineer who had worked in robotics for 15 years, with applications ranging from industrial to surgical. Her desire to save lives began in childhood: As an 8-year-old girl, she created her all-time favorite DIY project, a humane mousetrap, according to Johns Hopkins University’s Gazette.

Six members of the startup’s eight-person founding team were PhD or graduate students at Stanford University’s artificial intelligence lab, and had worked in the area of deep learning and vehicle technology for three years before Reiley co-founded the company last year.

Road Safety

It was the need for safety on the roads that spurred Reiley to co-found the company. She said she asked herself:

How do you build robots to interact with and help humanity? Across the different applications, how do you make the most impact and help people? The biggest problem really facing us today is humans driving cars. Humans are terrible, terrible drivers, and cause 33,000 or so fatalities every year in the U.S. alone.

Add-on Artificial Intelligence

The company's approach is to use deep learning to create technology that can be added on to vehicles to help them navigate safely and autonomously. Deep learning is similar to the way a person teaches another person how to do something: The system itself learns how to apply a set of rules to a wide range of unexpected situations, such as people doing cartwheels across the road, a person running circles around a test vehicle, and dogs on skateboards.

Reiley said deep learning is important because:

A rule-based approach for something like a human on a bicycle will probably break if you see different scenarios or different viewpoints.
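
Reiley's point about brittle rules can be illustrated with a toy sketch. This example is entirely illustrative: the features, thresholds, and nearest-centroid classifier below are invented for the illustration and are not part of the company's actual system.

```python
# Toy contrast between a hand-written rule and a model trained on examples.
# Objects are described by two made-up features: aspect ratio and speed (m/s).

def rule_based_is_cyclist(aspect_ratio, speed):
    # Hand-coded rule tuned to one viewpoint: a cyclist seen from the side.
    return 1.5 < aspect_ratio < 2.5 and 3.0 < speed < 10.0

# "Training data": (aspect_ratio, speed) examples from many viewpoints.
cyclists = [(2.0, 6.0), (1.0, 7.0), (0.8, 5.5), (2.2, 4.0)]
pedestrians = [(0.4, 1.2), (0.5, 1.8), (0.45, 2.5), (0.6, 1.0)]

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

C_CYC, C_PED = centroid(cyclists), centroid(pedestrians)

def learned_is_cyclist(aspect_ratio, speed):
    # Nearest-centroid classifier: the decision comes from data, not a rule.
    def dist2(c):
        return (aspect_ratio - c[0]) ** 2 + (speed - c[1]) ** 2
    return dist2(C_CYC) < dist2(C_PED)

# A cyclist seen head-on (aspect ratio ~0.9) falls outside the rule's range,
# but sits near the cyclist examples in feature space.
head_on = (0.9, 6.5)
print(rule_based_is_cyclist(*head_on))  # the rule misses the new viewpoint
print(learned_is_cyclist(*head_on))     # the learned model still recognizes it
```

The rule works only for the viewpoint its author anticipated; the data-driven classifier handles the new viewpoint because similar examples were in its training set, which is the behavior deep learning scales up to far richer inputs.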

Hours of Test Driving

Deep learning requires a very large number of hours of test driving to give the system basic information. Although the company is not revealing exactly how many miles its test vehicles have driven, Reiley told Business Insider’s Danielle Muoio that when it comes to miles driven, it is already in the top 5 of the 13 companies California has licensed to test autonomous vehicles on its roads.

The company's system, which Reiley calls the brains of the car, is made up of sensors, LiDAR, and radar, but the company is one of very few also using deep learning software. Cambridge University researchers and Toyota have also been working on deep learning and artificial intelligence for driverless cars. But the Stanford AI lab is unique, Reiley told TechCrunch, because, of the top three or four deep learning labs worldwide, it is the only one that has worked on deep learning as it applies to cars.

Creating a New Language

Safer driving requires not only an effective “left brain” for autonomous vehicles, but also a functioning “right brain,” involving social skills, Reiley said. Because an autonomous vehicle is really the first social robot that many humans will interact with, the team is working on bringing those skills to its technology.

The company is creating a new language, one that helps the vehicle communicate with people on or near the road. One way the vehicles will communicate is with a roof-mounted exterior communication device, which will display written messages and emojis to let humans near the car know what its intentions are.

The company is also adding veteran auto industry executive Steve Girsky to its board. Although his background includes helping GM turn around after its bankruptcy as a senior executive and board member, Girsky wanted to join the startup to help bring about an era of safer driving.
