
MIT’s Intelligent Co-Pilot Is Designed to Prevent Accidents


A Ph.D. student in the Massachusetts Institute of Technology’s (MIT’s) Department of Mechanical Engineering and a principal research scientist in MIT’s Robotic Mobility Group have developed a semiautonomous safety system designed to take over a vehicle’s controls if the driver is about to have an accident.

An MIT News article by Jennifer Chu quotes Sterling Anderson, the Ph.D. student, as saying, “The real innovation is enabling the car to share [control] with you. If you want to drive, it’ll just […] make sure you don’t hit anything.” Anderson has been testing the system in Michigan since last September. He, research scientist Karl Iagnemma, and their group recently presented details of the system at the Intelligent Vehicles Symposium in Spain.

As Sebastian Anthony writes for ExtremeTech, the MIT co-pilot identifies obstacles on the road with an on-board camera and a laser rangefinder. The system creates constraints by combining the obstacle information with data such as the driver’s performance and the car’s speed, stability, and physical characteristics. The co-pilot remains in the background unless the driver comes close to breaking one of the constraints, which, Anthony notes, “might be as simple as a car in front braking quickly, or as complex as taking a corner too quickly.” At that point, the co-pilot takes command and gives control back to the driver only when the car is safe.
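In rough terms, the behavior Anthony describes is a loop: turn sensor data into safety constraints, pass the driver’s commands through while every constraint still has margin, and intervene only when one is about to be violated. The Python sketch below is a loose, hypothetical illustration of that idea; the function names, braking figure, and intervention threshold are assumptions made for illustration, not details of MIT’s actual system.

```python
# Hypothetical sketch of constraint-based shared control, loosely following the
# description above. Names, numbers, and structure are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Constraint:
    """One safety boundary, e.g. the gap to an obstacle or a cornering limit."""
    name: str
    margin: float  # distance from violation; positive means currently safe

def sense_constraints(obstacle_distances, speed):
    """Turn obstacle ranges and vehicle speed into constraints (assumes ~6 m/s^2 braking)."""
    stopping_distance = speed ** 2 / (2 * 6.0)
    return [Constraint(f"obstacle_{i}", d - stopping_distance)
            for i, d in enumerate(obstacle_distances)]

def co_pilot_step(driver_command, constraints, intervention_margin=2.0):
    """Stay in the background unless a constraint is nearly violated; then take over."""
    at_risk = [c for c in constraints if c.margin < intervention_margin]
    if not at_risk:
        return driver_command            # driver keeps full control
    return {"steer": 0.0, "brake": 1.0}  # placeholder evasive/braking command

# Example: an obstacle 15 m ahead while travelling at 20 m/s triggers an intervention.
constraints = sense_constraints(obstacle_distances=[15.0], speed=20.0)
print(co_pilot_step({"steer": 0.1, "brake": 0.0}, constraints))
```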

Anthony compares MIT’s semiautonomous system with autonomous ones, like Google’s:

This intelligent co-pilot is starkly contrasted with Google’s self-driving cars, which are completely computer-controlled unless you lean forward, put your hands on the wheel, and take over. This is a lot like an airplane’s auto-pilot, where the human pilot only takes over if something goes wrong, or there are adverse conditions that the flight control system can’t handle. The problem with the ‘human backup’ approach is that you’re asking someone who has been leaning back for X hours to suddenly take over the controls, which results in panicky behavior. It recently emerged that the Air France Flight 447 accident was caused by the autopilot disengaging, and then the human pilot making a silly mistake.

In the more than 1,200 trials the MIT team has run of its system, there were few collisions, Chu reports, and most of those occurred when a glitch in the vehicle’s camera kept it from identifying an obstacle. “For the most part,” she writes, “the system has successfully helped drivers avoid collisions.”

Chu spoke with Benjamin Saltsman, manager of intelligent truck vehicle technology and innovation at Eaton Corp., who was not involved with the research. He said the semiautonomous system has several advantages over fully autonomous ones like those developed by Google and Ford, which are “loaded with expensive sensors” and require complex computations to plan out safe routes. MIT’s lighter, simpler system is less costly and closer to potential implementation, he said.

Responding to Chu’s question about what the system would feel like for a driver who did not know it was activated, Sterling Anderson said:

You would likely just think you’re a talented driver. You’d say, ‘Hey, I pulled this off,’ and you wouldn’t know that the car is changing things behind the scenes to make sure the vehicle remains safe, even if your inputs are not.

Chu writes that Anderson acknowledges the semiautonomous system might lead people just learning to drive to think they are better drivers than they really are; without negative feedback, they could become less skilled and more dependent on assistance over time. Expert drivers, on the other hand, could feel constrained by the system, Anderson said. He and Iagnemma are looking into ways to tailor the system to different levels of driving experience.

ExtremeTech’s Anthony raises the specter of a future in which no one knows how to drive. He goes on to say:

This isn’t so bad if every car on the road is autonomous, and if steering wheels are removed altogether, but the in between period could be tricky. To this end, the MIT team admits that their co-pilot needs to be tweaked to deliver significant negative feedback so that drivers (especially learners!) don’t get too big for their britches. […]

It will be interesting to see how the autonomous vs. semi-autonomous battle pans out. As we’ve covered before, the semi-autonomous solutions that keep your car in-lane or slam the brakes on when a human walks into the road could be a huge boon to road safety. A completely autonomous road system with car-to-car communications would improve safety as well, and reduce fuel consumption and increase the total capacity of roads — but at the expense of losing our ability to drive.

