What we mean when we say ‘semi-autonomous’
Too often it takes a tragedy to make us wake up and take notice of what’s already all around us.
That’s what happened when news broke of the death of Joshua Brown, 40, of Canton, Ohio. He became the first person to die behind the wheel of a semi-autonomous car. His Tesla Model S was using Autopilot when a truck crossed its path. Neither the car nor Brown noticed the white tractor trailer against the bright sky.
The deluge of news and misinformation following the crash proved we're overdue for a debate about driverless cars: what they are, how they're sold, and who is ultimately responsible when things go wrong – because they will.
A crash like Brown’s was only a matter of time.
Semi-autonomous cars are already on our roads in the form of the Tesla Model S and the Mercedes-Benz E-Class. Other major auto makers are rapidly bringing similar technology to dealerships, although this crash could delay the rollout. Semi-autonomous driving is reserved for high-end vehicles but, as with any new gadget, it will rapidly become more affordable and therefore more ubiquitous. We’d better get ready.
What does it feel like to drive a car with Autopilot?
A double tap on the cruise control lever behind the steering wheel on a Tesla Model S is all it takes to engage Autopilot. The digital dashboard lights up with a myriad of symbols and icons, not all of which have an obvious meaning. If it's a leap of faith to take your feet off the pedals, it's a running sprint off the high diving board, blindfolded, to take your hands off the steering wheel. The wheel moves by itself as the car rounds a bend in the highway. It's terrifying and exciting and unnatural and serene in that first moment when a Tesla drives itself.
But it’s an illusion. The Tesla with Autopilot is not driving itself, although it might seem like it. You’re not supposed to take your hands off the steering wheel. The car notices and eventually tells you to grab the wheel. If you don’t, it will begin to slow down.
Tesla tells customers the human driver is always responsible. This is why a Model S or a Mercedes E-Class is a semi-autonomous vehicle, not autonomous or driverless or self-driving or anything else. It’s an important distinction, one often lost in the news headlines.
“When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot ‘is an assist feature that requires you to keep your hands on the steering wheel at all times,’ and that ‘you need to maintain control and responsibility for your vehicle’ while using it,” Tesla Motors said in a statement released following the crash.
That's a lot of fine print. Is it right for Tesla to call this system Autopilot when it is so far from being a true autopilot? The name brings to mind airline captains kicking back in the cockpit, reading a newspaper, doing a crossword puzzle. But Autopilot as it currently exists in a Tesla is nothing of the sort.
From the driver’s seat, it can be confusing.
In a half-day of Autopiloted driving on Toronto's busiest highways, I learned to trust Tesla's system quickly after the initial leap of faith. It works well, most of the time. I got used to seeing a car in front slow down and not moving my foot over to the brake pedal – the Tesla slows down automatically. It turns smoothly without bouncing from side to side in the lane. The one time during my brief drive that Autopilot failed – accelerating the Model S toward the metal guardrail of a highway off-ramp – it was my fault. The current version of Autopilot only works on the highway, not on entrances or exits, but the system didn't always warn me when I tried to use it in a situation it wasn't designed for.
Autopilot driving can also be extremely boring. There’s not much for a driver to do: watch the mirrors, look out the window, keep a finger on the steering wheel. But it doesn’t feel like you’re driving, even though legally speaking, you are. The temptation to check your phone or read something, anything, is immense. But you can’t: it’s still illegal, Autopilot or no.
Driving the Mercedes-Benz E-Class with similar semi-autonomous technology – called Drive Pilot – I forgot the system was switched off when I saw the car in front slow down. I waited for the Mercedes to slow itself. It didn't, and when I realized my error, I belatedly slammed on the brakes the old-fashioned way.
Details on what exactly happened to Joshua Brown and his Tesla are still scarce. If the car's forward-looking cameras were blinded by a white object against a white background, why didn't the car's radar register the truck as an obstacle and warn Brown or brake automatically? Why didn't Brown hit the brakes? There are too many unknowns to speculate. An ongoing investigation into the crash by the U.S. National Highway Traffic Safety Administration should provide some answers.
What we do know is that nearly 2,000 people died and 10,000 were seriously injured as a result of traffic collisions on Canadian roads in 2013, according to a Transport Canada report. The roads are dangerous. Autonomous and semi-autonomous cars have the potential to reduce those numbers, making our roads safer. But it could take decades, not model-years. Joshua Brown was the first, but he likely won't be the last.
Like the early days of the Internet, self-driving car technology is progressing faster than the laws and norms which will govern it. If there’s a silver lining in Brown’s death, it’s that now, finally, we’ll all wake up and figure out who is responsible when semi-autonomous and autonomous cars go wrong.