Tesla famously describes its cars as self-driving, which they are not, a distinction CEO Elon Musk regularly muddies. The gap between Musk’s promises and Tesla’s reality is the focus of “Elon Musk’s Crash Course,” a documentary directed and produced by Emma Schwartz.
She recalls May 2016, when former Navy SEAL Josh Brown died while driving a Tesla Model S on Autopilot. A tractor-trailer crossed the rural Florida road he was traveling on, and his car didn’t slow down. It was the first fatal crash involving Autopilot in the U.S., and it showed that the technology wasn’t designed for roads with cross traffic, only for fully enclosed, limited-access highways, she says.
The crash was investigated by the National Highway Traffic Safety Administration (NHTSA), drawing the ire of Musk, who threatened to sue the agency and accused it of unfairly singling out Tesla.
Schwartz says there is some truth to the idea that Tesla attracts both outsized praise and criticism because of its disruptive role in the auto industry. But she points out that its driver-assistance technology was new at the time of Brown’s crash and naturally drew the attention of government regulators, some of whom were excited about it.
“If you're sitting in the position as a safety regulator, and you see a fatality involving this new technology, it is something different because this has the potential to change the way driving can happen. And understanding whether or not it's being deployed safely is something that a lot of the government officials expressed to me as being important to look into.”
Tesla’s technology isn’t truly autonomous because the car can’t handle every situation on its own, Schwartz explains. A driver can let go of the steering wheel and the car will stay in its lane, but there are still many circumstances the system can’t account for. Despite that, Tesla markets the feature as “Full Self-Driving,” which contributes to confusion about what the cars can actually do.
“There's a big gap in terms of what the public perception is about what [self-driving] means. … And so the tension that a lot of the car companies are trying to grapple with is: How do you keep the driver engaged if you're trying to make them less engaged? It's a bit of a catch-22.”
Musk often says fully autonomous driving is just two years away. Schwartz finds that implausible.
“It's still a long way out, especially with the method that Tesla claims to be using. By using just cameras, it's a more complicated way to go about it. They're not using radar and they're also not using LiDAR. They're essentially relying on having more computer power to figure it out.”
But she says machines are not always smarter than humans.
“There are probably times where you can create a machine that is safer than a human. I think the challenge … is that we just have a couple of basic numbers that they [Tesla] put out,” Schwartz says. “It's saying, ‘Trust us. Trust us that these numbers are the fairest and most accurate way to tell you that the product we sell is the safest.’ And I think any journalist has to be skeptical of any numbers that are just put out by a company, especially when they're not verified, and when there's not a lot of detail to back that up.”