Everyone Knows Tesla’s ‘Full Self-Driving’ Isn’t What Elon Musk Says It Is, According to the Government. Why It Matters

Elon Musk very much wants you to think that Tesla, where he is CEO, is close to delivering what it calls "full self-driving" vehicles. From the name, you would expect that you can get into your Tesla and it will handle the tedious work of getting you safely from one place to another.

Just this week, he tweeted that your Tesla will “Automatically drive to most obvious location unless occupant says otherwise.” Meaning, the default way of using a Tesla will be that it drives itself unless you choose otherwise. That’s a pretty big paradigm shift, and it’s where Tesla says it’s headed.

There's a problem, however. "Full self-driving" doesn't actually mean that a human isn't required in the driver's seat. It doesn't mean that the vehicle is fully capable of driving itself, which is certainly what "full self-driving" sounds like. Sure, there's some impressive technology at work, in that the cars are able to navigate and maintain lanes.

But we’re still a long way from getting into a vehicle, buckling up, shutting our eyes, and taking a nap while a computer does all the work of, well, driving. Musk has even admitted that it’s a much harder problem than he anticipated, but that hasn’t stopped him from making promises Tesla hasn’t been able to keep.

Now, however, the head of the National Transportation Safety Board, Jennifer Homendy, says Tesla should stop. In an interview, Homendy said that using the term is “misleading and irresponsible,” and that it has “clearly misled numerous people to misuse and abuse technology.”

That’s because if you buy a vehicle with a feature called “full self-driving,” you might reasonably expect that it is fully capable of driving itself. Of course, if you read the fine print, Tesla lists out a series of things that your vehicle will be able to do, one of which is “auto steer on city streets,” a feature that is coming “later this year.” Really, you’re getting what Tesla calls a “full self-driving computer.” It’s basically a promise of future capabilities once the software gets there. 

Except, it’s misleading to characterize a product based on a feature it doesn’t yet have. That’s true for any product. A feature that doesn’t yet exist may never exist and you should never buy a product based on something it might be able to do in the future. 

By the way, that distinction matters more than you might think. You can argue that "full self-driving" is just a marketing term, but words have meaning. While it's easy to dismiss many of Musk's claims as showmanship, promises create expectations.

While the words might mean one thing to Tesla, what really matters is the expectation they create for the people in the company's vehicles, who use them in situations where they could be put in danger because the technology isn't on par with the marketing.

When you overpromise, as Tesla has been doing when it comes to autonomous vehicles for a while, the logical consequence is that you lose trust. That’s a problem for any company, but especially one that is asking you to trust that the thing it is building will keep you safe when traveling down the highway at 70 miles per hour.

For Tesla, it's an even bigger problem considering that government regulators are already investigating the company's Autopilot feature after a series of accidents involving stationary emergency vehicles. At some point, if you keep overpromising and failing to deliver, you have to ask yourself whether you're helping your cause or just creating hype.

The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.
