When Tesla Motors’ CEO Elon Musk introduced Autopilot in October, he cautioned that the software, which allows Tesla’s Model S and Model X to steer themselves and automatically change lanes, was a beta release.
“We’re advising drivers to keep their hands on the wheel,” said Musk. “You need to be ready to take the wheel at any time.”
After three months and a host of YouTube videos depicting Tesla drivers shaving, eating and even riding in the backseat while leaving the driving to Autopilot — along with several harrowing near crashes and increasingly vocal criticism from competitors — Musk, acknowledging the “crazy things” drivers were doing with it, said that Tesla will release an update that will limit Autopilot in some fashion.
“This is not good,” Musk said in an earnings call with financial analysts. “We will be putting some additional constraints on when Autopilot can be activated to minimize the possibility of people doing crazy things.”
The updated Autopilot will apparently be restricted to speeds below the posted speed limit, according to the website Teslarati, citing reports it received from Tesla owners who had tested a beta version of the new Autopilot. The presumed change did not go over well with some in the Tesla community. “If there’s an update that removes a functionality that is useful for me, then I’m not going to install it,” a Model S owner posted on an online forum.
Autopilot works by gathering information about nearby cars, lane position and other data from ultrasonic sensors, a forward-facing camera, radar and GPS wedded to mapping software. But, as Musk conceded at Autopilot’s introduction, it bogs down when lane markings are unclear and in heavy precipitation.
Competitors developing their own autonomous driving systems question Tesla’s decision to release Autopilot as a work-in-progress, like a beta smartphone app.
BMW CEO Harald Kruger recently told the German newspaper Handelsblatt that “in the app industry, you can launch products on the market that are 70 to 80 percent ready and then complete their development with the customer. That is absolutely impossible with safety features in a car.”
Volvo is deep in development of autonomous driving — it plans to have 100 fully autonomous vehicles on Swedish roads by the end of 2017 — and its lauded XC90 model has several semi-autonomous functions, but more will be added only when they are irrefutably proven to be safe, Tisha Johnson, Volvo’s senior director of design for North America, told The Hollywood Reporter at the Los Angeles Auto Show in October.
“You’re building trust,” said Johnson. “When it comes to autonomous driving, we wouldn’t put it out unless we were sure it could be done.” Referring to an Autopilot feature that checks for traffic and automatically steers the car into an adjacent lane, Johnson said, “We won’t do lane changes until we know we can do them correctly and safely, guaranteed.”
Given that autonomous driving is still in its infancy and faces legal and liability challenges, Tesla’s implementation of Autopilot without user constraints was “very irresponsible,” Jaguar project manager Stephen Boulter said in an interview with Mashable. “If something happens [with Autopilot], it could set the technology back a decade.”
Jaguar has essentially the same technology as Autopilot but has resisted releasing it because of safety concerns, said Boulter.
Musk told analysts that early data collected from the roughly 40,000 Autopilot-equipped Teslas “is very positive. We’re aware of many accidents that were prevented by Autopilot and we’re not aware of any that were caused by Autopilot. It appears to be quite beneficial from a safety standpoint.”