While Tesla's Autopilot feature remains a subject of interest and intrigue, especially as the conversation around autonomous vehicles continues to heat up, serious problems are emerging in the background, and lawsuits are being filed claiming that Tesla vehicles are proving a danger on the roads.
The latest lawsuit comes from a California family whose 15-year-old son was killed when the vehicle he was traveling in was hit from behind by a Tesla whose Autopilot feature failed to work before impact. What makes this lawsuit different is that the Tesla captured a six-second video and recorded data from the collision, which the family argues prove the system did not function as it was supposed to.
In an article for The New York Times, Neal E. Boudette speaks to experts about the growing concerns among consumers and auto-industry experts alike. As the lawsuits pile up and more fatalities occur in collisions involving Teslas using Autopilot, experts and family members who have lost loved ones are desperate for their concerns to be taken seriously.
"[The accident] is one of a growing number of crashes involving Autopilot that have fueled concerns about the technology's shortcomings, and could call into question the development of similar systems used by rival carmakers."
As it stands, the National Highway Traffic Safety Administration (NHTSA) has more than two dozen active investigations open into collisions that involved Autopilot. With regard to Tesla in particular, there have been at least three deaths that can be attributed to Autopilot failing to engage. Two of those deaths involved Autopilot failing to recognize another vehicle, while the third involved Autopilot failing to recognize a concrete barrier. And those are only the deaths attributed to Tesla's Autopilot feature. NHTSA has listed at least another 10 people killed since 2016 as a result of Autopilot failing to engage.
Part of the problem, as Boudette points out, is that Tesla has far bigger goals in its sights, with full self-driving technology promised in the near future. Incidents like these shake people's confidence not only in Tesla but in the very possibility of full self-driving technology. As it stands, many argue that Tesla and Elon Musk are promising what is simply not possible at this time, with some believing the technology necessary to make self-driving vehicles a reality is still many years away.
Should Tesla, or any other company, press on toward full self-driving technology, it would be doing so in the face of problems that exist in the current technology, problems that have a real-life impact. While Tesla has maintained that when collisions occur the driver, not Autopilot, is to blame, this most recent fatality in California suggests otherwise.
It is true that many use Autopilot incorrectly. Engaging Autopilot does not give drivers permission to check out; rather, it is meant to assist with some of a vehicle's functions.
"Autopilot is not an autonomous driving system. Rather it is a suite of software, cameras and sensors intended to assist drivers and prevent accidents by taking over many aspects of driving a car - even the changing of lanes."
While drivers can be more relaxed than usual when Autopilot is engaged, checking out completely, as we've seen with drivers who have been charged after being caught reading or even sleeping while Autopilot was on, is incredibly dangerous. Should Autopilot fail to recognize hazards, or fail to respond, as happened in California, severe injuries and even fatalities can and do occur.
We can be sure at this point that Autopilot and self-driving technology are not going away. The future, as we are often told, is autonomous, but that doesn't mean we should ignore or dismiss the problems that exist with the technology as it stands. Demanding better from Tesla and other companies is perhaps more important now than it has ever been. The future may be inescapable, but making our voices heard now can help ensure that it is safer and better than the one currently on the horizon.
For more, check out the original article by Neal E. Boudette in The New York Times.