
Autonomous car software failure: Who will be to blame?

The concept of autonomous and semi-autonomous cars has inspired a great deal of discussion lately. One frequently voiced concern is who bears responsibility when a malfunction in autopilot software leads to an accident, injury or death. That question has come into sharper focus following a complaint filed against Tesla, Elon Musk’s ambitious car company.


The complaint against Tesla

The complaint, filed in San Jose, California on April 19, comes from three owners of Tesla cars equipped with what the company calls “Enhanced Autopilot” technology, which they claim was not functional and which also lacked standard safety features. According to the complaint, Tesla customers who purchased vehicles with this optional autopilot software paid a $5,000 premium over the standard price of the car, only to find that the software did not yet function as correctly and safely as they were led to believe, with further complications arising from the company’s subsequent software updates.

Tesla responds to claims of dangerous software

Tesla has responded to the lawsuit in a dismissive fashion, calling it a “disingenuous attempt to secure attorney’s fees posing as a legitimate legal action, which is evidenced by the fact that the suit misrepresents many facts”. The company also released a statement explaining that the Enhanced Autopilot software is still being updated frequently and now already includes some of the functions that were claimed to be missing in the pending complaint.  

The statement reads, in part, “We have always been transparent about the fact that Enhanced Autopilot software is a product that would roll out incrementally over time, and that features would continue to be introduced as validation is completed, subject to regulatory approval. The inaccurate and sensationalistic view of our technology put forth by this group is exactly the kind of misinformation that threatens to harm consumer safety.”

Questions of responsibility remain

While enlightening with regard to this case, the situation still raises the question of who will be held at fault if autonomous driving software does prove to be fraught with malfunctions that make it less safe for drivers. For now, it seems that only time will tell. In the meantime, Tesla continues to grow in public interest and popularity, with 400,000 consumers already on the waiting list for its upcoming Model 3 sedan, which will go into production later this year. Musk himself has assured loyal and future customers that the Tesla models sold with the hardware required for the Enhanced Autopilot software, which includes sonar, cameras, and sensors, will eventually be capable of self-driving.

If you or someone you know have experienced issues related to autonomous vehicle software or operation, we’d like to hear about your case. Contact us today for a free consultation by submitting our convenient contact form.
