Tesla is once again at the center of controversy over a fatal crash involving its Autopilot system. Attorneys have filed a new wrongful death lawsuit claiming that Autopilot failed to recognize a motorcyclist, leading to a collision.
Details of the Tragic Incident
According to the law firm Osborn Machler & Neff, 28-year-old Jeffrey Nissen Jr. was riding his motorcycle on State Route 522 in Snohomish County, Washington, on the evening of April 19, 2024. He had stopped in congested traffic when he was struck from behind by a Tesla Model S driven by Carl Hunter.
Hunter reportedly did not realize at first that a collision had occurred and continued moving forward, pinning the motorcyclist under the car, which likely caused his death.
The Role of the Driver and the Autopilot System
According to the attorneys, the Tesla driver initially told emergency services he did not know how the accident happened. However, he later admitted that he was relying on the Autopilot system and may have been distracted, looking at his phone at the moment of impact. Following this admission, he was arrested on suspicion of vehicular homicide.
“Tesla created a system that encourages distraction,” the plaintiffs’ attorneys argue. “Drivers think the car can do more than it actually can, and the results are predictable and tragic.”
The lawyers claim that Tesla has long been aware of Autopilot’s difficulty identifying motorcycles and other small vehicles. They add that the company overstated the system’s capabilities, downplayed its limitations, and encouraged drivers to trust it in situations it cannot handle safely.

System Warnings and Responsibility
Despite these alleged shortcomings, the attorneys suggest the car did detect something: in their view, there is evidence that Hunter dismissed or ignored a Forward Collision Warning shortly before the crash.
Tesla positions Autopilot as an advanced driver-assistance system that enhances safety and comfort while driving. However, the company emphasizes that the system is intended for use by a fully attentive driver and does not turn a Tesla into a fully autonomous vehicle.

Marketing and Regulatory Issues
Before activating Autopilot, drivers must agree to keep their hands on the steering wheel and to retain control of and responsibility for the vehicle. Although Tesla has changed the wording over time, archived versions of its website show that at the time of the accident, the company stated its systems were intended for use by a fully attentive driver ready to take over at any moment.
However, last month the California Department of Motor Vehicles found that Tesla violated state law by using the terms “Autopilot” and “Full Self-Driving” in marketing its electric vehicles. The agency objected to wording on Tesla’s website claiming the system could complete both short and long trips without any action from the person in the driver’s seat, a claim it found to be untrue.

This case once again raises difficult questions about how responsibility is divided between drivers and semi-autonomous driving systems. The technology is evolving faster than regulation and the formation of clear rules for its use. Tragic events like this become focal points around which public debate and legal precedent take shape, influencing the future of human-machine interaction on the road. The outcome of such lawsuits could significantly affect not only Tesla’s policies but also how all automakers market and develop driver-assistance systems.