Fatal Tesla crash raises doubts about camera-based autonomous systems again

Tragedy in Arizona: Tesla FSD under scrutiny

In November 2023, a tragedy occurred in Arizona: a 71-year-old woman who had stopped on the highway to assist other drivers was struck by a Tesla. According to reports, the car was operating in Full Self-Driving (FSD) mode. It is the only reported case from the past year in which the system may have been responsible for a pedestrian fatality.

Footage from the Tesla’s camera shows the road obscured by sun glare. While other vehicles had already slowed down and witnesses tried to warn oncoming drivers, the Tesla did not react to the danger. The driver, Karl Stock, said events unfolded too quickly for him to respond.

Can vision-based systems be safe?

This crash highlights a major weakness of vision-based autonomous systems: like human eyes, cameras can be blinded by glare, fog, or smoke. Radar and lidar, by contrast, can detect obstacles even in such challenging conditions. Yet those technologies are not foolproof either: Cruise, which used radar, has also been involved in accidents.

“The Tesla vehicle crashed while operating in FSD mode under limited visibility due to sunlight glare,” notes NHTSA.

Robotaxi and the future of autonomous vehicles

Tesla is preparing for a large-scale Robotaxi launch in Austin, but incidents like this cast doubt on the technology’s readiness. Recently, a Tesla running FSD even veered off the road for no apparent reason and crashed into a tree. If the company wants autonomous vehicles to become mainstream, such failures cannot keep happening.

The company insists that its systems are safer than human drivers, but there is no independent evidence to support this claim. The question remains open: what is the safest path to autonomy, not someday but right now?

Recent events show that even the most advanced technologies require refinement. For now, we’ll have to wait and see if Tesla can resolve these issues before the large-scale rollout of Robotaxi.