A recent jury decision in Florida has sent ripples through the auto industry, raising big questions about the safety of autonomous driving systems and the responsibilities automakers carry when drivers misuse them.
On August 1, a Miami jury found Tesla partly liable for a 2019 crash that killed a woman and seriously injured her boyfriend. The verdict marked the first time a jury has found the electric carmaker legally responsible for a crash involving misuse of its Autopilot feature, per AP News.
The driver in the case, George McGee, admitted to being distracted by his phone just before his Tesla slammed into a parked car at over 60 miles per hour on a dark stretch of road. While McGee took responsibility for his actions, he also testified that he relied heavily on Autopilot, believing the vehicle would automatically warn him or brake if an obstacle was in the way.
“I trusted the technology too much,” he told the court.
Jurors agreed that Tesla’s software was partially responsible, concluding that the company failed to include proper safeguards to prevent misuse of the system. The case is particularly impactful because McGee used the system on a road that Autopilot was not designed to handle, according to Axios, something the plaintiffs argued Tesla should have made impossible through better design limits.
Tesla, which plans to appeal, said in a statement that the verdict “is wrong.”
The ruling comes as advanced driver-assistance systems (ADAS) like adaptive cruise control, lane-centering, and hands-free highway driving become increasingly common in cars. Automakers, including Ford, GM, BMW, and Hyundai, are all investing heavily in these features, hoping to market them as high-tech conveniences that ease the burden of daily driving.
But some safety experts argue that the gap between what these systems can do and what drivers think they can do is dangerously wide.
According to the Insurance Institute for Highway Safety (IIHS), features such as automatic emergency braking and lane-keeping assist do show some measurable safety benefits. However, some partially autonomous systems, especially those that reduce driver engagement, can reportedly pull users into a false sense of security.
Research further shows that many drivers treat these systems as if the car can drive itself, despite warnings to remain alert and keep their hands on the wheel.
IIHS recently rolled out its first-ever safety ratings for partially automated driving systems, evaluating how well they monitor driver attention and respond in emergencies.
The results showed that nearly all major automakers earned “marginal” or “poor” scores. Only Lexus achieved an “acceptable” rating; none received the IIHS’ top-tier “good” grade for their ADAS systems.
In response to the growing concerns, the National Highway Traffic Safety Administration (NHTSA) announced this week that it has launched a new study into driver monitoring systems.