Autonomous trucking technology is advancing rapidly. As with any emerging technology, questions arise when it fails or underperforms. A key challenge for the autonomous vehicle community is that technological development is outpacing legal frameworks, leaving liability questions unsettled.
One major issue is product liability lawsuits, in which plaintiffs sue manufacturers or sellers, alleging a product caused harm, such as injury or financial loss. Defects can stem from manufacturing, design or marketing flaws. Ignoring these legal risks could lead to fines, settlements or claims costing millions to hundreds of millions of dollars.
Costs can escalate quickly. Recent autonomy-related lawsuits have focused on robotaxis and passenger vehicles. In August 2025, Tesla was found partially liable for a 2019 crash that killed one pedestrian and injured another while one of its vehicles was in Autopilot mode. A federal jury awarded the plaintiffs $43 million for pain and suffering, plus $200 million in punitive damages.
In that case, neither the Tesla driver nor the Autopilot software braked at an intersection with a stop sign and flashing red light. Plaintiff attorneys sued both the driver and Tesla, the maker of the vehicle and its technology.
Stakes could be even higher for autonomous truck manufacturers. Currently, original equipment manufacturers (OEMs) produce the hardware, software companies develop the virtual driver, and fleets operate the vehicles. This division fuels tension between OEMs and autonomous technology providers: When an autonomous truck is involved in an accident, who bears responsibility?
This is not hypothetical. Commercial trucking fleets view accidents as inevitable, often citing crashes per million miles in safety statistics. The key question is what happens when — not if — they occur. The legal system has yet to fully resolve liability in autonomy cases. Until courts settle these issues, significant risks remain. Precedents require lawsuits, either litigated to verdict or settled privately, often under nondisclosure agreements.
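For context, the per-million-mile metric fleets cite is straightforward arithmetic. A minimal sketch follows, using invented figures rather than any real fleet's data:

```python
# Illustrative only: invented figures, not any real fleet's data.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """The standard fleet safety metric: crashes normalized per million miles."""
    return crashes / (miles / 1_000_000)

# A hypothetical fleet with 48 crashes over 80 million miles:
rate = crashes_per_million_miles(crashes=48, miles=80_000_000)
print(f"{rate:.2f} crashes per million miles")  # prints: 0.60 crashes per million miles
```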
To explore autonomy’s legal risks, FreightWaves spoke with Tray Gober, an Austin-based personal injury attorney who handles cases involving emerging autonomous and driverless vehicle accidents.
Gober observes that vehicle autonomy is not entirely new. Cruise control, introduced in the late 1950s and early 1960s, was an early example, allowing drivers to offload speed regulation to automation.
Early semi-autonomous vehicles faced challenges when drivers overestimated capabilities. “I remember as a kid jokes on TV and movies about people that bought an RV that had full cruise control and the driver comes back and is just mingling with the family because of misunderstandings of what that technology was capable of,” Gober said.
While modern autonomous vehicles far surpass cruise control, similar misunderstandings persist as manufacturers claim their virtual drivers perform at or above human levels.
Gober worries the technology is advancing too quickly. “Unfortunately, we’re seeing a rush to market, in my opinion. We’re seeing a rush to market by many of these AV companies that want to get early market share,” he said. “And as a consequence, why we’re seeing more of these incidents with AVs is these vehicles are being put out on the road with more autonomy than they can and should safely operate at.”
Trucks cause more severe accidents, in part because their mass and momentum dwarf those of passenger vehicles. Gober describes plaintiffs’ lawyers using a “kitchen sink approach,” suing multiple parties up the supply chain to recover damages.
“It’s still new enough that certainly there will be product liability claims. If an AV big rig crashes into somebody and causes injuries, potential defendants in that case would be the software manufacturer — so whoever developed the AV technology, which a lot of times is also the vehicle manufacturer. It seems those two things are being partnered up right now,” Gober said.
He added that fleets operating autonomous trucks could also face liability for any negligence attributable to the AV.
Safety is a primary selling point for autonomous vehicles: 360-degree sensor coverage, redundant computing, lidar, radar, cameras and faster-than-human reaction times all promise fewer crashes. In liability law, however, superior safety does not provide immunity.
“I think sometimes people are surprised when plaintiffs’ lawyers are wanting fewer crashes. I genuinely want that. I genuinely want a safer world. And so I’m excited that AVs are bringing fewer crashes and fewer injuries. But that’s not the right standard. That’s not the standard in the law for holding accountable these product manufacturers,” Gober said.
The core question in product liability is different. “Because at its heart, this is a product liability claim when these vehicles crash. And the question is not, well, did this truck, did this tractor-trailer, this AV vehicle operate at a level safer than what a human driver would have operated at? That’s not the question that the jury will be asked in that case,” Gober said.
“The question the jury will be asked in a product liability case is: Was there a safer alternative design for that vehicle at the time that vehicle was produced?”
Autonomous vehicles face higher scrutiny than human drivers, and human behavior, including unwritten rules of the road, complicates matters further. Media attention so far has centered on robotaxis and passenger autonomous vehicles.
Gober’s firm has seen cases where routine human decisions challenged AVs.
“We just had in Austin several robotaxis that were bypassing, driving past school buses that had the stop sign and red lights flashing. Human drivers don’t make that mistake — well, human drivers do violate that law daily. But the AV basically didn’t know better because nobody had seemingly told it to know better,” Gober said.
“And that’s just a simple example of literally day-one newly licensed human driver knows to not drive past a school bus with the red lights on. And it’s not as though the technology on the AV robotaxis was being malicious or trying to get somewhere faster. It just didn’t know better because a human hadn’t told it to know better,” he added.
One illustrative example involves four-way stops. An autonomous vehicle correctly waited its turn, but human drivers yield to larger vehicles about 70% of the time. That mismatch highlights the challenge of sharing roads with imperfect human drivers. Gober noted that humans can also feel like test subjects for the technology.
“Unfortunately, crashes are going to happen, especially when you have this mix of human drivers and AVs that maybe are driving with a slightly different set of rules. And it’s going to be hard for human drivers — we’ve seen this in Austin — to anticipate driving habits of the AVs,” Gober said.
Challenges also arise when humans cause incidents. Gober cited an Austin case where a human driver crossed an intersection, struck another vehicle, then ricocheted into an AV. “Clearly, this was not the AV’s fault. The AV is just driving down the road, and another car ricochets into them,” he said.
“The question though is: What could that AV have done to have earlier anticipated that car ricocheting into it? And could it have stopped? Could it have swerved? Would a human driver have stopped or swerved?” Gober added.
“So again, it’s not that it’s the fault of the AV. It goes back to that standard: We don’t compare AVs to humans. We compare AVs to: Was there a safer alternative design at the time it was put into the marketplace?”
Product liability can ensnare even advanced safety features. Gober offered an example involving a human driver suffering a medical event, such as a seizure.
“It has been an acceptable reasonable safety technology for vehicles to have lane assistance — keep the vehicle within the lane of travel — and to have front-rear collision, forward-looking collision avoidance. So if you’re going to run into a wall just with the radar — that’s been an accepted technology since 2014,” he said.
“And so when we have a crash where a person is driving, it’s not the car’s fault that the individual passed out, had a seizure, is unable to safely operate the vehicle. But vehicles do have the available technology to keep the vehicle within the lane and slow the vehicle down before running into that big rig that’s in front of them on the interstate.”
If safety systems are absent or fail, plaintiffs can pursue product liability claims; once a safety technology is widely accepted, its absence strengthens such cases.
Gober cited client calls about crashes where airbags failed to deploy or seatbelts unbuckled. “Those are failures of safety systems that are required under this product liability standard of safer alternative design. It’s called a crashworthiness claim. Those safety features are required as a safer alternative design,” he said.
In the ricochet example, liability hinges not on fault but on whether the AV incorporated the best available safer alternative design to avoid or mitigate impact.
Crashworthiness is a longstanding concept in automotive product liability. Manufacturers use modeling and testing to assess it, often identifying improvements between model years.
“We see this all the time with auto manufacturers where they will develop a model vehicle, they will proceed to testing of that vehicle, and the vehicle passes the minimum requirements of the national federal testing. However, they identify: ‘We could make this better. This is a flaw. This is a design issue that we could improve.’ And you will see an improvement in next year’s model,” Gober said.
Internal knowledge of safer designs creates liability risks, even if vehicles meet federal standards. Manufacturers use statistical analysis to weigh the fixed cost of a design fix against projected lawsuit exposure.
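To illustrate the kind of calculus at issue, here is a minimal sketch of that expected-value comparison. All figures and variable names are invented for illustration; real analyses are far more elaborate.

```python
# Hypothetical illustration of a fix-cost vs. liability-exposure comparison.
# Every figure below is invented for the example, not drawn from any case.

units_sold = 200_000          # vehicles produced with the flawed design
fix_cost_per_unit = 11.00     # cost to build in the safer alternative design
expected_lawsuits = 180       # projected injury claims tied to the flaw
avg_payout = 300_000.00       # projected average settlement or verdict

total_fix_cost = units_sold * fix_cost_per_unit       # $2.2 million
expected_liability = expected_lawsuits * avg_payout   # $54 million

print(f"Cost to fix every unit:    ${total_fix_cost:,.0f}")
print(f"Expected lawsuit exposure: ${expected_liability:,.0f}")
# As Gober suggests, an internal analysis like this can itself become
# evidence that a safer alternative design was known and feasible.
```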
This framework applies to human-driven vehicles but remains untested for fully autonomous ones.
The push for federal autonomous vehicle guidelines reflects this uncertainty. Without them, states will determine fault, potentially leading to conflicting rulings and eventual higher court intervention.
OEMs favor federal rules for potential preemption of state claims. “Manufacturers are going to claim that: ‘Hey, we complied with federal rules and therefore it preempts any claims against us because we got approval from the federal government on this.’ You see that especially with product liability claims against drug manufacturers where they’ll say we got FDA approval,” Gober said.
Yet internal knowledge of safer designs can undermine such defenses. “Often what we find is that these companies know more than what the federal government did when a particular product was released to market,” he said.
Autonomous big rigs face elevated risks amid a decade of rising nuclear verdicts (jury awards of $10 million or more) and liability claims. Plaintiffs increasingly target deep-pocketed manufacturers when trucking companies lack sufficient insurance or assets.
In September 2024, for example, Daimler Truck North America was hit with a $160 million nuclear verdict in a rollover crash case.
FreightWaves’ John Kingston reported: “According to the original complaint filed by Street, and based on an interview with Benjamin Baker of the law firm of Beasley Allen, which represented the plaintiff, Street’s truck was driving on U.S. Highway 84 when a pickup truck heading the other way crossed the center line into Street’s lane. Street’s actions to avert a crash led to a rollover, and the force of the truck cab’s roof caving in led to neck injuries that left Street paralyzed.”
Trailer maker Wabash National faced a $462 million verdict after a drunk driver crashed into the rear of a 2004-model trailer in 2019.
FreightWaves’ John Gallagher reported: “Wabash commented after the trial that the jury ‘was prevented from hearing critical evidence in the case, including that the driver’s blood alcohol level was over the legal limit at the time of the accident.’”
Gallagher added: “Wabash also pointed out that ‘the fact that neither the driver nor his passenger was wearing a seatbelt was also kept from the jury, even though plaintiffs argued both would have survived a 55-mile-per-hour collision had the vehicle not broken through the trailer’s rear impact guard.’”
The Wabash judgment was later reduced: The company ultimately settled for approximately $30 million, well below the $119.5 million final court judgment and the original $462 million jury verdict.
In both cases, OEMs faced massive liability despite others’ actions causing the incidents. For autonomous truck makers and technology partners, the legal landscape remains uncharted, but precedents will likely emerge through litigation — with profound implications for automation in trucking and supply chains.