Tesla sued for 'defective' Autopilot in wrongful death suit of Florida driver who crashed into tractor-trailer

The accident was similar to a fatal 2016 crash in Florida.

The family of a 50-year-old Tesla driver who died when his car collided with a semitrailer in Florida is suing the automaker over its allegedly "defective" Autopilot and safety features.

Lawyers for Jeremy Banner filed a wrongful death lawsuit in Palm Beach County on Thursday. Banner died on March 1, 2019, when his 2018 Tesla Model 3 crashed into a semitrailer crossing its path on a Florida highway, shearing off the car's roof. He had turned on Autopilot 10 seconds before the crash, according to a National Transportation Safety Board (NTSB) investigation.

"We're not just talking about the consequences of this effect to the Banner family, which are horrific," the Banner family's attorney Lake H. Lytal III told reporters at a press conference on Thursday afternoon. The point is "to open people's eyes and make them realize that these products are defective. We've got hundreds of thousands of Tesla vehicles on our roadways right now. And this is what's happening."

In an email to ABC News, Tesla referred to its May statement about Banner's accident.

"Shortly following the accident, we informed the National Highway Traffic Safety Administration and the National Transportation Safety Board that the vehicle’s logs showed that Autopilot was first engaged by the driver just 10 seconds prior to the accident, and then the driver immediately removed his hands from the wheel," a Tesla spokesperson said in the statement. "Autopilot had not been used at any other time during that drive. We are deeply saddened by this accident and our thoughts are with everyone affected by this tragedy."

In the lawsuit, Lytal referred to a similar accident in 2016, in which a Tesla Model S on Autopilot crashed into a semitrailer on a Florida highway, killing driver Joshua Brown.

"I'm going to take you through our accident, and it just kind of lets everybody know how similar it is to the failure that happened in 2016, in the same product defect that they dealt with in 2016. The same problems are happening right now," Lytal said.

In Banner's final moments, neither the system nor the driver stopped the vehicle, which was traveling at about 68 miles per hour at the moment of impact, NTSB investigators said. His hands were not on the wheel in the final eight seconds before the crash, according to their report.

The lawsuit says Banner believed his Model 3 "was safer than a human-operated vehicle because Defendant, Tesla claimed superiority regarding the vehicle’s autopilot system, including Tesla’s 'full self-driving capability,' Tesla’s 'traffic-aware cruise control,' Tesla’s 'auto steer lane-keeping assistance' and other safety-related components," which it claimed would "prevent fatal injury resulting from driving into obstacles and/or vehicles in the path of the subject Tesla vehicle."

Tesla "specifically knew that its product was defective and would not properly and safely avoid impacting other vehicles and obstacles in its path," the lawsuit claimed, adding that the company "had specific knowledge of numerous prior incidents and accidents in which its safety systems on Tesla vehicles completely failed causing significant property damage, severe injury and catastrophic death to its occupants."

In addition to Brown's 2016 crash, Tesla has been under fire for several crashes, some fatal, that occurred while cars were in Autopilot mode, including: a Jan. 22, 2018, crash in which a Tesla collided with a fire truck in Culver City, California; a fatal March 23, 2018, crash involving two other cars in Mountain View, California; a fatal May 8, 2018, crash in Fort Lauderdale, Florida; a May 11, 2018, crash in Utah; a May 29, 2018, crash in Laguna Beach, California; and an Oct. 12, 2018, crash on the Florida Turnpike.

In its statement to ABC News, Tesla said its cars are safe when Autopilot is used properly.

"Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance. For the past three quarters, we have released quarterly safety data directly from our vehicles which demonstrates that,” the Tesla spokesperson said.

In June, the Insurance Institute for Highway Safety (IIHS) released a study revealing how the names manufacturers use for driver-assistance systems, particularly "Autopilot," "can send the wrong message to drivers regarding how attentive they should be."

The survey found that the name "Autopilot" "was associated with the highest likelihood that drivers believed a behavior was safe while in operation, for every behavior measured, compared with other system names."

ABC News’ Mina Kaji contributed to this report.