Tesla crash suit alleges ‘systematic fraud’ over Autopilot


    Five Texas residents have filed a lawsuit against Tesla and a local restaurant after an alleged drink-driver ploughed a Model X into the back of two parked police cruisers.

    The complaint [PDF] cites “defects in Tesla’s safety features,” the functionality of which it alleges has been “vastly and irresponsibly overstated” to “pump Tesla’s share price and sell more cars.”

    According to the suit, filed by the five police officers involved in the incident, the unnamed driver crashed his Tesla Model X into the back of two parked police cruisers at 70mph (112kph) in February, after the officers had stopped to investigate a fourth vehicle for suspected narcotics offences.

    While no one was killed, the suit alleges the officers were “badly injured” and seeks compensation for the “severe injuries and permanent disabilities they suffered as a result of the crash” when the parked vehicles were pushed forward into “six people and a German Shepherd.”

    Canine Officer Kodiak “had to visit the vet” while the five officers and a civilian were taken to hospital. The parked cruisers “were declared a total loss,” the suit claims.

    “Even though Autopilot was enabled at the time and the police cars had flashing lights, the Tesla failed to engage the Autopilot safety features to avoid the accident,” the suit continues. “The vehicle did not apply its ‘Automatic Emergency Braking’ to slow down to avoid or mitigate the accident.”

    The suit names two defendants: Tesla, for releasing what is claimed to be an overhyped and malfunctioning safety system with a glaring blindspot for emergency vehicles with their flashing lights activated; and Pappas Restaurants, on allegations that the driver of the Tesla had “consumed alcohol to the point where he was obviously intoxicated, and he presented a clear danger to himself and others” yet “Pappasito’s Cantina continued to serve alcohol to him.”

    Not named in the suit, likely owing to an inability to contribute in any meaningful way to the $20m in combined damages sought by the plaintiffs, is the driver of the vehicle, who seemingly took the decision to drive while allegedly drunk and who failed to engage their own emergency braking system – their foot – before allegedly ploughing into the back of the parked vehicles.

    The suit claims the Model X, and by extension all the company’s other vehicles, is defective. “Tesla’s claims [about Autopilot and Automatic Emergency Braking] have been proven to be vastly and irresponsibly overstated, if not outright untrue,” the plaintiffs allege, pointing to comments both from Tesla itself and from chief exec Elon Musk.

    Musk had appeared to react positively to the news that a couple had filmed themselves having sex in their Tesla while it was being driven under Autopilot control in 2019 – something predicted by Canadian car safety expert Barrie Kirk three years earlier.

    “Tesla is engaging in systematic fraud to pump Tesla’s share price and sell more cars, while hiding behind disclosures that tell the drivers that the system can’t be relied upon,” the suit alleges. “Tesla knows that Tesla drivers listen to these claims and believe their vehicles are equipped to drive themselves, resulting in potentially severe injuries or death.”

    While it would seem fairer to some to blame the crash on the driver, the suit claims that the problem is widespread – highlighting 11 other cases of Tesla vehicles slamming into the back of parked emergency vehicles, seemingly as a result of the camera-based vision system being confused or even blinded by their flashing lights.

    “It is inconceivable that Defendant Tesla has not seen the publicly available reports regarding numerous crashes caused by its vehicles in relation to emergency vehicles with flashing lights,” the suit reads. “Tesla’s CEO has even referred to one of Tesla’s driver assistance systems as ‘not great.’ Defendant Tesla, Inc. and the company’s CEO, Elon Musk, were aware of numerous incidents regarding the ‘Autopilot’ system, but failed to recall the cars and fix the issue.

    “Tesla has intentionally decided not to remedy these issues and must be held liable and accountable, especially when it has detailed knowledge of the risks and dangers associated with its Autopilot system. Tesla has admitted that its Autopilot system will occasionally fail to identify a stopped emergency vehicle. But yet, Tesla made the decision not to recall any of its vehicles knowing that the Autopilot system was defective and posed an inherent risk of injury to the public, including first responders, and Tesla drivers.”

    It’s an accusation with which the US National Highway Traffic Safety Administration’s (NHTSA) Office of Defects Investigation is currently engaging. Earlier this year it announced a probe into the apparent tendency for Tesla S, Y, X, and 3 vehicles under Autopilot or Traffic Aware Cruise Control modes to fail to identify and avoid parked emergency vehicles at scenes marked with “vehicle lights, flares, an illuminated arrow board, and road cones.”

    National Transportation Safety Board chair Robert Sumwalt, meanwhile, summarised his organisation’s investigation into a fatal Model X crash in 2018 as showcasing “system limitations” in Autopilot while saying “it’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars. Because they don’t have driverless cars.”

    When faced with such criticism, Musk’s usual defence is to point to claims that Tesla’s Autopilot system is “ten times safer” than a vehicle under manual control, with one crash logged for every 4.19 million miles driven compared to the NHTSA’s average of one crash per 484,000 miles. The suit alleges this compares “apples to oranges”, owing to Autopilot’s near-exclusive use during highway travel while “a large percentage of the crashes found in the NHTSA data occur off-highway.”

    “Further, when one considers that Musk excludes data where Autopilot was being used immediately before a crash but was disengaged at some point prior to the crash,” the suit continues, “Musk’s contentions are not only unpersuasive, they are misleading.”
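
    For what it’s worth, a quick back-of-the-envelope check – a sketch using only the figures quoted above, not numbers taken from the filing – puts the raw gap closer to nine times than ten, even before the highway/off-highway mix is considered (a few lines of Python, purely illustrative):

        # Rough sanity check of the crash-rate comparison, using only the figures
        # quoted above (the article's numbers, not data taken from the court filing).
        autopilot_miles_per_crash = 4_190_000  # one logged crash per 4.19 million miles on Autopilot
        nhtsa_miles_per_crash = 484_000        # NHTSA average: one crash per 484,000 miles

        ratio = autopilot_miles_per_crash / nhtsa_miles_per_crash
        print(f"Autopilot logs crashes roughly {ratio:.1f} times less often on these figures")
        # Prints roughly 8.7 - before accounting for the highway-heavy use of Autopilot
        # that the plaintiffs say makes the comparison "apples to oranges".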

    The suit demands a jury trial against both defendants, and seeks $10m for “actual damages for pecuniary losses, lost wages, loss of earning capacity, mental anguish, and past, present, and future medical expenses” plus an additional $10m in exemplary damages. It does not, however, detail the plaintiffs’ actual injuries.

    “Defendant Tesla, Inc. recommended, sold, and distributed the Autopilot system at issue. The subject product was defective and unreasonably dangerous in manufacture and marketing when it left the control of Tesla, Inc,” the suit concludes.

    “The system at issue failed to perform safely, as an ordinary consumer would expect when using it in an intended and/or reasonably foreseeable manner. The risk of danger inherent in the design of the Autopilot system outweighed the benefits of the design utilised. At all relevant times and at the time of injury, it was reasonably foreseeable by the Defendant that the Autopilot system would malfunction.”

    Neither counsel for the plaintiffs nor Tesla had responded to a request for comment by the time of publication. ®




