Tesla Supercharger stations are seen in a parking lot in Austin, Texas, on Sept. 16, 2024.

Brandon Bell | Getty Images

The family of a driver who died in a 2023 collision is suing Tesla, claiming that the company’s “fraudulent misrepresentation” of its Autopilot technology was to blame.

The Tesla driver, Genesis Giovanni Mendoza-Martinez, died in the crash involving a Model S sedan in Walnut Creek, California. His brother, Caleb, who had been a passenger at the time, was seriously injured.

The Mendoza family sued Tesla in October in Contra Costa County, but in recent days Tesla had the case moved from state court to federal court in California’s Northern District. The Independent first reported on the venue change. Plaintiffs generally face a higher burden of proof in federal court for fraud claims.

The incident involved a 2021 Model S, which smashed into a parked fire truck while the driver was using Tesla’s Autopilot, a partially automated driving system.

Mendoza’s attorneys alleged that Tesla and Musk have exaggerated or made false claims about the Autopilot system for years in order to “generate excitement about the company’s vehicles and thereby improve its financial condition.” They pointed to tweets, company blog posts, and remarks on earnings calls and in press interviews.

In their response, Tesla attorneys said the driver’s “own negligent acts and/or omissions” were to blame for the collision, and that “reliance on any representation made by Tesla, if any, was not a substantial factor” in causing harm to the driver or passenger. They claim Tesla’s cars and systems have a “reasonably safe design,” in compliance with state and federal laws.

Tesla didn’t respond to requests for comment about the case. Brett Schreiber, an attorney representing the Mendoza family, declined to make his clients available for an interview.

There are at least 15 other active cases centered on similar claims, involving Tesla incidents in which Autopilot or FSD — Full Self-Driving (Supervised) — had been in use just before a fatal or injurious crash. Three of those have been moved to federal courts. FSD is the premium version of Tesla’s partially automated driving system. While Autopilot comes standard in all new Tesla vehicles, owners pay an up-front fee or a monthly subscription to use FSD.

The crash at the center of the Mendoza-Martinez lawsuit has also been part of a broader Tesla Autopilot investigation by the National Highway Traffic Safety Administration, initiated in August 2021. Over the course of that investigation, Tesla made changes to its systems, including a number of over-the-air software updates.

The agency has since opened a second, ongoing probe to evaluate whether Tesla’s “recall remedy,” intended to resolve issues with Autopilot’s behavior around stationary first responder vehicles, has been effective.

NHTSA has warned Tesla that its social media posts may mislead drivers into thinking its cars are robotaxis. Additionally, the California Department of Motor Vehicles has sued Tesla, alleging its Autopilot and FSD claims amounted to false advertising.

Tesla is currently rolling out a new version of FSD to customers. Over the weekend, Musk instructed his 206.5 million-plus followers on X to “Demonstrate Tesla self-driving to a friend tomorrow,” adding that, “It feels like magic.”

Since about 2014, Musk has been promising investors that Tesla’s cars would soon be able to drive autonomously, without a human at the wheel. While the company has shown off a design concept for an autonomous two-seater called the Cybercab, Tesla has yet to produce a robotaxi.

Meanwhile, competitors including WeRide and Pony.ai in China, and Alphabet’s Waymo in the U.S., are already operating commercial robotaxi fleets and services.

WATCH: Tesla’s FSD tests were ‘incredibly good’ and we’re optimistic on the growth potential: BofA’s Murphy