Two U.S. senators urged safety regulators on Monday to launch a formal probe into Tesla’s “Full Self-Driving” software after an NBC News investigation earlier this month reported complaints from Tesla drivers who said the software sometimes failed to detect oncoming trains at railroad crossings.
Sens. Edward Markey, D-Mass., and Richard Blumenthal, D-Conn., issued the request in a letter to the National Highway Traffic Safety Administration (NHTSA), which oversees vehicle safety. They criticized the semi-autonomous software, which Tesla sells under the name Full Self-Driving (Supervised).
“Because collisions between trains and cars often cause significant fatalities and injuries, FSD’s failure to safely navigate railroad crossings creates serious risk of a catastrophic crash,” the senators wrote.
“We urge NHTSA to immediately launch a formal investigation into this disturbing safety risk and take any necessary action to protect the public,” they wrote.
NHTSA did not immediately respond to a request for comment on the letter. The agency said in a statement earlier this month that it was aware of the incidents and had raised the issue with Tesla.
Tesla did not immediately respond to a request for comment. The company says in its manual that FSD requires active human supervision at all times.
Tesla CEO Elon Musk said on X last week that he planned to release a new version of the FSD software this week. It’s not clear how the new version would be different or whether it would address driver complaints about rail crossings.
In interviews with NBC News, six Tesla drivers who use FSD said they experienced problems with the technology at rail crossings, and four of them provided videos. NBC News also found seven other Tesla driving videos posted online showing similar mishaps, as well as 40 written complaints on Tesla internet forums and social media.
Tesla drivers, including some otherwise-satisfied customers, said they were frustrated that the errors persisted despite similar complaints appearing online since at least 2023.
In one example cited by the senators, a 2024 video showed a Tesla in Ohio, operating in Full Self-Driving mode, failing to detect a moving train or stop on its own, then slamming into a crossing arm and skidding off the road.
Experts in autonomous technology said that Tesla’s FSD software is a black-box AI model whose errors can’t be easily explained even by its creators, and that Tesla engineers most likely hadn’t included enough railroad-crossing examples in the videos used to train the software.
Citing the reporting, the senators wrote: “These terrifying incidents demonstrate both the limitations of Tesla’s technology and confusion around the branding of FSD.” They added that they did not believe NHTSA speaking with the manufacturer was a sufficient response.