Italo Frigoli was trying out the Full Self-Driving software of his Tesla one evening in June when he came upon a railroad crossing. The arms were descending and lights were flashing as a train barreled toward the intersection.
For most human drivers, the gate arms and lights are clear signals to stop, but for Frigoli’s Tesla, which was driving in a semiautonomous mode, the potentially deadly situation didn’t seem to register.
“It felt like it was going to run through the arms,” he said. “So obviously I just slammed on the brakes.” He stopped just a few feet from the crossing near his home in North Texas, barely avoiding disaster.
Video from the car’s cameras, reviewed by NBC News, appears to support his account. And this month, when NBC News accompanied him to the same railroad crossing, his Tesla software had the same problem. While cameras were rolling, his Tesla’s software failed to detect an oncoming train, forcing Frigoli to manually brake.
Frigoli avoided the potential crashes, but his experiences highlight a recurring complaint among some Tesla drivers about the company’s self-driving technology: The software sometimes mishandles railroad crossings, including by failing to stop for them.
Tesla’s Full Self-Driving (FSD) software is an add-on package of driver-assistance features that the company touts as “the future of transport,” capable of navigating “almost anywhere with your active supervision.”
In interviews, six Tesla drivers who use FSD said they experienced problems with the technology at rail crossings, and four of them provided videos. NBC News also found seven other Tesla driving videos posted online showing similar mishaps, dating from June 2023 through this August. Those drivers declined to be interviewed.
The complaints are even more widespread on Tesla internet forums, where drivers describe similar mishaps, though usually without posting videos. NBC News found 40 examples on Reddit, X and YouTube since 2023, including posts as recent as August.
Regulators with the National Highway Traffic Safety Administration (NHTSA) told NBC News in a statement that they had raised the issue with Tesla.
“We are aware of the incidents and have been in communication with the manufacturer,” the agency said.
“NHTSA discusses issues frequently with manufacturers and prioritizes the safety of all road users,” the statement went on. “The agency continuously analyzes consumer complaints to determine whether a potential vehicle safety defect trend exists. We will continue to enforce the law on all manufacturers of motor vehicles and equipment, in accordance with the Vehicle Safety Act and our data-driven, risk-based investigative process.”
Musk has said autonomous technology is crucial to Tesla, and he has bet the company’s future on the success of self-driving cars and artificial intelligence-powered robots. Tesla robotaxis are on the road in Austin, Texas, and planned for other cities. In July, Musk said that an unsupervised version of FSD — one that doesn’t require monitoring by a human driver — could be available this year “in certain geographies” and that his goal is to have Tesla robotaxis available to half the U.S. population by the end of the year.
But questions remain about the technology the company is using to guide its vehicles. Experts said that Tesla’s FSD software is a black-box AI model whose errors can’t easily be explained, even by its creators, and that Tesla engineers most likely hadn’t included enough railroad crossing examples in the videos used to train it.
Tesla and Musk didn’t respond to requests for comment on the railroad crossing complaints. Musk doesn’t appear to have spoken about the complaints, but he has said Tesla is planning a major update to the FSD software as soon as late September.
Drivers described a range of malfunctions. Frigoli and other drivers said their vehicles didn’t recognize flashing lights or lowering gate arms. Some said their Teslas haven’t slowed down even when there are trains in front of them. Other drivers said their cars have stopped on top of railroad tracks when there are red lights ahead, which could pose a danger if the arms come down before the lights turn green.

In one video posted online, a Tesla initially stops at a crossing, but then, after the gate arms begin to lower, a set of traffic lights farther down the road turns green and the car tries to proceed through the arms seconds before a train arrives. Still other drivers reported that their cars turned onto the tracks themselves.
Experts warn that Tesla and Musk are courting disaster.
“If it’s having trouble stopping at rail crossings, it’s an accident waiting to happen,” said Phil Koopman, an associate professor emeritus of engineering at Carnegie Mellon University.
“It’s just a matter of which driver gets caught at the wrong time,” he said.
Tesla FSD doesn’t mishandle every railroad crossing every time, and some drivers have posted videos online to celebrate successful instances. But a single error can have catastrophic consequences. One of the six drivers who spoke to NBC News said his vehicle handled a rail crossing appropriately in August after failing to do so earlier this year; he said the more recent experience led him to think Tesla may have resolved the problem, but experts said there’s no way to be sure.
In June, a Tesla in FSD mode drove itself onto a set of train tracks in eastern Pennsylvania and was hit minutes later by a Norfolk Southern freight train, according to local authorities who spoke with the driver. That driver was lucky: The train struck only a glancing blow to the car’s side, and the driver and his passengers had exited the car before the crash.
“They said when they got to the tracks, the car just turned left,” said Western Berks Fire Commissioner Jared Renshaw, who interviewed the driver. “The car was in self-driving mode, and it just turned left.” The incident received some news coverage at the time, but Tesla hasn’t addressed it, and the driver’s identity hasn’t been made public.
Frigoli said that if any Tesla should handle rail crossings well, it should have been his. He drives a 2025 Model Y with the latest Tesla self-driving hardware, known as HW4, and the most recent version of the software, FSD 13.2.9. He also had good driving conditions, with a mostly clear sky and no one on the road ahead of him, as his car approached the train tracks in June.
“I would think with flashing red lights the car should stop on its own,” he said. “In future iterations of FSD, hopefully they’ll code it to work and recognize the railroad crossings correctly.”
The train incidents have tested the faith of some otherwise-satisfied Tesla drivers.
“It’s kind of crazy that it hasn’t been addressed,” said Jared Cleaver, a Tesla owner in Oakland, California.
Cleaver, a project manager in the construction industry, said he was driving his 2021 Tesla Model 3 last fall when he approached a railroad crossing near downtown Oakland. He said he had FSD engaged and was watching the car carefully.
“The car came to a complete stop, and I was like, ‘OK, we’re good.’ And then the car just jumped forward like it was going to go,” he said.
He said he slammed on the brakes. “I don’t know if it would have just jerked and stopped, but I wasn’t going to wait to find out,” he said. He said the same thing happened again this month and that this time he was able to document the mishap on video, which he shared with NBC News.
Cleaver said he loves his car and uses FSD often. He said it sometimes amazes him but also makes dumb mistakes.
“I think it doesn’t perform nearly as well as Elon claims and Tesla claims, but I think it is good,” he said. “They seem to make a habit out of making these really big claims and then falling short. It bothers me.”
“It seems like borderline false advertising,” he said.
Tesla has previously been accused of exaggerating the capabilities of its software, including in a wrongful-death trial this summer over a different piece of Tesla software known as Autopilot. A jury in that case in Miami awarded $243 million to the plaintiff, finding Tesla 33% responsible for a crash. Autopilot is a narrower set of driver-assistance features that includes lane control and blind spot monitoring. Tesla has asked the trial judge to set aside the jury’s verdict or order a new trial.
Full Self-Driving is a package of driver-assistance features that Tesla owners and leaseholders can buy for $99 a month or a one-time fee of $8,000. The software works with preinstalled hardware, including cameras that capture what’s around the vehicle. Despite the name, the software doesn’t make a Tesla autonomous, and it requires constant human supervision.
FSD has limited appeal. Musk said in July that “half of Tesla owners who could use it haven’t tried it even once,” and a survey of U.S. consumers in August found that only 14% said FSD would make them more likely to buy a Tesla. (Musk didn’t define the phrase “owners who could use it.”)
There are six levels of driving automation, according to a rating system developed by SAE International, a professional association for engineers. Level 0 indicates no automation, and Level 5 represents full self-driving under all conditions without human intervention. Tesla classifies FSD as a “Level 2” system, and it tells drivers in its online manual that FSD requires active human supervision at all times.
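For readers keeping score, the intermediate levels can be summarized in a compact sketch. The level names below follow the SAE J3016 standard; the one-line descriptions are paraphrased, not SAE’s official wording.

```python
# Sketch of the six SAE J3016 driving-automation levels.
# Level names follow the SAE standard; summaries are paraphrased.
SAE_LEVELS = {
    0: "No Driving Automation: the human does all the driving",
    1: "Driver Assistance: steering OR speed support (e.g., adaptive cruise)",
    2: "Partial Automation: steering AND speed support; human must supervise",
    3: "Conditional Automation: system drives in limited conditions; human must take over on request",
    4: "High Automation: no human fallback needed within a defined operating domain",
    5: "Full Automation: drives everywhere, under all conditions",
}

# Tesla classifies FSD as Level 2: the software steers and controls speed,
# but a human driver must actively supervise at all times.
print(f"Tesla FSD: Level 2. {SAE_LEVELS[2]}")
```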
NHTSA said in October that it was investigating Tesla FSD’s ability to safely navigate through fog, glaring sun or other “reduced roadway visibility conditions.” Tesla has not provided an update on that investigation, and NHTSA says it remains open.
Musk, though, has continued to make claims that go beyond what his company says. He recently asserted that, with FSD, Tesla vehicles “can drive themselves,” a claim that experts say isn’t backed up by the evidence.
The rail industry has warned for years about the potential danger of autonomous vehicles. In 2018, the Association of American Railroads, a trade group, told federal regulators in a letter that it was a complicated problem, requiring self-driving cars to recognize “locomotive headlights, horns, and bells,” because not all rail crossings have gates and flashing lights. (Some rail crossings have only white X-shaped signs known as “crossbucks.”)
“Just as the behavior of automated vehicles must be governed as they approach busy roadway intersections, so too must their behavior be governed as they approach highway-rail grade crossings. Rail corridors must be afforded respect,” the association said. An association spokesperson said the rail industry’s position hasn’t changed but didn’t comment on specific incidents. Norfolk Southern declined to comment on the Tesla crash in Pennsylvania.
Last year, 267 people died at railroad crossings, according to the Federal Railroad Administration, which monitors the safety of rail crossings nationwide. It doesn’t track the makes or models of the vehicles involved or whether the vehicles were using autonomous software. The FRA said it was aware of some Tesla incidents but declined to comment.
The issue with Tesla FSD persists despite two examples that got widespread attention: a May 2024 viral video in which a Tesla in Ohio narrowly avoided colliding with a train and a video from this July in which a Tesla robotaxi test rider said his vehicle failed to see a railroad crossing in Austin.
Joe Tegtmeyer, the robotaxi rider and a Tesla booster, said in a video on X that his vehicle was stopped in an area with a traffic light and a railroad crossing. Then, he said, it began to move at precisely the wrong moment.
“The lights came on for the train to come by, and the arms started coming down, and the robotaxi did not see that,” he said. He said a Tesla employee sitting in the front passenger seat overseeing the robotaxi had to stop the vehicle until the train had passed.
Tegtmeyer declined an interview request. Responding to a question from NBC News, he said in a post on X that he had taken 122 successful rides in Tesla’s robotaxi service and that he thought the service, which began in June in Austin, worked well overall. The Tesla robotaxis use a new version of the FSD software, different from the consumer version, according to Musk.
One Tesla driver said his vehicle didn’t make the same mistake twice. Nathan Brassard of Brunswick, Maine, said his Tesla Cybertruck in FSD mode correctly handled a recent train crossing by braking before the white stop line on the road — an improvement, he said, from an earlier experience when it stopped on the tracks at a red light and he had to take over.
“It’s fun for me to watch it get better,” he said. “I don’t mind when I need to step in. It’s just so much easier than driving, especially on long trips.”
But experts in autonomous technology said there’s no way to be sure whether Tesla’s FSD software is improving, because outsiders have little to no visibility into how it works. According to Musk, the latest versions of FSD don’t even rely on human-written computer code; they are end-to-end neural networks that learn solely from training data rather than following specific rules. Musk has compared the approach to ChatGPT.
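To illustrate the distinction, here is a minimal, hypothetical sketch in Python. It does not reflect Tesla’s actual code, which isn’t public; it only contrasts a hand-written rule, whose behavior an engineer can read and audit, with an end-to-end learned policy that maps camera pixels directly to a driving decision.

```python
import numpy as np

def rule_based_policy(scene: dict) -> str:
    """Hand-written rule: the rail crossing behavior is explicit and auditable."""
    if scene.get("gate_arms_lowering") or scene.get("crossing_lights_flashing"):
        return "stop"
    return "proceed"

def end_to_end_policy(camera_frames: np.ndarray, weights: np.ndarray) -> str:
    """Learned policy: pixels in, decision out. There is no 'railroad crossing'
    rule anywhere to inspect; the behavior is whatever training produced."""
    score = float(camera_frames.flatten() @ weights)  # stand-in for a deep network
    return "stop" if score > 0.0 else "proceed"

rng = np.random.default_rng(0)
frames = rng.random((2, 4))                 # stand-in for camera input
weights = rng.standard_normal(frames.size)  # stand-in for learned parameters

print(rule_based_policy({"gate_arms_lowering": True}))  # "stop", by explicit rule
print(end_to_end_policy(frames, weights))               # whatever the weights say
```

If a policy like the second one fails at a crossing, the experts’ point goes, there is no single line of code to fix; there is only training data to adjust.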
Koopman of Carnegie Mellon said Tesla’s choice of training data — the hours of driving video that it feeds into the software to help it learn — is likely to be the root of the rail crossing issue. He said that engineers at Tesla have to choose which videos go into the training data and that it’s not known how many train examples they included.
“The only possible explanation is that it is not sufficiently trained on that scenario,” he said.
He also echoed a complicating factor raised separately by the rail industry: Not all train crossings are the same. Some have white stop lines on the road, but others don’t. Some have flashing lights and gate arms, but many don’t. Lights and gate arms can also malfunction.
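One way to picture the training-data concern: if each training clip carried a scenario label, a simple audit could show how thin the coverage of rail crossings, and of each crossing variant, actually is. The tags and counts below are invented for illustration; how Tesla labels or samples its fleet data isn’t public.

```python
from collections import Counter

# Invented scenario tags for a hypothetical training corpus.
clip_tags = (
    ["urban_intersection"] * 50_000
    + ["highway_merge"] * 30_000
    + ["rail_crossing_gated"] * 120         # rare overall...
    + ["rail_crossing_crossbuck_only"] * 9  # ...and some variants rarer still
)

counts = Counter(clip_tags)
total = sum(counts.values())
for tag, n in counts.most_common():
    print(f"{tag:32s} {n:7d}  ({n / total:.4%})")

# In this invented mix, gated crossings appear in roughly 0.15% of clips,
# and crossbuck-only crossings almost never; that pattern is consistent with
# the failure mode the experts describe, where rare scenarios are unreliable.
```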
Waymo, the market leader in robotaxis and a Tesla competitor, may have taken a different, more cautious approach to rail crossings. Waymo began “rider only” autonomous rides in 2019, and for years afterward, some Waymo customers speculated in social media posts that the cars were deliberately routing around railroad tracks because of the possible risk.
A Waymo representative said the company’s software considers many factors when it draws up a route and confirmed that rail crossings are one such factor. She added that Waymo vehicles have been regularly crossing heavy rail lines, not just light rail lines, since 2023. She said Waymo uses audio receivers to detect train sounds, and the company has said it has a model rail crossing at a training facility in California.
NBC News didn’t find examples of customers complaining that Waymo vehicles nearly crashed into gates or ignored flashing lights.
“The team at Waymo has invested copiously in developing a safety model and developing a safety culture that seems to be working well,” said Bryan Reimer, a research scientist at the Massachusetts Institute of Technology’s Center for Transportation and Logistics.
Tesla said last year that it planned to begin using audio inputs for better handling of situations involving emergency vehicles, but it’s not clear whether it does so for trains or train crossings. The company didn’t respond to questions about it.
Reimer said he would expect Tesla’s FSD software to be able to handle rail crossings without supervision if Musk is serious that the cars can drive themselves, including as robotaxis.
“You’d think they’d be able to reliably detect this stuff,” he said.