The National Highway Traffic Safety Administration said Thursday that it would investigate how Tesla’s semi-autonomous driving software handles railroad crossings as part of a wide-ranging investigation into incidents in which, the agency said, Tesla vehicles violated traffic safety laws.
NHTSA said it was launching the investigation of Tesla’s Full Self-Driving software after receiving complaints from drivers, including reports of vehicles driving through red lights or on the wrong side of the road. The agency said the investigation covers all 2.9 million Tesla vehicles equipped with FSD.
The agency said its investigation would look at the software’s performance at railroad crossings, which was the subject of an NBC News investigation published in September.
“While the behaviors under investigation appear to occur most frequently at intersections, NHTSA’s investigation will encompass any other types of situations in which this behavior may arise, such as when traveling adjacent to a lane of opposing traffic or when approaching railroad crossings,” NHTSA said in a three-page summary.
Tesla and its CEO, Elon Musk, did not immediately respond to requests for comment Thursday.

NBC News’ investigation reported that Tesla vehicles using the FSD software sometimes fail to stop for train tracks or otherwise mishandle rail crossings, including when red lights are flashing and gate arms are lowering, according to Tesla drivers and videos they’ve taken. Following the report, two senators called on NHTSA to open an investigation.
In its announcement Thursday, NHTSA said it would look into whether the system gives drivers enough time to respond to FSD errors. It said that in some reported incidents, a Tesla vehicle provided “little notice to a driver or opportunity to intervene.”
NHTSA said it had identified 18 complaints and one media report alleging that a Tesla vehicle in FSD mode “failed to remain stopped for the duration of a red traffic signal, failed to stop fully, or failed to accurately detect and display the correct traffic signal state in the vehicle interface.” The agency said that, in six of the cases, the Tesla crashed with another vehicle at an intersection and that, in four cases, those crashes resulted in one or more reported injuries.
The agency said that multiple incidents occurred at one intersection in Maryland. It said it “understands that Tesla has since taken action to address the issue at this intersection.”
NHTSA said its investigation was preliminary. Possible outcomes range from no action by the agency to an update or recall of the FSD software.
Tesla has warned drivers, including in the online driver’s manual, that the FSD software does not make its vehicles fully autonomous. Tesla tells drivers to oversee the vehicle’s operation at all times, and it has sometimes added the phrase “(Supervised)” to the product name, emphasizing its view that drivers, not the software, are responsible for a vehicle’s operation.
Musk, though, has made claims that go beyond what Tesla says. In August, he said in a post on X that, with FSD, Tesla vehicles “can drive themselves,” a claim that experts say isn’t backed up by the evidence. He’s also made FSD a centerpiece of Tesla’s future, including plans for a fleet of driverless robotaxis.
Full Self-Driving is a package of driver-assistance features that Tesla owners and leaseholders can buy for $99 a month or a one-time fee of $8,000. It works with pre-installed hardware, including cameras that capture what’s around the vehicle.
Earlier this week, Tesla released an update to FSD, called version 14. It was not immediately clear how the new version handles the situations NHTSA described in its investigation announcement.