Copyright Los Angeles Times

Around 2,000 Waymo vehicles are under investigation following reports of one of the self-driving taxis navigating around a school bus while children disembarked.

The National Highway Traffic Safety Administration opened a preliminary investigation on Oct. 17 to examine the performance of Waymo's autonomous technology around stopped school buses and how the system is designed to comply with school bus traffic safety laws, according to the investigation filing.

The incident occurred on Sept. 22 in Atlanta, Georgia, when a Waymo using the Mountain View, Calif., company's fifth-generation automated driving system approached the right side of a stopped school bus. The vehicle initially stopped, but then drove around the front of the bus and past the left side, the filing said. The vehicle ignored the bus's extended stop arm on the left side and its traffic crossing control arm on the right side, near where students were unloading. The bus was also flashing a red light. Prior similar incidents have likely occurred, NHTSA said.

"Safety is our top priority, as we provide hundreds of thousands of fully autonomous paid trips every week in some of the most challenging driving environments," a Waymo spokesperson said. "NHTSA plays a vital role in road safety, and we will continue to work collaboratively with the agency as part of our mission to be the world's most trusted driver."

Waymo has already made software updates to improve its self-driving performance, and has plans for additional updates, the spokesperson said. The school bus was partially blocking a driveway that the Waymo was exiting during the incident, the company said, and the vehicle kept a safe distance from children.

According to Waymo, the driverless taxis are improving road safety in the communities where they operate, achieving a fivefold reduction in injury-related crashes compared to human drivers. The company's operations have seen glitches and recalls, however.
Last month, police officers in San Bruno, Calif., observed a self-driving Waymo make an illegal U-turn at a traffic light. Officers could not issue a ticket because there was no human driver present, the San Bruno Police Department said. Instead, the department contacted the company to report the violation.

When self-driving cars violate the rules of the road, law enforcement can't penalize them the same way it can humans. As state law has been interpreted, traffic tickets can be issued only to an actual driver. California lawmakers have sought to close the enforcement loophole with legislation that will take effect in July, but critics say the law isn't strong enough.

Waymo spokesperson Julia Ilina said that the company's vehicles are already subject to close, ongoing oversight by California regulators, and that the company's autonomous driving system "is designed to respect the rules of the road."