Last month, after weeks of pressure from the Austin Independent School District, Waymo issued a voluntary recall of its autonomous vehicles due to a software issue that it said it had already patched.
Weeks later, the issue still has not been fixed, as new video has emerged of Waymo robotaxis putting students in danger.
Waymo quick facts:
- Waymo One available 24/7 to customers in Los Angeles, Phoenix, and the San Francisco Bay Area, as of July 2025
- Founded in 2009
- Passed first U.S. state self-driving test in Las Vegas, Nevada, in 2012 (Source: IEEE Spectrum)
- Spun out from Alphabet as separate subsidiary in 2016
In November, the Austin Independent School District publicized videos of the company’s robotaxis driving past Austin school buses with their stop signs and crossing bars deployed.
Waymo robotaxis were committing school bus traffic violations an average of 1.5 times per week in Austin, Texas, from the start of the school year to November 20.
Austin ISD said it had been in contact with Waymo for weeks about the issue, going so far as to request that the company halt operations between 5:20 a.m. and 9:30 a.m. and from 3 p.m. to 7 p.m. until the issue was actually fixed.
The school district said the company had assured it that a software update addressing the issue had already been implemented.
Waymo cars were a hazard on the road during a recent blackout in San Francisco.
Photo by Anadolu on Getty Images
Waymo safety issue stretches into December
On Dec. 1, after Waymo received its 20th citation of the current school year from Austin ISD, the district released video of the previous infractions to the public.
On Dec. 5, Waymo announced that it would file a voluntary recall “early next week” to address the issue.
At the time, the company said it had identified the cause of the violations and believed that the software updates it implemented by November 17 “have meaningfully improved performance to a level better than human drivers in this important area.”
Related: Waymo exec admits harsh truth about company’s safety record
But this wasn’t the first time that Waymo has faced scrutiny over this very issue.
The National Highway Traffic Safety Administration (NHTSA) opened a Preliminary Evaluation in October covering an estimated 2,000 Waymo vehicles equipped with the company’s 5th-generation automated driving system, following a Georgia media report that revealed the same type of school bus violation.
The agency opened another investigation following the Austin ISD’s actions.
“ODI is concerned that ADS-equipped vehicles exhibiting such unexpected driving behaviors or not complying with traffic safety laws concerning school buses may increase the risk of crash, injury, and property damage,” NHTSA officials said.
Waymo safety issues stretch into January
Despite multiple supposed software fixes, multiple NHTSA investigations, and a recall, Waymo vehicles are still passing Austin ISD school buses with their stop signs deployed, putting children in danger.
Austin ISD has since issued another citation to Waymo, saying that four more violations occurred after December 10, keeping the company at its pace of roughly 1.5 violations per week.
Local news station KXAN has obtained and viewed videos showing at least 24 violations in which school bus cameras captured Waymo vehicles illegally passing the buses.
TheStreet has not viewed the December violations, but KXAN describes one of the videos.
Once again, Waymo says its software update is working.
“We have met with Austin ISD, including on a collaborative data collection of various light patterns and conditions and are reviewing these learnings. We have seen material improvement in our performance since our software update,” a Waymo spokesperson told KXAN.
While Waymo’s 24 violations pale in comparison to the more than 7,000 committed by human drivers, 98% of the human drivers who receive one violation never receive another, according to Austin Assistant Chief Travis Pickford.
“That tells us that the person is learning but it does not appear the Waymo automated driver system is learning through its software updates, its recall, what have you, because we are still having violations all the way up until last Monday,” Pickford told KXAN.
Waymo safety record isn’t what it seems
While the Austin ISD incident was the most high-profile, it wasn’t Waymo’s only big mistake in December.
The week before Christmas, Waymo was forced to suspend service in San Francisco after its vehicles apparently failed to follow the rule that an intersection with inoperable traffic lights must be treated as a four-way stop.
A massive blackout in the city of more than 800,000 residents left Waymo vehicles thoroughly confused. The vehicles were filmed stuck at numerous intersections, unsure how to proceed, adding to the turmoil on the roads as human drivers slowly inched past darkened city blocks.
Related: Waymo is back online in San Francisco, but may struggle after failure
After consistently declining for 30 years, roadway fatalities in the U.S. have risen over the past decade.
Autonomous vehicles are supposed to help solve the problem of accidents and roadway fatalities.
“Waymo is already improving road safety in the cities where we operate, achieving more than a tenfold reduction in serious injury or worse crashes,” Trent Victor, Waymo’s director of safety research and best practices, recently told Bloomberg.
But the data suggest a more complicated reality.
Waymo has driven approximately 127 million miles across its fleet and has been involved in at least two fatal crashes, according to recent coverage by MSN, though its vehicles were not found directly responsible for either.
The problem is that this works out to a higher death-per-mile rate than that of average American drivers, who travel about 123 million miles for every fatality, per the Insurance Institute for Highway Safety.
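Using only the figures cited above, the comparison works out as follows. This is a back-of-the-envelope sketch, not an official analysis: the mileage and fatality counts are those reported in the article, and the math ignores differences in where and when the miles were driven.

```python
# Rough fatality-rate comparison using the figures cited in the article.
WAYMO_MILES = 127_000_000               # fleet miles driven, per MSN coverage
WAYMO_FATAL_CRASHES = 2                 # crashes involving fatalities
HUMAN_MILES_PER_FATALITY = 123_000_000  # per the IIHS figure cited above

# Miles driven per fatality for Waymo's fleet
waymo_miles_per_fatality = WAYMO_MILES / WAYMO_FATAL_CRASHES  # 63.5 million

# Fatalities per 100 million miles, a common unit for road-safety statistics
waymo_rate = WAYMO_FATAL_CRASHES / WAYMO_MILES * 100_000_000  # ~1.57
human_rate = 1 / HUMAN_MILES_PER_FATALITY * 100_000_000       # ~0.81

print(f"Waymo: ~{waymo_miles_per_fatality / 1e6:.1f}M miles per fatality "
      f"({waymo_rate:.2f} per 100M miles)")
print(f"Humans: ~{HUMAN_MILES_PER_FATALITY / 1e6:.0f}M miles per fatality "
      f"({human_rate:.2f} per 100M miles)")
```

By these numbers, Waymo logs roughly one fatal crash per 63.5 million miles versus one per 123 million for human drivers, which is the basis for the "higher death-per-mile rate" claim above.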
But even that statistic is not relevant, according to Austin-based transportation attorney Tray Gober.
“What matters is not the accident rate per million miles driven, but instances like passing a school bus that’s stopped or driving in inclement weather. Human drivers encounter edge cases all the time and they have to be prepared,” Gober told TheStreet.
“Driverless rideshare companies are deploying vehicles that aren’t ready to avoid basic hazards and making the public be guinea pigs for corporations trying to get greater market share. It seems they are making a calculated risk that maybe they hit a kid but they’re developing the technology and getting greater market share, and that’s just a cost of doing business for them,” he said.
Related: Tesla hits huge Robotaxi milestone, but questions in Austin remain