Waymo Issues Recall After Incidents With School Buses

This week, Waymo, Alphabet’s self-driving taxi unit, announced a voluntary software recall after local school districts and federal regulators flagged multiple incidents in which Waymo vehicles reportedly moved past stopped school buses with stop arms and flashing red lights deployed. The incidents, concentrated in the Austin, Texas area, prompted a widening probe by the National Highway Traffic Safety Administration (NHTSA) and public concern about how autonomous vehicles (AVs) detect and respond to vulnerable road users such as schoolchildren.
“Autonomous vehicles promise huge safety gains, but when systems misread legally required safety devices, like a school bus stop arm, the consequences can be catastrophic. Families and communities deserve transparency, rapid fixes, and accountability. When a child is harmed or put at risk, it’s essential to gather the facts, preserve evidence, and seek experienced legal counsel to make sure corporate recalls lead to real, verifiable change.” — John J. Malm, Naperville car accident attorney
What the Recall Covers and Why Waymo Acted
Waymo’s recall is a software-level remedy intended to correct how its automated driving system identifies and reacts to stopped school buses and their deployed safety devices. Waymo said it would issue the recall after the Austin Independent School District documented nearly 20 incidents since the start of the school year in which Waymo vehicles allegedly passed stopped school buses. The company previously pushed an update in November to address an earlier, related issue; however, independent footage and school-district stop-arm cameras showed incidents continued afterward, prompting the voluntary recall and increased federal scrutiny.
Key facts at a glance:
- Austin ISD recorded about 19 separate incidents in which Waymo vehicles passed stopped school buses with stop arms deployed.
- NHTSA opened a preliminary evaluation into Waymo after initial reports and has requested detailed information and timelines from the company.
- Waymo attributed the problem to a software behavior that sometimes caused vehicles to initially slow or stop for a bus and then proceed when they should have remained stopped. Because Waymo owns and manages the affected fleet, the voluntary recall will be carried out through a fleet-wide software update.
How Serious Were the Incidents?
The recorded incidents include alarming close calls. In some videos provided by Austin ISD, a Waymo vehicle is seen proceeding past a stopped bus just moments after a student crossed in front of the bus, a near-miss that underscores the stakes when AVs misinterpret roadside safety devices or pedestrian movements. School officials say multiple incidents occurred after Waymo’s November software update, which raises questions about how fixes are validated and deployed. NHTSA has set deadlines for Waymo to explain the events and the company’s mitigation steps.
Autonomous Vehicle Safety
Autonomous vehicle issues, software updates, and recalls are not new. Waymo previously issued a large software recall affecting more than 1,200 Gen-5 vehicles in 2025 to address other operational issues, and the company says it routinely updates its fleet with safety-critical software releases. Still, incidents involving school buses attract special scrutiny because stopped buses are a high-risk scenario for children and because state laws (including in Illinois and Texas) require approaching drivers to stop for buses when stop arms are extended. Regulatory scrutiny tends to intensify when repeated events suggest systemic behavior rather than isolated edge cases.
Why School Buses Present a Challenging Environment for Self-Driving Vehicles
School bus interactions are complex for both human drivers and automated systems:
- Buses deploy mechanical stop arms that extend into traffic and activate flashing red lights, signals intended to communicate a legal obligation to stop, but which vary in exact placement and timing.
- Children often disembark and cross directly in front of buses, creating unpredictable pedestrian trajectories and short reaction windows.
- Buses can stop in different lane configurations (e.g., multiple-lane roads, center turn lanes), and the roadside environment (parked cars, trees, signage) can obscure visibility.
- AV perception systems must fuse camera, lidar, radar, and map data, and a subtle sensor misclassification or logic rule can cause the system to misjudge the legal or safe response.
These technical and situational complexities mean rigorous testing around school zones is essential before full deployment, and they explain why regulators and school districts react strongly when AV behavior appears to endanger children. The simplified sketch below illustrates the kind of decision logic at stake.
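To make the failure mode concrete, here is a minimal, hypothetical sketch of the kind of rule an AV planner might apply when it detects a stopped school bus. Every name and threshold in it (the BusObservation type, the confidence cutoffs, the hold-until-clear latch) is illustrative only and is not based on Waymo’s actual software:

```python
from dataclasses import dataclass

# Hypothetical, simplified sketch of school-bus stop logic in an AV planner.
# None of these names or thresholds come from any real AV system; they only
# illustrate why "slow, then proceed" is the dangerous failure mode.

@dataclass
class BusObservation:
    stop_arm_deployed: bool      # fused camera/lidar detection of the stop arm
    red_lights_flashing: bool    # output of a flashing-red-light classifier
    confidence: float            # perception confidence in this reading, 0.0-1.0
    pedestrians_near_bus: bool   # any tracked pedestrian inside a safety zone

def must_remain_stopped(obs: BusObservation, already_stopped: bool) -> bool:
    """Return True if the vehicle must stop (or stay stopped) for the bus."""
    legally_active = obs.stop_arm_deployed or obs.red_lights_flashing
    # Stop on even moderately confident detections of a deployed stop arm.
    if legally_active and obs.confidence >= 0.5:
        return True
    # Latch: once stopped for a bus, do not proceed on a momentary dropout of
    # the detection. Require high confidence that the signals are retracted
    # AND that no pedestrians remain near the bus before releasing the stop.
    if already_stopped:
        signals_clearly_off = (not legally_active) and obs.confidence >= 0.9
        return not (signals_clearly_off and not obs.pedestrians_near_bus)
    return False
```

The design point worth noticing is the latch: a planner that re-evaluates a noisy detection every cycle can stop for a bus and then “change its mind” and proceed, which mirrors the slow-then-proceed behavior Waymo described. A conservative rule holds the stop until the legal signals are clearly gone and no children remain nearby.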
Safety Tradeoffs and the Importance of Transparency
Autonomous vehicle companies emphasize safety-by-design and say they continually update software to address edge cases discovered in deployment. Yet the school bus incidents highlight a central tension:
- AVs can reduce many types of human error (distraction, impairment), but software mistakes or misinterpretations of legal devices can create new kinds of risk.
- AV operators often own and manage their fleets, which lets them push updates centrally. That is a safety advantage, but it also places a high burden on the operator to validate fixes rigorously and report problems transparently when they arise.
- Local authorities and school districts rightly demand transparency about incidents that affect children and expect immediate corrective action, including temporary operational limits near schools when problems cannot be quickly fixed.
Legal and Insurance Implications After an Accident With an Autonomous Vehicle
Accidents involving autonomous vehicles can raise complex liability and evidence issues:

- Who is the “driver”? When a vehicle is operating under automated control, liability can involve the vehicle operator (the AV company), manufacturers, the vehicle owner, or traditional drivers if one was present and tasked with monitoring.
- Data is central. EDR/black-box recordings, sensor logs, map files, and software version histories are key pieces of evidence and are often controlled by the AV operator. Preserving and accessing those records quickly is essential.
- Recall and notice. A voluntary recall may indicate an acknowledgement of a safety defect. Such an acknowledgement can be relevant in civil claims, though its legal effect depends on the jurisdiction and the specifics of the recall.
- Insurance coverage. Insurers and self-insured operators may dispute causation and allocation of fault; specialized representation helps victims navigate technical evidence and negotiate for fair recovery.
If you or a loved one was injured in an interaction with a self-driving vehicle, especially in or around a school zone, it’s important to preserve evidence and consult counsel experienced with AV litigation and complex product liability claims.
Frequently Asked Questions about the Waymo Recall
Q: Did Waymo admit the vehicles were at fault?
A: Waymo acknowledged a software behavior issue and announced a voluntary software recall to push a fix fleet-wide. A recall is a corrective action but does not, by itself, constitute an admission of legal fault; however, it is compelling evidence that the company recognized a safety problem requiring correction.
Q: How many incidents were recorded?
A: Austin Independent School District documented about 19 incidents in which Waymo vehicles allegedly passed stopped school buses since the start of the school year. Some incidents were recorded after Waymo said it had already deployed a software fix in November.
Q: Is NHTSA investigating?
A: Yes. NHTSA launched a preliminary evaluation after earlier reports and has requested detailed information and timelines from Waymo. The agency’s involvement typically precedes more formal action if it finds systemic safety concerns.
Q: Could this lead to prosecutions or fines?
A: Possibly. If investigations uncover willful non-compliance with safety laws or if local statutes (for example, state laws requiring stopping for school buses) were violated and caused harm, regulatory penalties or civil liability may follow. Criminal prosecution is uncommon but could be considered in egregious cases involving deliberate wrongdoing or gross negligence.
Q: What should I do if my child was struck or nearly struck in such an incident?
A: Seek immediate medical care for the child, document the scene and any available video, report the incident to local police and the school district, preserve any communications from the AV operator, and contact an attorney experienced in vehicle collisions and product liability. Early preservation of sensor and EDR data is critical.
Contact the Dedicated Illinois Car Accident Injury Lawyers at John J. Malm & Associates
If you or a loved one has been injured in an accident involving a partially automated vehicle or a fully self-driving car, the top-notch Illinois car accident attorneys at John J. Malm & Associates can help investigate the incident, preserve crucial electronic evidence, and pursue compensation for medical care, lost wages, pain and suffering, and other damages.
Contact our office for a free, confidential consultation. We are prepared to help families navigate these technically complex cases, coordinate with experts, and pursue whatever remedies are available to hold responsible parties accountable. Let us help you get the justice you deserve.