News: 0180748664


Waymo is Having a Hard Time Stopping For School Buses (theverge.com)

(Saturday February 07, 2026 @05:01AM (msmash) from the patch-didn't-patch-it dept.)


Waymo's robotaxis have [1]racked up at least 24 safety violations involving school buses in Austin since the start of the 2025 school year, and a voluntary software recall the company issued in December after a federal investigation has not fixed the problem.

Austin Independent School District initially reported at least 19 incidents of Waymo vehicles failing to stop for buses during loading and unloading -- illegal in all 50 states -- prompting NHTSA to open a probe. At least four more violations have occurred since the software update, including a January 19th incident where a robotaxi drove past a bus as children waited to cross the street and the stop arm was extended.

Waymo also acknowledged that one of its vehicles [2]struck a child outside a Santa Monica elementary school on January 23rd, causing minor injuries. Austin ISD has asked Waymo to stop operating near schools during bus hours until the issue is resolved. Waymo refused. Three federal investigations have been opened in three months.



[1] https://www.theverge.com/transportation/874385/waymo-school-bus-austin-safety-robotaxi

[2] https://tech.slashdot.org/story/26/01/29/151223/waymo-robotaxi-hits-a-child-near-an-elementary-school-in-santa-monica



Kind of weird (Score:2)

by phantomfive ( 622387 )

I would expect "stopping for school buses" would be an obvious and easy situation.

Re: Kind of weird (Score:5, Insightful)

by Mr. Dollar Ton ( 5495648 )

It is obvious if you understand the concept of driving instead of mimicking it statistically with some probability.

A simple difference that the "AI" proponents and "investors" can't seem to grasp or acknowledge.

Re: (Score:2)

by phantomfive ( 622387 )

Someone should have thought (and I'm sure they did), "we need to train this to handle school buses and cone zones," and then included those scenarios in the training data.

There are definitely edge cases that are difficult to predict for self-driving vehicles; this isn't one of them.

Re: Kind of weird (Score:5, Insightful)

by Mr. Dollar Ton ( 5495648 )

I'm sure someone thought of it. What's obvious from the failures is that model training isn't a substitute for understanding, which the model lacks. So it will always have a nonzero chance to fuck up an obvious situation, which is what we mostly deal with.

Of course you'll have people arguing it isn't different with people on account of the outcome (people are slower, get tired, etc.), but the fundamental difference is the understanding, and the model doesn't have it.

Hence Agrdaaeelbal instead of America.

Re: (Score:2)

by phantomfive ( 622387 )

> What's obvious from the failures is that model training isn't a substitute for understanding, which the model is lacking.

Even if the model doesn't have understanding, it can be trained to stop for school buses.

Re: Kind of weird (Score:1)

by Mr. Dollar Ton ( 5495648 )

No, you cannot expect determinism from a fully probabilistic model. Someone must hardcode the laws of robotics in those positronic brainz, and we know that even that won't work.
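A minimal Python sketch of the idea being argued over here: a deterministic, hand-written stop rule layered on top of whatever a learned policy proposes. This is not Waymo's actual architecture; every class and function name below is hypothetical, and, as noted further down the thread, such an override is only as reliable as the perception signals feeding it.

    # Minimal sketch (not Waymo's design) of a hard-coded safety rule wrapping
    # a learned driving policy. All names here are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Perception:
        school_bus_ahead: bool     # output of an upstream (probabilistic) detector
        stop_arm_extended: bool
        red_lights_flashing: bool

    def learned_policy(p: Perception) -> str:
        """Stand-in for the trained model; it may propose anything."""
        return "proceed"

    def safe_action(p: Perception) -> str:
        """Deterministic override: if a bus is loading or unloading,
        the only legal action is to stop, regardless of the model."""
        if p.school_bus_ahead and (p.stop_arm_extended or p.red_lights_flashing):
            return "stop"
        return learned_policy(p)

    if __name__ == "__main__":
        print(safe_action(Perception(True, True, False)))    # -> stop
        print(safe_action(Perception(False, False, False)))  # -> proceed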

Re: (Score:1)

by phantomfive ( 622387 )

If it's on a computer, it's fully deterministic (unless someone installed a hardware RNG).

Re: Kind of weird (Score:2)

by Mr. Dollar Ton ( 5495648 )

The model inputs (road conditions, perceived bus color, etc.) are always random.

Re: (Score:2)

by Mr. Dollar Ton ( 5495648 )

Yes, they could, but at some point it will get way more expensive than a driver just because of all that hardware, while the original idea was, I presume, to replace drivers for less. No idea how much has been invested to date or how the costs compare to the results achieved. I would guess they are still way more expensive than a driver.

Re: (Score:2)

by TuringTest ( 533084 )

> If it's on a computer, it's fully deterministic (unless someone installed a hardware RNG).

If you believe that computers are fully deterministic, I have a PC I want to sell you...

Theoretical computers may be fully deterministic, but physical computational machines are made of electrical signals running on rare earth semiconductors, and with AI we have complex statistical chaotic interactions on top.

Any small unpredictable perturbation at any layer may swing the whole system in a whole new direction. Hardly what we'd call deterministic (unless you believe the whole universe is deterministic, in wh

Re: (Score:2)

by aRTeeNLCH ( 6256058 )

Even though your post is rightfully modded up to 5, the fact of the matter is that many people also lack understanding, and the world would be a safer place if the human factor were taken out of driving entirely.

I do disagree that there's a larger nonzero chance to duck up an obvious situation for a machine than for a human driver. Perhaps the chance is larger for non-obvious cases, but that will get fixed.

Also, you seem to assume that self driving is largely straight out of a model, like LLMs hallucin

Re: (Score:2)

by Mr. Dollar Ton ( 5495648 )

> fact of the matter is that many people also lack understanding,

Yes, and many among those who do understand, ignore the rules. Hence there is responsibility to face if one is guilty of such behavior.

> I do disagree that there's a larger non zero chance to duck up an obvious situation for a machine than for a human driver.

Which isn't something I'm saying above. It is a mixture of factors; the (lack of) understanding of the rules, however, is the cause of most of these "uncanny" problems.

> Also, you seem to assume that self driving is largely straight out of a model,

Apparently not; it seems it may just be a case of Filipino drivers simply not knowing the US driving rules in detail :)

[1]https://www.newsweek.com/waymo... [newsweek.com]

[1] https://www.newsweek.com/waymo-reveals-remote-workers-in-philippines-help-guide-its-driverless-cars-11478439

Re: (Score:2)

by CaptQuark ( 2706165 )

Actually, it is not exactly the same in all 50 states for multi-lane roads. The linked article states that the Waymo vehicle was filmed breezing through the opposite lane of traffic. The laws on opposing traffic vary depending on the state.

For a multi-lane road with only a turn lane separating the opposing traffic, Texas law requires opposing traffic to stop. [1]Texas school bus laws [liggettlawgroup.com]

But for Washington state, Missouri, South Carolina, and a few others, a turn lane is enough separation to allow opposing

[1] https://liggettlawgroup.com/blog/car-accidents/school-bus-laws-in-texas-do-you-know-when-to-stop-or-pass/

Re: (Score:1)

by phantomfive ( 622387 )

I expect Waymo to follow basic driving laws. That's not an edge case.

Re: (Score:2)

by martin-boundary ( 547041 )

The robotaxis are effectively a single driver.

I would expect that if a single driver racked up 25 safety violations involving school buses, their driver's license would be suspended.

Re: (Score:2)

by OpenSourced ( 323149 )

Well, it is if you think of the driving software as a deterministic machine, as we are used to. If you have a toy truck, you can make it go where you want, but try that with a cat. The driving software is much nearer to the cat than to the toy truck.

context (Score:1)

by phantomfive ( 622387 )

> Austin ISD has asked Waymo to stop operating near schools during bus hours until the issue is resolved. Waymo refused.

I would like to see the context behind why Waymo refused this request, prima facie it seems like a reasonable request.

Re: (Score:2)

by 93 Escort Wagon ( 326346 )

I'd say it's time to get the lawyers involved.

Re: context (Score:2)

by Mr. Dollar Ton ( 5495648 )

They obviously need more statistics to retrain the failing models. Knocking down a few kids in the process isn't a large cost; the investors will cover the damages.

evolution (Score:2)

by ishmaelflood ( 643277 )

By killing the slow unobservant kids Waymo is improving the human race, ready for the war against AI.

Re: evolution (Score:2)

by Mr. Dollar Ton ( 5495648 )

Yeah, we'll have to evolve to deal with the "AI" bullshit. I guess we're too dumb to do it from first principles.

Re: context (Score:2)

by commodore73 ( 967172 )

I'm curious whether Waymo's refusal could affect its future liability, for example by escalating something like negligent manslaughter to a more serious charge. And then we get to the question of who is actually liable when the machines they control damage things.

All 50 states... but differently (Score:2)

by dgatwood ( 11270 )

> ...failing to stop for buses during loading and unloading -- illegal in all 50 states

Well, maybe illegal or not, depending on the circumstances and on the state (a rough rule sketch follows the list):

Not necessary to stop if the bus is in a loading zone that is completely off the road if crossing the road is not permitted.

Not necessary to stop in the opposite direction if the road has 4 or more lanes.

Not necessary to stop in the opposite direction if the road has a barrier between travel directions.

Not necessary to stop in the opposite direction if the road has a median.

Not necessary to stop in the opposite direction
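A rough Python sketch of what encoding those exceptions explicitly might look like. The field names are hypothetical, and the state-specific details are illustrative only, taken from the list above and the Texas/Washington discussion earlier in the thread; the actual statutes vary and should be checked per state.

    # Illustrative only: encodes the exceptions listed above as explicit rules.
    # Not legal advice; actual statutes differ from state to state.

    from dataclasses import dataclass

    @dataclass
    class BusScene:
        state: str                 # two-letter state code, e.g. "TX"
        opposite_direction: bool   # approaching the bus from the opposite side
        lanes: int                 # total travel lanes on the road
        divided: bool              # physical barrier or median between directions
        turn_lane_only: bool       # only a center turn lane separates directions
        bus_off_roadway: bool      # bus in a loading zone fully off the road

    def must_stop(s: BusScene) -> bool:
        if s.bus_off_roadway:
            return False               # off-road loading zone (assuming no crossing)
        if not s.opposite_direction:
            return True                # same direction: always stop
        if s.divided:
            return False               # barrier or median between directions
        if s.turn_lane_only:
            # Texas still requires opposing traffic to stop; WA, MO, SC and a
            # few others treat a turn lane as sufficient separation.
            return s.state == "TX"
        if s.lanes >= 4:
            return False               # four-or-more-lane exception in some states
        return True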

Re: (Score:2)

by rta ( 559125 )

I've always found these rules poorly calibrated and overly conservative.

Basically based on moral panic about kids rather than logic about traffic laws and sharing the road fairly.

But there are no school buses where I currently live, so I haven't been annoyed by it for a while.

(tbh idk WHY there aren't buses or how kids get to school here.)

Re: (Score:1)

by phantomfive ( 622387 )

If Waymo is operating in a state, they can follow the laws of the state. That's the easy part of self-driving.

Re: (Score:2)

by phantomfive ( 622387 )

> Our traffic laws really do need to be standardized at the national level

That will be great, then we can have supreme court judge manipulation and national elections based on whether right turn on red should be legal or not. Give me passing on the right or give me death! Make America great again, get rid of the roundabouts! Speed up to get through yellow lights is change we can believe in!

Re: (Score:2)

by dgatwood ( 11270 )

>> Our traffic laws really do need to be standardized at the national level

> That will be great, then we can have supreme court judge manipulation and national elections based on whether right turn on red should be legal or not. Give me passing on the right or give me death! Make America great again, get rid of the roundabouts! Speed up to get through yellow lights is change we can believe in!

There's something pretty messed up about the ignorance of the law being no excuse when the U.S. legal system is such a nightmarishly complex mess. For the most part, we all pretty much assume that if we're not doing something obviously wrong, we'll be okay, and that's usually roughly good enough, but traffic is a big exception.

Whether right turns on red are allowed or banned (and whether signs are posted saying so), whether u-turns are allowed or denied by default, whether lane splitting by motorcycles is

Re: (Score:2)

by phantomfive ( 622387 )

Wait until you find out that cities can make their own traffic laws, too.

Re: (Score:2)

by NotEmmanuelGoldstein ( 6423622 )

Roundabouts were an American invention to make vehicle movement consistent (i.e., all vehicles moving in the same direction) and thus safer. The British give-way rule made all intersections safer again. Roundabouts are used where there are a large number of exits, or where vehicle density is very uneven over a work day, so that vehicles would otherwise be unnecessarily stopped by traffic lights for most of the day.

Re: (Score:2)

by phantomfive ( 622387 )

Oh, you did it now. Now I'm reaching for the big rhetorical guns. If you like roundabouts, you are literally Hitler! Remember, you made me do it.

Not just Austin (Score:3)

by sjames ( 1099 )

Same problem in the Atlanta area.

There was also an incident where a Waymo got confused on the interstate, which it is forbidden even to be on.

Sounds like a payout waiting... (Score:2)

by Archfeld ( 6757 )

I can't believe someone hasn't thrown themselves onto the fender or hood of a Waymo and sued. I can't see a judge or jury finding in Waymo's favor...

WEeelll here's your problem- (Score:2)

by locater16 ( 2326718 )

-the child mode was set to "kill", see you just set it to "do not kill" and you're all good here now.

I would suspect (Score:2)

by Randseed ( 132501 )

I would suspect that the Waymo would just weave in and out of traffic while accelerating and laying on the horn with a loudspeaker screaming "Move over and get off my lawn!"
