
Tesla FSD ignores school bus lights and hits 'child' dummy in staged demo

(2025/05/31)


Video Tesla has been testing self-driving Model Ys on the streets of Austin, Texas. But according to the automaker's bête noire, the Dawn Project, kids should keep clear.

The self-proclaimed self-driving gadfly [1] posted a video this week showing a staged event in which a Tesla running the latest version of its self-driving software illegally overtook a school bus with its red lights flashing (indicating it is dropping off passengers) and then slammed into a child-sized mannequin emerging from the sidewalk. The car identified the pedestrian, but didn't stop after hitting it and carried on down the road.

To be fair to Tesla, even a human driver would have struggled to stop in time to avoid the dummy, although one would hope that a human driver would have noticed the lights and stopped, as legally required. However, the software running the car - FSD (Supervised) 13.2.9 - did not. (FSD originally stood for "full self-driving," but as the parenthetical now emphasizes, Tesla's own documentation has always said that the driver should be supervising at all times.)
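
(Back-of-envelope, with assumed rather than reported figures: at 25 mph, roughly 11 m/s, a driver with a 1.5-second reaction time covers about 17 meters before braking even begins, and hard braking at around 7 m/s² adds another 9 meters or so - call it 26 meters, some 85 feet, to come to a halt. A dummy stepping out much closer than that gets hit no matter who, or what, is driving; the real failure is not stopping for the bus in the first place.)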

The Dawn Project has its own agenda. It was founded by software entrepreneur Dan O'Dowd to highlight the gap between Tesla's self-driving claims and reality. O'Dowd, who owns several Teslas himself - including original Roadster models - has been raising concerns about the software for years.

"It's been two and a half or three years since we pointed this out in a Super Bowl advert," O'Dowd told The Register . "They just don't care. They have other priorities, like a robotaxi working on Austin streets, that is a priority. Elon sets priorities, and he's never made safety a priority."

[5] YouTube video

The staged demo was not the first time Tesla has had problems with school bus lights. In 2023, the US National Highway Traffic Safety Administration [7] reportedly investigated an incident in which student Tillman Mitchell was struck by a Tesla Model Y after exiting a stopped school bus with its red warning lights flashing. The driver was allegedly using Tesla's earlier driver-assist system, Autopilot, and had affixed weights to the steering wheel to fool the hands-on detection, according to The Washington Post.

Musk [8] has said that his car business is already testing driverless Model Y cars in Austin "a month ahead of schedule." That said, his deadlines tend to be fairly flexible - he's aiming to land an uncrewed Starship on Mars by the end of 2026, but so far the rocket continues to [9] fail key test flights.

[10] Self-driving car maker Musk's DOGE rocks up at self-driving car watchdog, cuts staff

[11] Tesla Cybertruck recall #8: Exterior trim peels itself off, again

[12] Tesla fudged odometer to screw me out of warranty, Model Y owner claims

[13] UK-based self-driving car startup Wayve heads to Japan for more driving data

Google offshoot Waymo already runs a self-driving taxi service in Austin, and has operated a similar service in San Francisco for over a year. The difference, according to O'Dowd, is that Waymo vehicles use expensive Light Detection and Ranging (LiDAR) sensors, while Tesla eschews them in favor of cheaper vision-based hardware and relies on software to fill in the gaps.

Self-driving tech is difficult - just ask Cruise, the GM-backed robocar business that burned through billions trying to make the technology work before [14] throwing in the towel last year after a well-publicized incident in which a Cruise car dragged a pedestrian who had been knocked down by another vehicle.

"There's a million people who die in car accidents every year," said O'Dowd. "And if we get everybody on quality software that's better than human drivers, like Waymo, we will save hundreds of thousands of lives per year. That's absolutely true. But it won't be Elon. It won't be Tesla."

Tesla had no comment at the time of going to press. ®




[1] https://x.com/RealDawnProject/status/1927556338772496514

[5] https://www.youtube.com/watch?v=_ZiSZbWIrzA

[7] https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/

[8] https://x.com/elonmusk/status/1927970940874354941

[9] https://www.theregister.com/2025/05/28/starship_flight_9_crash/

[10] https://www.theregister.com/2025/04/11/doge_nhtsa_audit/

[11] https://www.theregister.com/2025/03/20/tesla_cybertruck_recall/

[12] https://www.theregister.com/2025/04/17/tesla_faked_odometer_data/

[13] https://www.theregister.com/2025/04/22/wayve_heads_to_japan_for/

[14] https://www.theregister.com/2024/12/11/cruise_gm_shutdown/




Ace2

My Tesla does not know what the speed limit is. It cannot understand school zones or variable-speed construction zones. FSD should be banned.

tfewster

That's weird, because my 8-year-old cheapo Kia is quite good at recognising speed signs and warning the pilot (me).

cornetman

> "The car identified the pedestrian, but didn't stop after hitting it and carried on down the road."

The article does mention the use of cameras as opposed to LIDAR as a factor, but it also says that the pedestrian was correctly identified, yet mown down regardless.

I wonder how many actual incidents are caused by the use of cameras rather than by software glitches.

cornetman

> "And if we get everybody on quality software that's better than human drivers, like Waymo, we will save hundreds of thousands of lives per year. That's absolutely true. But it won't be Elon. It won't be Tesla."

In fairness, the self-driving features are already statistically better than average humans at regular driving tasks, so if all cars were using Tesla's tech, lives would already be saved. That's partly because a lot of human drivers are pretty inattentive and sometimes reckless.

The big problem is that it fails in incomprehensible and strange ways that make us distrust it, and rightly so. I live on an island where we also have the yellow school buses around in the morning and late afternoon. Meatbag drivers also sometimes do not stop when a bus is flashing its lights and has the STOP signs out. Tesla's self-driving systems might be flawed, but people are similarly stupid.

For those outside the Americas who are maybe not aware: traffic must stop *on both sides of the road*, and not pass the bus, when it shows the red lights and sign, so that kids can safely cross the road.

Statistics

Anonymous Coward

Yeah, statistically Tesla FSD is already safer than human drivers, but that's mostly because they cherry-pick the data. They're comparing overall accident rates for people against FSD miles, which are overwhelmingly highway driving; their safety record wouldn't look so good against highway-only numbers for humans.

Also worth calling attention to one regulator's demand for statistics on how many accidents happen within a few seconds of Autopilot automatically disengaging because it has realized it has no idea what it's doing. I own one of these and can tell you it will sometimes do that with zero warning, so if you weren't following directions and keeping your hands on the wheel, you'd probably get into an accident. That isn't currently counted against FSD in the safety stats.
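
To make the cherry-picking concrete with purely hypothetical numbers: say humans average 2 crashes per million highway miles and 6 per million city miles. Pool those 50/50 and the "human" rate comes out at (2 + 6) / 2 = 4 per million miles. An FSD fleet doing nearly all of its miles on highways could then post 3 per million and look "safer than humans" overall, while actually being worse than humans on the highway miles it drives.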

It is problematic.

Tron

You don't just have to manage the tech, you have to manage expectations and the media.

Thousands die in road accidents, but one accident with a driverless car could end it as a technology, which is daft. We shouldn't have an expectation of perfection. We need to do the sums and be happy if the body count is lower. Ultimately, the tech is a success if it kills fewer people, not a failure if it kills anyone.

I think that driverless may be a tech step too far, given the number of variables involved and the public/media response to any death. It may be better to have tech assists that reduce accidents. Controlled driverless zones on motorways might then be implemented to give drivers a break.

Let’s be honest

Anonymous Coward

Tesla FSD is just a (dangerous) gimmick.

Re: Let’s be honest

Anonymous Coward

It's just Americans killing Americans, trying very hard to equal their 1860s bloodfest in a gruesome version of manifest destiny.

This being America, I can foresee the whole thing coming undone in the courts when FSD shows a preference in its victims for a particular racial or ethnic group. Adds a whole new meaning to D.I.E.

"Well, if you can't believe what you read in a comic book, what *can*
you believe?!"
-- Bullwinkle J. Moose