

Xiaomi EV Involved in First Fatal Autopilot Crash (yahoo.com)

(Tuesday April 01, 2025 @11:30AM (BeauHD) from the software-fix-rolling-out dept.)


An anonymous reader quotes a report from Reuters:

> China's Xiaomi said on Tuesday that it was actively cooperating with police after a [1]fatal accident involving an SU7 electric vehicle on March 29 and that it had handed over driving and system data. The incident marks the first major accident involving the SU7 sedan, which Xiaomi launched in March last year and which since December has outsold Tesla's Model 3 on a monthly basis. Xiaomi's shares, which had risen by 34.8% year to date, closed down 5.5% on Wednesday, underperforming a 0.2% gain in the Hang Seng Tech index. Xiaomi did not disclose the number of casualties but said initial information showed the car was in the Navigate on Autopilot intelligent-assisted driving mode before the accident and was moving at 116 kph (72 mph).

>

> A driver inside the car took over and tried to slow it down, but the car then collided with a cement pole at a speed of 97 kph, Xiaomi said. The accident in Tongling, in the eastern Chinese province of Anhui, killed the driver and two passengers, Chinese financial publication Caixin reported on Tuesday, citing friends of the victims. In a rundown of the data submitted to local police, posted on a company Weibo account, Xiaomi said NOA issued a risk warning of obstacles ahead, and the subsequent takeover happened only seconds before the collision. Local media reported that the car caught fire after the collision; Xiaomi did not mention the fire in its statement.

The report notes that the car was a "so-called standard version of the SU7, which has the less-advanced smart driving technology without LiDAR."



[1] https://www.yahoo.com/news/chinas-xiaomi-says-actively-cooperating-052114932.html



Refreshing (Score:3)

by AmiMoJo ( 196126 )

It's refreshing for a company to be this open and honest about what happened. When Tesla autopilot kills someone they will only ever say that it disengaged, probably milliseconds before the accident, and not give any further details.

Re: (Score:2, Insightful)

by geekmux ( 1040042 )

> It's refreshing for a company to be this open and honest about what happened. When Tesla autopilot kills someone they will only ever say that it disengaged, probably milliseconds before the accident, and not give any further details.

> The report notes that the car was a "so-called standard version of the SU7, which has the less-advanced smart driving technology without LiDAR."

Sorry to burst your biased bubble here, but that isn't exactly a "refreshing" corporate response. It's a disgusting excuse, implying that the customers might have been saved if only they hadn't been so "cheap" and had upgraded beyond the "standard" version.

Re: (Score:2)

by AmiMoJo ( 196126 )

The summary seems to have misled you. Xiaomi did not blame the customer; that was the journalist noting that the model of car they had was the one without lidar.

Probably because Tesla is a vision-only system and has similar issues.

Re: (Score:2)

by geekmux ( 1040042 )

> The summary seems to have mislead you. Xiaomi did not blame the customer, that was the journalist noting that the model of car they had was one without lidar.

> Probably because Tesla is a vision-only system and has similar issues.

If the company itself has created and offered different levels of autonomous solutions, then you will find that Xiaomi will eventually blame the customer.

As nothing more than a legal defense.

Watch, and see.

Re: (Score:2)

by dstwins ( 167742 )

Actually it does make sense..

A lot of the "cheaper" versions are really just "advanced" cruise control and should not be used for auto driving. Of course it's not marketed that way, but it's an important distinction, because it's the computer version of saying "I can't find my glasses" while driving down the highway. In fact, I wish cars would just say that (like my mother used to), since that tells YOU to be alert and watch the road.

Re: (Score:2)

by geekmux ( 1040042 )

> Actually it does make sense.. A lot of the "cheaper" versions are really just "advanced" cruise control and should not be used for auto driving. Of course it's not marketed that way, but it's an important distinction, because it's the computer version of saying "I can't find my glasses" while driving down the highway. In fact, I wish cars would just say that (like my mother used to), since that tells YOU to be alert and watch the road.

We will ultimately find (through litigation) that humans are far too stupid to understand ANY marketing behind "auto" drive/pilot/cruise-anything, and courts will force companies to stop offering assisted solutions of ANY kind until autonomous solutions are good enough to not even require a licensed driver behind the wheel.

Naturally, when that happens, human override won't even be an available option anymore. Humans get what they deserve.

Re: (Score:2)

by kriston ( 7886 )

My Honda's optional lane-keeping feature is pretty good. So is the on-by-default road and lane departure features along with automatic emergency braking. Adaptive cruise control is a godsend. Parking proximity warnings and backup cross-traffic alerts are very useful.

Are you saying these features should all be deleted?

Re: (Score:2)

by RockDoctor ( 15477 )

I don't understand this (EN_US - ?) idiom.

> the computer version of "I can't find my glasses" (while driving down the highway)

What does it mean?

I may be biased having worn glasses since before puberty, got used to putting them on in the same set of actions as turning off the alarm clock and disentangling the other arm from under the wife, and almost never seeing the world except through a set of glasses.

(To forestall some comments: yes, I have tried contact lenses - see "almost never" not "never" above. Yes,

Re: (Score:3)

by Firethorn ( 177587 )

I've seen further details out of Tesla a number of times, in more detail than this even. But it does take time for them to do the analysis, and by then the news stations usually don't pick it up because it's no longer fresh news.

Re: (Score:2)

by Dan East ( 318230 )

> It's refreshing for a company to be this open and honest about what happened.

We are talking about a Chinese state-owned company here, right? And you really believe they are being open and honest? Exactly what would a communist-owned company gain by being open and honest?

How do you know there haven't been thousands of other people killed due to this autopilot failing in China so far?

Re: (Score:2)

by SuperDre ( 982372 )

Xiaomi hasn't really given any more information about what happened, just as Tesla also never really does.

Humans can't take over (Score:5, Interesting)

by drinkypoo ( 153816 )

The whole point of having a car "drive itself" is that you aren't doing it.

Expecting someone to go from not doing it at all to doing it at full highway speed immediately is bananas.

The correct action is usually to lean back and nail the brakes, and let ABS and crumple zones do what they will. In "a few seconds" the vehicle could ostensibly have done that for them, and the outcome would have been better. The vehicle could have shed far more speed in three seconds if it had chosen to brake instead of shutting off and leaving the human in charge.
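To put rough numbers on that, here's a hedged back-of-the-envelope sketch. The ~8 m/s² deceleration is an assumed typical value for full ABS braking on dry pavement, not a figure from Xiaomi's report; only the 116 kph starting speed comes from the article.

```python
# Rough arithmetic: how much speed committed braking sheds per second.
# The 8 m/s^2 deceleration is an assumed typical figure, not from the report.

KPH_TO_MS = 1000 / 3600  # kph -> m/s conversion factor

def speed_after_braking(v0_kph: float, seconds: float, decel: float = 8.0) -> float:
    """Speed in kph after `seconds` of constant braking at `decel` m/s^2."""
    v = v0_kph * KPH_TO_MS - decel * seconds
    return max(0.0, v) / KPH_TO_MS

# Starting from 116 kph, the reported speed when NOA issued its warning:
for t in (1, 2, 3):
    print(f"after {t} s of full braking: {speed_after_braking(116, t):.0f} kph")
```

Under that assumption, even two seconds of committed braking from 116 kph would leave the car below 60 kph; the reported 97 kph impact speed corresponds to far less deceleration than that.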

Re: (Score:3)

by MBGMorden ( 803437 )

I have to agree with you there. So long as they're expecting the human to take control "if needed", these systems will be impractical. I can accept that they might not be 100% accurate all the time and that accidents, even fatal ones, are inevitable. The incidence just needs to be lower than that of human drivers to be acceptable. If they can't drive autonomously without human input, though, then to me it's pointless.

I'll buy one when they get to a point where they don't even have manual controls anymore.

Re: (Score:2)

by Joe_Dragon ( 2206452 )

That is, they put it that way so they don't have to pay out anything.

Re: Humans can't take over (Score:2, Funny)

by databasecowgirl ( 5241735 )

Aye, the only point of a self driving car is so you can stay home while the car does the errands.

If it can't drop the kids off at piano lessons and pick up the groceries from the hypermarket on its own, what's the point?

Re: (Score:2)

by Firethorn ( 177587 )

Picking up groceries leads to an interesting thought.

What if, instead of you sending your car to do it, the hypermarket has a van, perhaps something similar to a Schwan's truck with refrigerated and freezer sections, and it delivers to your whole neighborhood that day?

Heck, consider a self-driving pizza delivery vehicle, with pizza ovens inside.

Re: (Score:3)

by sinij ( 911942 )

You have to consider [1]Type I and Type II [wikipedia.org] errors. Taking over is fine in a Type I (false alarm) scenario, but completely unrealistic in the others. And if the self-driving system takes drastic action every time, it will cause accidents in the Type I scenarios.

[1] https://en.wikipedia.org/wiki/Type_I_and_type_II_errors
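A toy sketch of that tradeoff (the detector scores, labels, and thresholds below are invented purely for illustration):

```python
# Illustrative sketch of the Type I / Type II tradeoff for an obstacle
# detector. All numbers here are made up for illustration.

def classify(scores, threshold):
    """Return (false_alarms, misses) for (score, is_real_obstacle) pairs."""
    false_alarms = sum(1 for s, real in scores if s >= threshold and not real)
    misses = sum(1 for s, real in scores if s < threshold and real)
    return false_alarms, misses

# (detector confidence score, whether an obstacle was actually there)
samples = [(0.9, True), (0.8, False), (0.6, True), (0.4, False), (0.3, True)]

# A low threshold brakes for shadows (Type I errors); a high one
# misses real obstacles (Type II errors).
print(classify(samples, 0.2))  # many false alarms, no misses
print(classify(samples, 0.7))  # fewer false alarms, more misses
```

Braking hard on every detection drives the Type II count to zero at the cost of frequent phantom braking, which is itself dangerous in dense traffic.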

Re:Humans can't take over (Score:4, Insightful)

by AmiMoJo ( 196126 )

Like aeronautical autopilot systems, these are driver aids. They are great in situations like a long cruise down a highway, where they take care of making tiny adjustments to keep the car in the centre of the lane and following the curves, and in stop-start traffic. The same sort of thing pilots use them for - hands off the flight controls, let the autopilot maintain altitude and heading, or follow the glide slope in to land.

The idea with the aircraft systems is that not only is the autopilot very good at those things, it reduces the workload on the pilots so that they can concentrate on monitoring other things, or be more rested and alert when they need to take over.

You make a fair point about the automatic braking, though. It depends on why it disconnected; it could have been because the driver took over, rather than because it detected a situation it couldn't cope with.

Re: Humans can't take over (Score:2)

by getuid() ( 1305889 )

The problem isn't that it would have been wrong to do as you say this time; it's that it's not a good response every time.

Sometimes there really is nothing wrong; it's just the perception of the robot/car that's off. The car "knows" that something's off, it just can't judge what exactly. Suddenly braking might turn a perfectly safe situation into a dangerous one (e.g. when driving in dense traffic).

Damned if it stops, damned if it doesn't... well, that's why it's called "not fully self-driving capable".

Re: (Score:2)

by larryjoe ( 135075 )

> The whole point of having a car "drive itself" is that you aren't doing it.

> Expecting someone to go from not doing it at all to doing it at full highway speed immediately is bananas.

ADAS and Level 2 require the driver's eyes on the road constantly, ready to take over immediately. Level 3 does not have an immediate-takeover requirement. The only commercially offered Level 3 cars work only at low speeds on limited-access highways and allow several seconds for human takeover. That's also why all Tesla cars are Level 2.

Su-7 (Score:2)

by Latent Heat ( 558884 )

[1]https://en.wikipedia.org/wiki/... [wikipedia.org]

Autopilot, yes, but probably dangerous to operate?

[1] https://en.wikipedia.org/wiki/Sukhoi_Su-7

What's most important here: (Score:2)

by Gravis Zero ( 934156 )

What's most important here is whether they actually learn why this accident happened in the first place and how to prevent it from happening again. If nothing was learned, then the people who died have died in vain. If the data from the collision can be used to improve the autopilot so that it will avoid similar situations, then this may have been the unfortunate cost of progress. This may seem cold, but the unfortunate reality is that people die in car accidents every day. The problem we've faced is that all the

At least they call it driver assist (Score:2)

by caseih ( 160668 )

At least they are calling it a form of driver assist, which is what it is, just like cruise control or lane-keeping: it requires the driver to still be actively in charge of the car's operation, at least in a supervisory capacity. Musk's refusal to stop calling Tesla's driver-assist system "Full Self-Driving" has always been disingenuous. China has banned Tesla's "Full Self-Driving" feature because of this.

As far as the car itself goes, Ford's CEO has been driving an SU7 for six months as his daily driver in the

Re: (Score:2)

by SuperDre ( 982372 )

Wait a minute, Tesla has two options, FSD and Autopilot, which are completely different. Tesla's Autopilot is what's comparable to the system mentioned in the article.

So before bitching about Tesla again, make sure you are talking about the same options.

FSD is not banned; it just isn't certified yet, and there is no country in the world where FSD is certified.

Driverless Vehicles-Solving Paying Drivers Problem (Score:2)

by BrendaEM ( 871664 )

A lot of people like to drive. People would not spend their life savings to restore a classic car if they did not like to drive. While people don't want to drive to work, much of that dislike is about being in a dangerous vehicle, and I do not think that driverless vehicles solve that problem. DARPA first ran a driverless vehicle challenge in the desert because that's where they needed to kill people; like the DARPA challenge, driverless vehicles are an instrument against people. Driverless vehicles likely

Scared? Humans are far worse. (Score:2)

by backslashdot ( 95548 )

This is scary, but do remember that 260,000 people die in Chinese traffic accidents every year, nearly all due to an error a human made (the US number is 40,000, btw). Reference: [1]https://www.scmp.com/news/chin... [scmp.com] The vast majority of those dying are not the at-fault driver, but some other human: a passenger, another driver, or a pedestrian. And let's not get into debilitating injuries. We really need computers to take over driving ASAP.

[1] https://www.scmp.com/news/china/society/article/1952218/traffics-toll-road-accidents-kill-700-people-day-china
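A quick check of the annual figures above against the linked headline's "700 people a day" (using only the numbers already quoted in the comment):

```python
# Convert the quoted annual road-death figures to per-day rates.
china_annual = 260_000  # China, per the comment and the SCMP link
us_annual = 40_000      # United States, per the comment

print(round(china_annual / 365))  # ~712 deaths/day, consistent with "700 a day"
print(round(us_annual / 365))     # ~110 deaths/day
```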

Re: (Score:2)

by Viol8 ( 599362 )

"We really need computers to take over driving ASAP."

No we don't. Life has risks; get over it. You might as well say we should all be in AI-driven wheelchairs instead of walking, just in case we do something stupid like step out in front of a vehicle.

Generally it tends to be people who don't or can't drive who shill the most for fully automated vehicles, which, once they do become common, will be a signal for insurance companies to jack up the price of human-driven vehicles to levels people can't afford, or for

LIDAR should be required by law (Score:2)

by kriston ( 7886 )

LIDAR should be required by law for "autonomously" driving cars.

The Tesla crashing into the Wile E. Coyote-style painted tunnel is even more proof of this.
