Xiaomi SU7 involved in fatal crash, assisted driving feature questioned

Electric cars don’t burn gasoline, but they are more prone to catching fire and can’t be stopped once they crash.
The marketing claims autonomous driving needs no driver, but the moment something goes wrong it turns out to be mere “driver assistance” that you were never supposed to rely on completely.

The above is not directed at Xiaomi.

Autonomous driving technology should be prohibited on public roads.



Traffic cones, water-filled barriers, and warning tripods don’t trigger AEB :sweat_smile:
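
A rough illustration of why that can happen: many camera-based AEB stacks only brake for a whitelist of object classes and suppress stationary returns to avoid phantom braking. The sketch below is a hypothetical Python gate under those assumptions; the class names, thresholds, and detection format are all made up and not any vendor’s actual logic.

```python
# Hypothetical sketch: why a class-gated AEB layer can ignore unusual static obstacles.
# All class names, thresholds, and the detection format are illustrative assumptions,
# not any manufacturer's real implementation.

from dataclasses import dataclass

# Classes the (hypothetical) perception model was trained to brake for.
BRAKE_CLASSES = {"car", "truck", "pedestrian", "cyclist"}

@dataclass
class Detection:
    label: str         # classifier output, e.g. "car" or "unknown"
    confidence: float  # classifier confidence in [0, 1]
    distance_m: float  # longitudinal distance to the object
    is_moving: bool    # stationary returns are often suppressed to avoid false braking

def should_emergency_brake(det: Detection,
                           conf_threshold: float = 0.8,
                           brake_distance_m: float = 30.0) -> bool:
    """Naive AEB gate: brake only for confidently recognized, whitelisted targets."""
    if det.label not in BRAKE_CLASSES:
        return False                      # cones / water barriers / tripods fall out here
    if det.confidence < conf_threshold:
        return False                      # low-confidence detections are suppressed
    if not det.is_moving and det.label != "pedestrian":
        return False                      # stationary clutter filtered to avoid phantom braking
    return det.distance_m < brake_distance_m

# A construction-zone water barrier: classified as "unknown", stationary -> no braking.
print(should_emergency_brake(Detection("unknown", 0.55, 25.0, False)))  # False
# A moving lead car at 25 m -> braking.
print(should_emergency_brake(Detection("car", 0.93, 25.0, True)))       # True
```

Under that kind of gating, a stationary barrier classified as “unknown” never even reaches the braking condition, which matches the behaviour people complain about.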

I’ve heard of people around me using assisted driving on the highway too; given what I understand about AI, the very thought is terrifying.

Assisted driving puts the driver in a wait-to-intervene role, which is dangerous from the outset. Manual control keeps the driver alert; once assisted driving runs continuously without ever needing intervention, the driver inevitably grows complacent.

A few years ago I ran into an extremely dangerous situation: I was driving on the highway in the evening when a goat crossed the road ahead, and I nearly hit it. Some freak incidents are simply impossible to anticipate.


https://archive.ph/thyKq

Indeed, this reminds me of the rule- and data-based autonomous driving systems from a few years ago, which crashed, sometimes fatally, when they ran into situations outside their training data.
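
That failure mode is easy to see even in a toy setup: a learned policy can only name what it was trained on, and the usual mitigation, thresholding the model’s confidence and handing control back when it is low, is itself shaky because classifiers are often confidently wrong on inputs they have never seen. The sketch below is purely illustrative; the scene labels, threshold, and decision strings are my own assumptions, not any real system’s API.

```python
# Toy illustration of the out-of-distribution problem in a learned driving stack.
# The scene labels, threshold, and return strings are invented for this example.

def plan_action(scene_probs: dict, min_confidence: float = 0.9) -> str:
    """Pick an action from the perception model's scene probabilities.

    If no trained scene explains the input confidently, the only safe move is
    to disengage and alert the driver rather than act on a guess.
    """
    best_scene, best_p = max(scene_probs.items(), key=lambda kv: kv[1])
    if best_p < min_confidence:
        return "disengage_and_alert_driver"   # treat the input as out-of-distribution
    return f"follow_policy_for:{best_scene}"

# In-distribution: a clear highway the model has seen countless times.
print(plan_action({"clear_highway": 0.97, "urban_traffic": 0.02, "light_rain": 0.01}))
# -> follow_policy_for:clear_highway

# Out-of-distribution: a night-time construction zone spreads probability thinly.
print(plan_action({"clear_highway": 0.40, "urban_traffic": 0.35, "light_rain": 0.25}))
# -> disengage_and_alert_driver
```

The catch, and arguably what those earlier crashes showed, is that real networks frequently assign high confidence to scenes they have never seen, so a threshold like this cannot be trusted as the only safety net.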

I haven’t looked closely at the details of this news, but the very fact that the feature is called “driver assistance” suggests that even major car manufacturers cannot guarantee flawless “automation”. For your own safety, especially on fast or hazardous stretches of road, it is absolutely unacceptable to hand control entirely over to the AI.