this post was submitted on 19 Mar 2025
1493 points (98.3% liked)

Not The Onion


In the piece — titled "Can You Fool a Self Driving Car?" — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

[–] comfy@lemmy.ml 142 points 3 days ago (23 children)

I hope some of you actually skimmed the article and got to the "disengaging" part.

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver-assistance software shuts off a fraction of a second before impact.

It's a highly questionable approach that has raised concerns over Tesla trying to evade liability by automatically shutting off any possibly incriminating driver-assistance features moments before a crash.

[–] PersnickityPenguin@lemm.ee 6 points 2 days ago (3 children)

Yeah, but that's milliseconds. Ergo, the crash was already going to happen.
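For scale, here's a back-of-envelope calculation. The speed, disengagement window, and deceleration figures are assumptions chosen for illustration, not numbers from the article:

```python
# Hypothetical back-of-envelope check: how far a car travels in the
# fraction of a second between Autopilot disengaging and impact, versus
# the distance it would need to brake to a stop. All numbers assumed.

def distance_in_window(speed_kmh: float, window_s: float) -> float:
    """Metres covered at constant speed during a short time window."""
    return speed_kmh / 3.6 * window_s

def braking_distance(speed_kmh: float, decel_ms2: float = 9.0) -> float:
    """Metres needed to brake to a full stop at a given deceleration
    (~9 m/s^2 is roughly the limit of good tyres on dry tarmac)."""
    v = speed_kmh / 3.6
    return v * v / (2 * decel_ms2)

speed = 64.0   # ~40 mph, an assumed test speed
window = 0.1   # "a fraction of a second" before impact

print(f"travelled in {window}s: {distance_in_window(speed, window):.1f} m")
print(f"full braking distance: {braking_distance(speed):.1f} m")
# The car covers ~1.8 m in that window but would need ~17.6 m to stop,
# so a disengagement milliseconds out cannot change the outcome.
```

Whatever the exact figures, the gap between the two distances is an order of magnitude, which is the commenter's point.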

In any case, the problem with Tesla Autopilot is that it doesn't have radar. It can't reliably detect obstacles, and there have been many instances of a Tesla crashing into a large, clearly visible object.

[–] sudo@programming.dev 3 points 2 days ago (2 children)

That's what's confusing me. Rober's hypothesis is that without lidar the Tesla couldn't detect the wall. But claiming that Autopilot shut itself off before impact means the Tesla did detect the wall and decided impact was imminent, which undercuts his point.

If you watch the in-car footage, Autopilot is on for all of three seconds, and by the time it's on, the impact was already unavoidable. That said, Teslas should have lidar and probably do something other than disengage before hitting the wall, but I suspect the cameras were good enough to detect the wall through the lack of parallax or something like that.
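The parallax point can be made concrete with a toy model. This is a minimal sketch under made-up numbers (the pixel offsets, depths, and threshold are all illustrative assumptions, and none of this reflects Tesla's actual vision stack):

```python
# Hypothetical sketch of the "lack of parallax" cue: as a camera moves
# forward, the apparent motion (optical flow) of a feature scales with
# 1/depth, so distant road texture barely moves while nearby texture
# streams past. A flat painted wall puts every "distant" pixel at the
# same true depth, so its flow is uniform instead of graded -- a hint
# that the scene is really a flat surface.

def flow_px(depth_m: float, forward_step_m: float = 1.0,
            offset_px: float = 100.0) -> float:
    """Approximate pixel shift of a feature at depth_m after the camera
    advances one step (small-angle pinhole approximation:
    flow ~ x_px * dZ / Z, with x_px the offset from the focus of expansion)."""
    return offset_px * forward_step_m / depth_m

# Real road: texture patches lying 5 m to 100 m ahead.
road_flow = [flow_px(5.0 + 5.0 * i) for i in range(20)]
# Painted wall 10 m ahead: the same depicted patches all sit at one depth.
wall_flow = [flow_px(10.0) for _ in range(20)]

def looks_flat(flow: list, spread_threshold_px: float = 1.0) -> bool:
    """Crude flatness test: a real road shows a wide spread of flow
    magnitudes; a fronto-parallel wall shows almost none."""
    return max(flow) - min(flow) < spread_threshold_px

print(looks_flat(road_flow))  # False: flow varies strongly with distance
print(looks_flat(wall_flow))  # True: uniform flow betrays a flat surface
```

In other words, even without lidar, the geometry of the image motion alone could distinguish a painted wall from a real road receding into the distance.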

[–] Amm6826@lemmy.ml 8 points 2 days ago

Or maybe it still has short-distance parking sensors, and if those detect something solid it disables Autopilot?
