This post was submitted on 16 Mar 2025
1722 points (99.1% liked)

Not The Onion

Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.

Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.

The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!

(page 2) 50 comments
[–] happydoors@lemm.ee 52 points 1 day ago (1 children)

I love that one of the largest YouTubers is the one who did this. Surely somebody near our federal government will throw a hissy fit when they hear about this, but Mark's audience is ginormous.

[–] buddascrayon@lemmy.world 33 points 1 day ago (1 children)

Honestly, I think Mark should be more scared of Disney coming after him for mapping out their Space Mountain ride.

[–] PraiseTheSoup@lemm.ee 10 points 1 day ago (1 children)

He probably just made Disney admissions and security even more annoying for everyone else.

[–] rational_lib@lemmy.world 64 points 1 day ago* (last edited 1 day ago) (8 children)

The rain test was far more concerning because it's a much more realistic scenario. Both a normal person and the lidar would've seen the kid and stopped, but the cameras and image processing just aren't good enough to make out a person in the rain. That's bad. The test portrays it as a person in the middle of a straight road, but I don't see why the same thing wouldn't happen at a crosswalk or anywhere else pedestrians are often in the path of a vehicle. If an autonomous system cannot reliably make out pedestrians in the rain, that alone should be enough to prevent these vehicles from being legal.
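
(To make the camera-versus-lidar point concrete, here is a minimal, purely illustrative sketch. It is not Tesla's or any vendor's actual code; the `Detection` class, the `should_brake` function, and every threshold below are hypothetical. It only shows why a redundant range sensor can keep the brake decision alive when rain washes out the camera.)

```python
# Toy sketch only: a camera-only "brake decision" fails when rain degrades the
# image confidence, while a fused camera+lidar check still stops.
# All names and numbers are made up for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    confidence: float  # 0.0-1.0: how sure this sensor is that a pedestrian is ahead

def should_brake(camera: Detection, lidar: Optional[Detection], threshold: float = 0.6) -> bool:
    """Brake if any available sensor is confident enough that something is in the path."""
    if camera.confidence >= threshold:
        return True
    # Lidar measures range with its own laser returns, so rain degrades it far less.
    return lidar is not None and lidar.confidence >= threshold

print(should_brake(Detection(0.92), lidar=None))       # clear day, camera only  -> True (stops)
print(should_brake(Detection(0.25), lidar=None))       # heavy rain, camera only -> False (drives on)
print(should_brake(Detection(0.25), Detection(0.95)))  # heavy rain, with lidar  -> True (stops)
```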

[–] PlaidBaron@lemmy.world 26 points 1 day ago

Who owns the White House right now?

[–] Gammelfisch@lemmy.world 12 points 1 day ago

Yep, I could see someone placing a billboard like that with a cliff behind it.

[–] captain_aggravated@sh.itjust.works 281 points 2 days ago (5 children)

OMFG someone test to see if Teslas stop to eat free bird seed.

[–] futatorius@lemm.ee 16 points 1 day ago (1 children)

Suddenly, there are more Yellow Brick Road murals everywhere.

[–] icecream@lemmy.world 7 points 1 day ago* (last edited 1 day ago) (2 children)

A building owner wouldn't want cars crashing into their property, though. Why would they commission a mural to intentionally deceive a robot car?

[–] Retropunk64@lemm.ee 7 points 1 day ago (2 children)

Because it's fucking funny.

[–] King3d@lemmy.world 46 points 1 day ago (14 children)

This is like the crash on a San Francisco bridge that happened because a Tesla went into a tunnel and wasn't sure what to do going from bright daylight to darkness. In that case the Tesla suddenly merged lanes, then immediately stopped, causing a multi-car pileup.

[–] Iheartcheese@lemmy.world 179 points 2 days ago (1 children)

It got fucking Wile E. Coyote'd.

[–] fubarx@lemmy.world 47 points 1 day ago (15 children)

There's a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:

Congress will pass a law that makes NOBODY liable -- as long as a human wasn't involved in the decision-making process during the incident.

This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can't be held liable. 🤷🏻‍♂️

Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck, everyone else!

[–] Duke_Nukem_1990@feddit.org 78 points 2 days ago

TIL Mark Rober is a domestic terrorist
