
TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] RvTV95XBeo@sh.itjust.works 49 points 15 hours ago (27 children)

Maybe I'm just getting old, but I honestly can't think of any practical use case for AI in my day-to-day routine.

ML algorithms are just fancy statistics machines, and in that capacity I can see plenty of research and industry applications where large datasets need to be assessed (weather, medicine, ...) with human oversight.

But for me in my day to day?

I don't need a statistics bot making decisions for me at work, because if it was that easy I wouldn't be getting paid to do it.

I don't need a giant calculator telling me when to eat or sleep or what game to play.

I don't need a Roomba with a graphics card automatically replying to my text messages.

Handing over my entire life's data just so an ML algorithm might be able to tell me the name of that one website I visited 3 years ago that sold kangaroo testicles isn't a filing system. There's nothing I care about losing enough to go to the effort of setting up Copilot, but not enough to just, you know, bookmark it or save it with a clear enough file name.

Long rant, but really, what does Copilot actually do for me?

[–] Honytawk@feddit.nl 1 points 8 hours ago (3 children)

How about real-time subtitles on movies in any language you want that are always synced?

VLC is working on that using LLMs.
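
The pipeline behind a feature like that is speech recognition plus translation, with the recognized segments written out as timed subtitle cues. Here's a minimal sketch in Python, assuming the openai-whisper package purely for illustration (not necessarily what VLC actually uses); the model size and file names are placeholder choices.

```python
# Minimal sketch: generate English .srt subtitles from a movie's audio track.
# Assumes the openai-whisper package; model size and file names are illustrative.
import whisper


def srt_timestamp(seconds: float) -> str:
    """Format a time in seconds as an SRT timestamp (HH:MM:SS,mmm)."""
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"


def audio_to_srt(audio_path: str, srt_path: str) -> None:
    model = whisper.load_model("small")
    # task="translate" produces English output regardless of the source
    # language; use task="transcribe" to keep the original language.
    result = model.transcribe(audio_path, task="translate")

    cues = []
    for i, seg in enumerate(result["segments"], start=1):
        cues.append(str(i))
        cues.append(f"{srt_timestamp(seg['start'])} --> {srt_timestamp(seg['end'])}")
        cues.append(seg["text"].strip())
        cues.append("")  # blank line separates SRT cues

    with open(srt_path, "w", encoding="utf-8") as f:
        f.write("\n".join(cues))


audio_to_srt("movie_audio.wav", "movie.en.srt")
```

Doing this truly in real time, as described above, means streaming chunks of audio through the model instead of transcribing a whole file up front, but the cue-generation step is the same.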

[–] Dragonstaff@leminal.space 5 points 5 hours ago

We've had speech-to-text since the 90s. Current iterations have improved, like most technology has improved since the 90s. But, no, I wouldn't buy a new computer with glaring privacy concerns for real-time subtitles in movies.

[–] zurohki@aussie.zone 5 points 5 hours ago (1 children)

I tried feeding Japanese audio to an LLM to generate English subs and it started translating silence and music as requests to donate to anime fansubbers.

No, really. Fansub groups would put their donation messages over the intro music or wherever there wasn't any speech to sub, and the LLM learned that.
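
That hallucination pattern (speech-to-text models emitting text over silence or music, because that's what accompanied silence in their training data) is well documented, and one common mitigation is to drop segments the model itself flags as probably not speech. A minimal sketch, using the openai-whisper package as a stand-in for whatever model the commenter actually fed the audio to; the threshold and file name are illustrative guesses, not tuned values.

```python
# Sketch: drop subtitle segments that are probably hallucinated over non-speech.
# Assumes the openai-whisper package; Whisper reports a no_speech_prob per
# segment, and filtering on it removes most "translations" of silence or music.
import whisper

NO_SPEECH_THRESHOLD = 0.6  # illustrative cutoff, not a tuned value

model = whisper.load_model("small")
result = model.transcribe("episode01.wav", task="translate")

kept = [
    seg
    for seg in result["segments"]
    if seg["no_speech_prob"] < NO_SPEECH_THRESHOLD
]

for seg in kept:
    print(f"{seg['start']:7.1f}s  {seg['text'].strip()}")
```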

[–] blakestacey@awful.systems 4 points 3 hours ago

All according to k-AI-kaku!

[–] Bytemeister@lemmy.world 1 points 3 hours ago

You're thinking too small. AI could automatically dub the entire movie, mimicking the actors' voices while simultaneously moving their lips and mouths to form the words correctly.

It would just take your daily home power usage to do a single two-hour movie.
