From their example, it seems like all they've "innovated" is a new, less reliable way to write database queries!
Yep. And query languages are some of the quickest tools an analyst has, written with 100% knowledge of the data and of whatever wrangling/conditions are needed to ensure accurate results.
A bot would never accurately answer these questions off my data unless I thoroughly trained and tested it. And if it's GPT-based, I'd always have to double-check its output, so it would just be a hindrance in the workflow. There's no way I'd pay a third party for that.
Since there’s a mathematical proof that LLMs without hallucinations are impossible, I think this kind of usage is a lost cause.