Because of studies like https://arxiv.org/abs/2211.03622 ("Do Users Write More Insecure Code with AI Assistants?"):
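The short version: participants who had an AI assistant produced less secure solutions on tasks like SQL handling and encryption than the control group, and were more confident that their code was secure. As a rough illustration of the kind of slip that study measured (my own sketch in Python, not code taken from the paper), compare an injectable query with the parameterized version a reviewer would insist on:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # The pattern that gets flagged: user input spliced straight into SQL,
    # so a username like "x' OR '1'='1" dumps every row in the table.
    query = f"SELECT id, email FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterized query: the driver handles escaping, no injection.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice', 'a@example.com')")
    conn.execute("INSERT INTO users VALUES (2, 'bob', 'b@example.com')")
    payload = "nobody' OR '1'='1"
    print(find_user_unsafe(conn, payload))  # leaks both rows
    print(find_user_safe(conn, payload))    # returns []
```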
Seems like this is a good argument for specialization. Have AI make bad but fast code, and pay specialists to improve it and make it secure when needed. My 2026 Furby with no connection to the outside world doesn't need secure code, it just needs to make kids smile.
They're called programmers, and it's faster and less expensive all around to just have humans do it better the first time.
Have you talked to any programmers about this? I know several who, in the past 6 months alone, have completely changed their view on exactly how effective AI is at automating parts of their coding. Not only are they using it, they are paying to use it because it gives them a personal return on investment. But you know, you can keep using that push lawnmower, just don't complain when the kids next door run circles around you at a quarter the cost.
Automating parts of something as a reference tool is WILDLY different from deferring to AI to finalize your code, which will be shitcode.
Anybody programming right now who is letting AI code go out the door is bad at their job.
No argument there.
Have you had to code review someone who is obviously just committing AI bullshit? It is an incredible waste of time. I know people who learned pre-LLM (i.e. have functioning brains) and are practically on the verge of complete apathy from having to babysit AI code/coders, especially as their management keeps pushing people to use it. As in, LLM usage is now a performance metric.
Congratulations on offloading your critical thinking skills to a chatbot that you most likely don't own. What are you gonna do when the bubble is over, or when the data center running it burns down?
That push lawnmower will still mow the lawn in decades to come though, while the kids' fancy high-tech lawnmower will explode in a few months, and you're lucky if it doesn't burn the entire house down with it.