[–] iAvicenna@lemmy.world 5 points 1 day ago* (last edited 1 day ago) (1 children)

LLMs are good for getting keywords that may be relevant to a topic you are interested in (but don't know much about), so you can then go search for it in a more targeted manner. Unfortunately Google has become so bad that an LLM actually gives more relevant answers to vague questions. The keywords in those answers then often get you where you want to go with a bit more research and overall shorten the time it takes to get there.

For instance, if you ask it a simple coding question, it generally suggests the correct functions, libraries, etc. to use, even though the code itself may be buggy. So it makes for a good starting point for your search.

That being said, I am not sure this is worth the long-term damage the AI industry might cause.

[–] lvxferre@mander.xyz 2 points 1 day ago

It's basically my experience with translation, too: asking an LLM is a decent way to find potential translations for a specific problematic word, so you can look them up in a dictionary and see which one fits best. It's also a decent way to generate simple conjugation/declension tables. But once you tell it to translate any chunk of meaningful text, there's a high chance it'll shit itself and output something semantically, pragmatically, and stylistically bad.