[–] Geometrinen_Gepardi@sopuli.xyz 1 points 1 week ago (1 children)

How do you get it to search the internet?

[–] brucethemoose@lemmy.world 2 points 5 days ago* (last edited 5 days ago)

The front end.

Some UIs (like Open WebUI) have built-in “agents” or extensions that can fetch and parse search results and add them to the context, allowing LLMs to “research.” There are in fact some finetunes specializing in this, though these days you are probably best off with regular Qwen3.

This is sometimes called tool use.
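
To give a rough idea of what that looks like under the hood (a generic sketch, not how Open WebUI specifically implements it): the front end advertises a search function to the model using the OpenAI-style tools format, and if the model replies with a tool call instead of text, the front end runs the search and appends the results to the conversation. The `web_search` name below is just a placeholder.

```python
# Hypothetical tool declaration a front end might send along with the chat request
# (OpenAI-style "tools" format). If the model answers with a tool call, the front
# end executes the search and feeds the results back as a follow-up message.
web_search_tool = {
    "type": "function",
    "function": {
        "name": "web_search",  # placeholder name, not a real Open WebUI tool
        "description": "Search the web and return the top result snippets.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "The search query."},
            },
            "required": ["query"],
        },
    },
}
```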

I also (sometimes) use a custom Python script (modified from another repo) for research, having the LLM search a bunch of stuff and work through the results.

But fundamentally the LLM isn’t “searching” anything; you are just programmatically feeding it text (and maybe running the search queries it asks for and feeding those results back in).
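
Here is a minimal sketch of that loop, assuming a local OpenAI-compatible endpoint (TabbyAPI exposes one; adjust the URL, and add a model name or API key if your server requires them). The `search()` helper is a stub to swap for whatever search API or scraper you actually use:

```python
import requests

API = "http://localhost:5000/v1/chat/completions"  # local OpenAI-compatible server

def ask(messages):
    # Send one chat request to the local backend and return the reply text.
    # Add an "Authorization" header / "model" field here if your server needs them.
    r = requests.post(API, json={"messages": messages, "max_tokens": 1024})
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"]

def search(query):
    # Stub: replace with whatever search API or scraper you actually use.
    # Should return a list of plain-text result snippets.
    raise NotImplementedError

question = "What changed in the latest Qwen3 release?"

# 1. Ask the model what it wants to look up.
terms = ask([{"role": "user",
              "content": f"Give a short web search query for: {question}"}])

# 2. Run the search ourselves and collect the result text.
snippets = "\n\n".join(search(terms))

# 3. Feed the results back in as context and ask for the real answer.
answer = ask([{"role": "user",
               "content": f"Using only these search results:\n{snippets}\n\n"
                          f"Answer the question: {question}"}])
print(answer)
```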

The backend for all this is a TabbyAPI server, with 2-4 parallel slots for fast processing.
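
Those slots are what let a script like the one above fire several requests at once, e.g. summarizing multiple fetched pages in parallel. A rough sketch with a thread pool, using the same placeholder endpoint as before:

```python
from concurrent.futures import ThreadPoolExecutor
import requests

API = "http://localhost:5000/v1/chat/completions"  # adjust to your TabbyAPI host/port

def summarize(page_text):
    # One request per page; the backend spreads concurrent requests across its slots.
    r = requests.post(API, json={
        "messages": [{"role": "user",
                      "content": f"Summarize the key points:\n\n{page_text}"}],
        "max_tokens": 512,
    })
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"]

pages = ["...page 1 text...", "...page 2 text...", "...page 3 text..."]

# Match the worker count to however many parallel slots the server is configured with.
with ThreadPoolExecutor(max_workers=3) as pool:
    summaries = list(pool.map(summarize, pages))
```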