catty

joined 4 days ago
[–] catty@lemmy.world 2 points 10 hours ago (1 children)

Doesn't Twitch own all the data that's written? Their TOS will state something like you can't store the data yourself locally.

[–] catty@lemmy.world 3 points 17 hours ago (3 children)

No, what is it? How do I try it?

[–] catty@lemmy.world 1 points 18 hours ago (1 children)

Surely none of that uses a small LLM <= 3B?

 

I've tried coding, and every one I've tried fails on anything beyond really, really basic small functions, the kind you write as a newbie. Compare that to, say, 4o-mini, which can spit out more sensible stuff that actually works.

I've tried asking for explanations, and they just regurgitate sentences that can be irrelevant or wrong, or they get stuck in a loop.

So, what can I actually use a small LLM for? Which ones? I ask because I have an old laptop whose GPU can't really handle anything above 4B in a timely manner. 8B is about 1 t/s!
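For anyone wanting to sanity-check their own t/s numbers, here's a minimal sketch using the gpt4all Python package (`pip install gpt4all`). The model filename is just an example; swap in whatever small (1-3B) GGUF you actually have, and note the token count is a crude estimate:

```python
# Rough tokens-per-second benchmark for a small local model.
# Model name below is an example; gpt4all will download it if missing.
import time
from gpt4all import GPT4All

model = GPT4All("Llama-3.2-1B-Instruct-Q4_0.gguf")

prompt = "Explain what a linked list is in two sentences."
start = time.time()
with model.chat_session():
    reply = model.generate(prompt, max_tokens=128)
elapsed = time.time() - start

# Crude estimate: roughly 0.75 tokens per word of output.
est_tokens = len(reply.split()) / 0.75
print(f"~{est_tokens / elapsed:.1f} tokens/sec")
```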

[–] catty@lemmy.world 1 points 22 hours ago

That's not the point here. People probably do not need a Pi 5. There are many other Pi devices (and similar boards) with significantly less draw.

[–] catty@lemmy.world 0 points 1 day ago

I like how all the big media outlets just happened to publish articles about "peaceful protests" to pacify the protestors: "please don't be violent, everyone, the police can't shoot you all in the leg with rubber bullets".

The violent protests are the successful ones. Just ask every country, ever.

[–] catty@lemmy.world 1 points 1 day ago

Damn, I should have ended the post with /s for people like you.

[–] catty@lemmy.world -1 points 1 day ago

See, here's the thing. Why would anyone want to host ALL the stuff on one Pi? That is not what they were designed for. Ollama on a Pi? Are you out of your mind? I'd run the biggest model I can on a modern GPU, not some crappy old computer or Pi. Right tool, right job.

And why is dropping containers "less secure"? Do you mean "less cool"? Less easy to deploy? But you're not deploying it, you're installing it. You sound like a complete newb, which is fine, but take a step back and get some more experience.

A Pi is a tool for a purpose, not the be-all and end-all. Using an old laptop is not going to save the world, and arguing that it's just better than a Pi (or a similar alternative) is just dumb. Use a laptop for all I care; I'm not the boss of you.

As for an *arr stack, I'm really disappointed with the software and don't use it. Those who do have way too much time, both to set it up and then to make use of it!

[–] catty@lemmy.world 1 points 2 days ago (2 children)

I can self-host what I want on a Pi Zero. But I do have some 30 years of experience, so I can probably do things some won't understand or bother with.

[–] catty@lemmy.world 4 points 2 days ago (3 children)

I'm sure Silicon Valley types are stepping on each other, vying to get their hands on these super-cheap laptops for their 24/7 AI training.

[–] catty@lemmy.world 1 points 2 days ago* (last edited 2 days ago)

It's even worth pointing out that you can disable various parts of the Pi so it needs even less juice.
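A sketch of the idea, assuming a headless Pi on Raspberry Pi OS (exact dtoverlay/dtparam names vary by model and firmware version, so check yours):

```
# /boot/config.txt - trim power draw on a headless Pi
dtoverlay=disable-wifi        # turn off the Wi-Fi radio
dtoverlay=disable-bt          # turn off the Bluetooth radio
dtparam=act_led_trigger=none  # stop the activity LED blinking
dtparam=audio=off             # disable onboard audio
```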

[–] catty@lemmy.world 1 points 2 days ago (1 children)

> Pi’s are ARM-based, which still to this day limits the scope of their applicability.

Untrue.

> Also, you should absolutely inspect a laptop before buying. Many, if not most, old laptops will run just fine for the next few years.

Until the battery needs replacing (costing more than a Pi), one key on the keyboard dies, etc.

[–] catty@lemmy.world 0 points 2 days ago (4 children)

Please be specific rather than lumping all 'Raspberry Pis' together. Different models have very different characteristics.

 

I was looking back at some old Lemmy posts and came across GPT4All. Didn't get much sleep last night because it's awesome, even on my old (10-year-old) laptop with a Compute 5.0 Nvidia card.

Still, I'm after more. I'd like:

- image creation, viewable in the conversation
- if it generates Python code, the ability to run it (I'm using Debian and have a default Python env set up)
- local file analysis
- CUDA Compute 5.0 / Vulkan compatibility, with the option to use some of the smaller models (1-3B, for example)
- a local API for my own Python experiments

Is there anything that can tick all the boxes, even if I have to hop across models for some of the features? I'd prefer a desktop client application over a Docker container running in the background.
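On the local API point: the GPT4All desktop app can expose an OpenAI-compatible local server (off by default; there's a setting to enable it). A minimal sketch, assuming the default port 4891 and that the model name matches one loaded in the app:

```python
# Query GPT4All's local API server (enable it in the app's settings first).
# Port 4891 is the app's default; model name below is an example.
import requests

resp = requests.post(
    "http://localhost:4891/v1/chat/completions",
    json={
        "model": "Llama 3.2 1B Instruct",
        "messages": [{"role": "user", "content": "Write a haiku about Debian."}],
        "max_tokens": 100,
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```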

 

I'm watching some retro television and this show is wild! Beauty contests with 16-year-old girls (though at the time, it was legal for 16-year-olds to pose topless for newspapers), old racist comedians from working men's clubs doing their routines, Boney M, English singers of the era, and happy dance routines!

vid
