this post was submitted on 11 Mar 2025
35 points (100.0% liked)
PC Master Race
Thank you for that! I’ll look at AMD. I thought most ML tools don’t have out-of-the-box compatibility with AMD, though? Is that no longer the case?
The 4060 Ti 16 GB version… that sounds good. About $500?
AMD's compatibility has gotten better over the past year, and the tools that aren't compatible usually have workarounds. But yeah, Nvidia is probably better if LLMs are important to you.
More like ~$800-900 unless you're okay with buying used. The market is pretty darn bad, and it's gotten SO much worse due to the tariff scare. Like I said, you're better off waiting a few months if you can.
I don’t shy away from buying refurb electronics. But is there a problem with buying used GPUs?
Not looking forward to buying new during this tariff era. So perhaps my local marketplaces might be best…
I've bought used for my last 2 GPUs, no issues.
Not really. You just have to confirm you got what you ordered by running some checks and stress tests after it arrives. Plus, you usually don't get a warranty.
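For those checks, a minimal sketch (this assumes an Nvidia card and the `nvidia-smi` tool that ships with the driver; AMD cards have rough equivalents like `rocm-smi`):

```shell
# Verify the card identifies as what you paid for (model name, VRAM
# size, VBIOS) -- relabeled counterfeit cards report wrong specs here.
nvidia-smi --query-gpu=name,memory.total,vbios_version --format=csv

# Watch temperature, clocks, and power draw every 5 seconds in a
# second terminal while a sustained benchmark runs (FurMark, Unigine,
# etc. -- not shown); throttling or crashes point to a bad card.
nvidia-smi --query-gpu=temperature.gpu,clocks.sm,power.draw --format=csv -l 5
```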
https://github.com/ollama/ollama/blob/main/docs/gpu.md
Ollama supports AMD and Nvidia GPUs, including your existing 1080.
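Getting started is just a couple of commands. A sketch, assuming ollama is already installed and the model tag is a current example from the public registry:

```shell
# Start the server; its startup log prints the GPUs it detected,
# which is a quick way to confirm your card is being used.
ollama serve &

# Pull and run a model. An 8 GB card like a 1080 comfortably fits
# ~7-8B models at 4-bit quantization; bigger models spill into
# system RAM and slow down a lot.
ollama run llama3.1:8b "Hello"

# See what's downloaded locally.
ollama list
```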
Wow; thanks for finding that doc. Yeah, I’ve made it work on my system. But I’d like to use some of the bigger models.