this post was submitted on 21 Feb 2025
32 points (90.0% liked)

PC Gaming

[–] ShinkanTrain@lemmy.ml 15 points 1 day ago* (last edited 1 day ago)

A little googling shows the S80 has (had?) really bad drivers that caused it to be outmatched by much weaker GPUs, so the 120% number isn't as crazy as it might first seem: a 120% uplift just means performance slightly more than doubled, which is plausible when the baseline was that driver-crippled.

Hopefully we get more competitive GPUs in the future. On paper the S80 is already a pretty decent card (16 GB of VRAM, too!).

[–] altima_neo@lemmy.zip 6 points 1 day ago (2 children)

Now if only they could compete with Nvidia on performance; it might give Nvidia the kick in the ass it needs to stop being so anti-consumer.

[–] Tattorack@lemmy.world 8 points 1 day ago (1 children)

I think Nvidia is already getting a kick in the ass.

The first GPU I bought was a GTX 1060 with 6 GB. A legendary card that I kept using until November of last year.

What did I upgrade to?

Why, Intel, of course. The A770 is cheaper than an AMD card in the same performance range, and it has a weird quirk where it actually does better at 1440p than comparable cards. Very likely thanks to the spacious VRAM, which is also nice to have for the 3D work I do.

I didn't upgrade past the 1060 earlier because the 20 series wasn't a big enough leap, and the 30 series is where a lot of Nvidia's bullshit started.

And for the industrial market, dollars per unit of performance is all that matters, because in large deployments there's no issue with just parallelizing across as many GPUs as you want. If an Intel GPU costs a tenth of the price but has a fifth of the performance, you just slap five of them together and get the same processing power for half the price.
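
To make that back-of-the-envelope math concrete, here's a minimal sketch; the prices and performance numbers are made-up placeholders, not real benchmarks:

```python
import math

# Back-of-the-envelope cost comparison for equal aggregate throughput.
# All figures below are hypothetical, purely to illustrate the scaling argument.

def cost_for_throughput(price_per_gpu: float,
                        perf_per_gpu: float,
                        target_perf: float) -> float:
    """Total price of enough GPUs (rounded up) to reach target_perf."""
    gpus_needed = math.ceil(target_perf / perf_per_gpu)
    return gpus_needed * price_per_gpu

# Hypothetical flagship: 1.0 units of performance for $1000.
flagship_cost = cost_for_throughput(price_per_gpu=1000, perf_per_gpu=1.0,
                                    target_perf=1.0)

# Hypothetical budget card: a fifth of the performance for a tenth of the price.
budget_cost = cost_for_throughput(price_per_gpu=100, perf_per_gpu=0.2,
                                  target_perf=1.0)

print(f"flagship: ${flagship_cost:.0f}")    # flagship: $1000
print(f"budget stack: ${budget_cost:.0f}")  # budget stack: $500 (5 cards)
```

The caveat, of course, is that this only works for workloads that actually scale across multiple cards, which is common in datacenter compute but not in gaming.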

[–] yonder@sh.itjust.works 5 points 1 day ago

Both Intel and AMD are trying to eat into Nvidia's market share, and are arguably failing at that for now. Even though each of them has cards that beat Nvidia's in specific cases, Nvidia holds on, most likely due in part to CUDA and DLSS being locked to Nvidia hardware.

[–] Tattorack@lemmy.world 1 points 1 day ago (1 children)

Can someone tell me about these cards? I've literally never heard of them before now. Obviously they're not big performers, but what are they like?

[–] empireOfLove2@lemmy.dbzer0.com 6 points 1 day ago* (last edited 1 day ago)

I don't normally condone watching Linus Tech Tips, but they have a good video on the Moore Threads GPUs: https://www.youtube.com/watch?v=YGhfy3om9Ok

Also, GN is a better source: https://www.youtube.com/watch?v=qPMptuEokPQ

They're basically home-grown Chinese silicon with pretty good-looking raw performance specs at the cost of high power consumption, but the most atrocious hacked-together drivers, which made the cards effectively useless. The drivers are definitely getting better as the company learns how to write driver code for most games.

[–] Hubi@feddit.org 1 points 1 day ago

More competition is good news, and from what I've seen, these GPUs offer good value for the price. Nvidia has been too stingy with VRAM, and I hope other companies finally step up.