this post was submitted on 06 Aug 2025
128 points (98.5% liked)

PC Gaming

top 31 comments
[–] ISolox@lemmy.world 1 points 3 hours ago

I was going to get a 9070 XT... when it was at MSRP.

Couldn't find one at launch, and that MSRP is a myth now :(

[–] nuko147@lemmy.world 10 points 2 days ago (2 children)

I'm thinking my next GPU upgrade might be AMD so I can migrate to Linux. But I bought a used card in 2023, so I'm going to wait at least until the next gen comes out.

[–] zod000@lemmy.ml 4 points 2 days ago (2 children)

As others have said, Nvidia cards are OK on Linux now, though AMD is better, assuming you aren't doing anything AI-adjacent with the card.

[–] DegenerateSupreme@lemmy.zip 1 points 1 day ago

I'm so confused by this common sentiment in the community. I've been gaming on Arch / NixOS for the past several months with an NVIDIA card after I switched earlier this year. Basically no issues.

Meanwhile, my buddy converted to Manjaro, and has a Radeon. He's been having awful issues. Several of the games he plays crash constantly, especially if they are multiplayer. He tried switching to openSUSE recently; no real improvements.

I wanted to buy AMD for my eventual next card, but now I'm terrified of doing so, and deeply confused about why everyone says AMD is better for Linux.

[–] BeardedGingerWonder@feddit.uk 2 points 2 days ago

Until you want to update drivers and can't remember how the hell you installed them the last time.

[–] ikidd@lemmy.world 2 points 2 days ago (1 children)

Nvidia's fine. Use a distro that stays fairly current on driver updates, like Fedora or Arch.

[–] nuko147@lemmy.world 2 points 2 days ago (1 children)

I think you lose around 15%+ performance on Linux with an Nvidia card. I tried Nobara and Bazzite and can confirm it, although I didn't do any rigorous benchmarking.

AMD, I think, is on par with Windows, or maybe slightly worse.

[–] zod000@lemmy.ml 3 points 2 days ago

That tracks with what I have seen with my own cards. I suspect some of it is because there were specific driver versions that many distros favored for stability reasons that may have left some performance on the table, but that's just my personal speculation.

[–] SpikesOtherDog@ani.social 3 points 2 days ago (1 children)

My target right now is to play Expedition 33 on something other than N64 quality.

[–] WereCat@lemmy.world 0 points 1 day ago (1 children)

Good luck with that. The game has terrible rendering issues. Enjoy it for its gameplay.

[–] SpikesOtherDog@ani.social 1 points 1 day ago (1 children)

Oh, it is not well optimized. DmC: Devil May Cry is UE3 and runs flawlessly on my system on high at 75+ fps. Warframe runs great. Clair Obscur struggles to maintain 10 fps without hacks.

[–] WereCat@lemmy.world 1 points 1 day ago (1 children)

I'm not talking about performance or optimization. This game has a nice art style that is ruined by how terribly it is rendered. Flickering, shimmering, cascading artifacts, etc... absolutely EVERYWHERE.

[–] SpikesOtherDog@ani.social 1 points 1 day ago (1 children)

Oh. I turn that off when possible.

[–] WereCat@lemmy.world 1 points 13 hours ago* (last edited 13 hours ago) (1 children)

Turn off what? The whole game?

[–] SpikesOtherDog@ani.social 1 points 6 hours ago (1 children)

I was referring to excessive bloom, motion blur, film grain, whatever.

I'm no stranger to constant flickering, flashing, and shimmering. Is it worse than Warframe?

[–] WereCat@lemmy.world 1 points 2 hours ago

I turn those off in every game and this has nothing to do with those. I don't play Warframe.

[–] inclementimmigrant@lemmy.world 16 points 3 days ago (2 children)

Yeah, and none of them are available at MSRP, so they can go piss off.

[–] DacoTaco@lemmy.world 6 points 2 days ago* (last edited 2 days ago)

* In the USA.

Last I checked, Europe was pretty close to MSRP, which is weird because things are usually a lot more expensive for us than for the USA. Just checked: the RX 9070 XT MSRP was around 670 euro (including tax). They're now for sale for 699-717 euro, including what the USA calls sales tax.

I call that as close to MSRP as you're ever going to get on a product.

[–] SpikesOtherDog@ani.social 11 points 3 days ago (1 children)

I just purchased a Sapphire 16 GB 9060 for $380. That's almost 10% markup, but definitely not crazy compared to what we were seeing with scalpers.
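Sanity-checking that figure: the "almost 10%" claim works out if the card's MSRP is around $349 (that MSRP is my assumption, not stated above). A quick sketch:

```python
# Markup over MSRP: (street price - MSRP) / MSRP.
# The $349 MSRP is an assumed figure; adjust for the actual card.
msrp = 349.0
street_price = 380.0

markup_pct = (street_price - msrp) / msrp * 100
print(f"{markup_pct:.1f}% over MSRP")  # about 8.9%
```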

[–] Schmuppes@lemmy.today 7 points 2 days ago* (last edited 2 days ago) (1 children)

Even at MSRP with 0% markup, those cards are too expensive. A mid-range GPU shouldn't cost as much as (or more than) the CPU, mainboard, and RAM of a good but affordable system combined.

[–] SpikesOtherDog@ani.social 8 points 2 days ago (1 children)

I very much agree that the GPU market is inflated. That's why I went practically 10 years between purchases and only bought mid-range. My previous card before that was a 3dfx Voodoo.

[–] Schmuppes@lemmy.today 3 points 2 days ago* (last edited 2 days ago)

My previous card was a Vega 56, which cost me 399 €. I was lucky to get one during launch week at MSRP, despite the mining craze back then. In retrospect it was a fantastic card, and it lasted me more than 7 years with my 1440p/70 Hz monitor.

I now own a 1440p/144 Hz monitor, so my target framerate at the same resolution is a bit higher than it used to be, but I'm happy with anything consistently between 60 and 90 fps. The replacement for my Vega 56 is a 7900 XT (didn't wanna wait for the 9070 XT earlier this year), which is still not a proper 4K card if you're after High or Ultra settings. It does 1440p very well, but it absolutely is not a huge leap in technology compared to Vega. I got it for 600 € open box; the regular price (after two years on the market, mind you) was more like 700-750 € even right before the new gen dropped. I know everything's gotten more expensive between 2017 and 2025, but an 80% price increase for a card with a smaller die and regular GDDR memory is insane. After all, the Vega 56 and the 9070 XT were pretty much comparable in their market position when they were introduced.

I've ditched AAA gaming as a hobby before, for several years, until the announcement of Fallout 4 made me build a new, potent system. If the "been there, done that" feeling returns when I play new releases in the future, I'm not sure I can find the motivation to pick the hobby up again further down the road. Ever-increasing hardware costs and the capitalist enshittification of the games industry might kill it for good as far as I'm concerned.

[–] 46_and_2@lemmy.world 2 points 2 days ago

I'm sure they're selling plenty, but I've never seen supply run short, in the EU at least.

[–] HK65@sopuli.xyz 5 points 3 days ago (3 children)

What does a 9000 series have over a 7000? Why do people even buy new cards at this point? Almost anything can run anything.

[–] vagullion@lemmy.world 19 points 3 days ago

Better raytracing and FSR 4.

[–] Zomg@piefed.world 10 points 3 days ago (1 children)

I think mainly better ray tracing for those that want it.

People buy new cards because their games require something better to run smoothly. Some people still use 1660 Tis, and I'm sure they're happy to upgrade their hardware.

[–] HK65@sopuli.xyz 3 points 3 days ago (2 children)

I mean sure, but the pool of people on very old cards is finite.

I'm asking what makes a 9000 series a more attractive buy than a 7000.

On rereading the headline, it doesn't say much either; demand for GPUs manufactured by me in my garage is also outpacing supply.

[–] Zomg@piefed.world 9 points 3 days ago

Ask the people who held off on the 7000 series, I guess. People don't all buy on the same upgrade frequency. We're all at different phases of our hardware refresh cycles.

[–] Beacon@fedia.io 6 points 3 days ago

In fact, demand for GPUs manufactured by you in your garage has just gone up by another 100%, because I too would be interested in your personally made microchips. They must be very small-batch and artisanal!

[–] zod000@lemmy.ml 2 points 3 days ago (1 children)

The 9000 cards don't offer much of value other than running a bit cooler, IMO. The fact is that the good 7000 cards aren't really available at retail anymore, so it only matters to people (like me) who already have a good 7000-series card. If I didn't have one and was looking for a new card, I'd have no problem picking up a 9000-series card as long as the price wasn't ridiculous.

[–] Baggie@lemmy.zip 3 points 2 days ago

Can confirm, mine runs at like 50 °C at all times, and I'm not certain what black magic it does to manage this.