I went to 4K monitors many years ago and agree there is a big difference. That said, there are a lot of issues with HiDPI monitors on desktop OSes, and 4K requires a strong GPU for gaming. Since then I've gone to 1440p and think it's an ideal middle ground for performance while looking almost just as sharp.
It really depends. I managed to play Cyberpunk 2077 at 4K on my old 1080 Ti at 30-50 fps, and thanks to having a nice monitor and knowing how to tune graphics settings, my roomie, who had an RTX 2080, was complaining that mine looked better. lol
Anything that's not at crazy levels of fidelity can be tuned to work totally fine at 4K on modern graphics cards. Every generation since the 1080 (with more than 8 GB of VRAM, anyway...) has been more than capable of 4K gaming with some settings tweaks.