Ironic the image is of a switch, like Nintendo has been on the cutting edge at all in the last 20+ years
So now we can finally go back to good old code optimization, right? Right? (Padme.jpg)
We'll ask AI to make it performant, and when it breaks, we'll just go back to the old version. No way in hell we are paying someone
Damn. I hate how it hurts to know that's what will happen
It’s not that they’re not improving like they used to, it’s that the die can’t shrink any more.
Price cuts and “slim” models used to be possible due to die shrinks. A console might have released on 100nm, and then a process improvement comes out that means it can be made on 50nm, meaning 2x as many chips on a wafer and half the power usage and heat generation. This allowed smaller and cheaper revisions.
Now that the current ones are already on like 4nm, there’s just nowhere to shrink to.
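To put rough numbers on why a node shrink mattered so much, here's a back-of-the-envelope sketch with made-up die sizes, ignoring yield and edge loss. Area scales with the square of the linear feature size, so halving the feature size quarters the die area:

```python
# Hypothetical numbers: a 100 mm^2 die on an older node, ported to a
# node with half the feature size. Area scales with the square of the
# linear dimension, so "2x the chips per wafer" is actually conservative.
import math

WAFER_DIAMETER_MM = 300  # standard wafer size

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude upper bound: wafer area / die area (ignores edge loss)."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area // die_area_mm2)

old_die = 100.0                 # mm^2 on the old node
new_die = old_die * (0.5 ** 2)  # half the feature size -> 1/4 the area

print(dies_per_wafer(old_die))  # -> 706
print(dies_per_wafer(new_die))  # -> 2827
```

Real shrinks were never a clean 2x linear reduction, but that square-law scaling is why a full node jump enabled such big console cost-downs.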
Not to mention that even when some components do shrink, it's not uniform for all components on the chip, so they can't just do 1:1 layout shrinks like in the past, but pretty much need to start the physical design portion all over with a new layout and timings (which then cascade out into many other required changes).
Porting to a new process node (even at the same foundry company) isn't quite as much work as a new project, but it's close.
Same thing applies to changing to a new foundry company, for all of those wondering why chip designers don't just switch some production from TSMC to Samsung or Intel since TSMC's production is sold out. It's almost as much work as just making a new chip, plus performance and efficiency would be very different depending on where the chip was made.
Which is itself a gimmick: they've just made the gates taller, because electron leakage would happen otherwise.
"nm" has been a marketing gimmick since Intel launched its long-standing 14nm node. Actual transistor density varies wildly depending on which fabs you compare.
It's now the name of a process, not a measure of how small the transistors actually are.
I've not paid for a CPU upgrade since 2020, and before that I was using a 22nm CPU from 2014. The market isn't exciting (to me) anymore; I don't even want to talk about the GPUs.
Back in the late 90s or early 2000s upgrades felt substantial and exciting, now it's all same-same with some minor power efficiency gains.
Now, maybe, but like I said - in the past this WAS what let consoles get big price cuts and size revisions. We're not talking about since 2020, we're talking about things like the PS1 -> PS One and the PS2 -> PS2 Slim.
This is why I'm more than happy with my 5800X3D/7900XTX; I know they'll perform like a dream for years to come. The games I play run beautifully on this hardware under Linux (BeamNG.Drive runs faster than on Windows 10), and I have no interest in upgrading the hardware any time soon.
Hell, the 4790k/750Ti system I built back in 2015 was still a beast in 2021, and if my ex hadn't gotten it in the divorce (I built it specifically for her, so I didn't lose any sleep over it), a 1080Ti upgrade would have made it a solid machine for 2025. But here we are - my PC now was a post-divorce gift for myself. Worth every penny. PC and divorce.
There’s no world in which a 750Ti or even 1080Ti is a “solid machine” for gaming in 2025 lol.
Depends on your expectations. If you play mainly eSports titles at 1080p, it would probably still be quite sufficient.
But I agree it's a stretch as an all-rounder system in 2025. My 3090 is already showing signs of its age; a card two generations older would certainly be struggling today.
For what I do? It would be perfectly fine. Maybe not for AAA games, but for regular shit at ~40fps and 1080p, it would be perfectly fine.
Gotta remember that some of us are reaching 40 years old, with kids, and don't really give a shit about maxing out the 1% lows.
This article doesn't factor in the new demand that is gobbling up all the CPU and GPU production: AI server farms. For example, Nvidia, which once made graphics cards mainly for gamers, has been struggling to keep up with global demand for AI. The whole market is different now; then toss tariffs and the rest on top.
I wouldn't blame Moore's law's death; technology is still advancing, but as usual, based on demand.
technology is still advancing
Actually, not really: performance per watt of the high-end stuff has been stagnating since the Ampere generation. Nvidia hides it by changing models in its benchmarks or advertising raw performance without power figures.
Idk, seems like Germany is making progress.
AI has nothing to do with it. Die shrinks were the reason for “slim” consoles and big price drops in the past. Die shrinks are basically a thing of the past now.
Consoles are just increasingly bad value for consumers compared to PCs.
Are they tho? Have you seen graphics card prices?
My 4070 cost $300 and runs everything.
The whole PC cost around $1000, and I have had it since the Xbox One released.
You can get similar performance from a $400 steam deck which is a computer.
You don't need a top-end card to match console specs; something like a 6650 XT or 6700 XT is probably enough. Your initial PC build will cost about 2x a console if you're matching specs (maybe 3x if you need a monitor, keyboard, etc.), but you'll make it up with access to cheaper games and being able to upgrade the PC without replacing it, not to mention the added utility a PC provides.
So yeah, think of PC vs console as an investment into a platform.
If you only want to play 1-2 games, console may be a better option. But if you're interested in older or indie games, a PC is essential.
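As a toy illustration of the "investment into a platform" argument, here's the arithmetic with made-up but plausible prices (not real market data):

```python
# Toy total-cost-of-ownership comparison between a console and a PC.
# All numbers are hypothetical; the point is that the PC's higher
# upfront cost can be offset by cheaper games and no online sub.

def total_cost(upfront: float, games_per_year: int,
               avg_game_price: float, years: int,
               online_sub_per_year: float = 0.0) -> float:
    return (upfront
            + years * games_per_year * avg_game_price
            + years * online_sub_per_year)

console = total_cost(upfront=500, games_per_year=5, avg_game_price=60,
                     years=5, online_sub_per_year=80)
pc = total_cost(upfront=1000, games_per_year=5, avg_game_price=30,
                years=5)

print(console)  # -> 2400.0
print(pc)       # -> 1750.0
```

Tweak the numbers however you like; the crossover point depends entirely on how many games you buy and where you buy them.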
2060 super for 300, and then another 200 for a decent processor puts you ahead of a ps5 and for a comparable price. Games are cheaper on PC too, as well as a broader selection. https://pcpartpicker.com/list/zYGmJn here is a mid tier build for 850; you could cut the processor down, install Linux for free, and I'm sure you've got a computer monitor lying around somewhere... the only thing stopping you is inertia.
2060 super for 300, and then another 200 for a decent processor puts you ahead of a ps5 and for a comparable price.
You're going to have to really scrounge for deals in order to get a PSU, storage, memory, motherboard, and a case with your remaining budget of $0.
https://pcpartpicker.com/list/zYGmJn here is a mid tier build for 850
This is $150 more expensive, and the GPU is half as performant as the reported PS5 Pro equivalent.
Ok so, for starters, your 'reported equivalent' source is wrong.
The custom AMD Zen 2 APU (combined CPU + GPU, as is done in laptops) of a PS5 Pro is 16.7 TFLOPs, not 33.
So your PS5 Pro is actually roughly equivalent to that posted build... by your 'methodology', and it's utterly unclear to me what your actual methodology for doing a performance comparison even is.
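For what it's worth, that 16.7 TFLOPs figure can be sanity-checked from the commonly reported GPU specs (assumed here: 60 CUs x 64 shaders per CU, ~2.17 GHz boost clock), using the standard peak-FP32 formula of 2 ops (a fused multiply-add) per shader per clock:

```python
# Peak FP32 throughput = 2 ops (FMA) x shader count x clock rate.
# Specs below are the commonly reported PS5 Pro GPU figures, assumed
# here for illustration: 60 CUs x 64 shaders, ~2.17 GHz boost.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz * 1e9 / 1e12

ps5_pro = peak_tflops(60 * 64, 2.17)
print(round(ps5_pro, 1))  # -> 16.7
```

Bear in mind that peak TFLOPs is itself a crude metric for comparing different GPU architectures, which is part of the methodology problem being pointed out above.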
The PS5 Pro uses 2 GB of DDR5 RAM, and 16 GB of GDDR6 RAM.
This is... wildly outside the realm of being directly comparable to a normal desktop PC, which at bare minimum these days has 16 GB of DDR4/5 system RAM. The GDDR6 would be part of the detachable GPU board itself, anywhere between 8 GB and all the way up to 32 GB if you get an Nvidia 5090, but the consensus seems to be that 16 GB of GDDR6/7 is the minimum you want, unless you want to be very reliant on AI upscaling/framegen and the input lag and whatnot that comes with using that on an underpowered GPU.
Short version: the PS5 Pro would be a wildly lopsided, nonsensical architecture to try to replicate one-to-one in a desktop PC. 2 GB of system RAM will run lightweight Linux OSes, but there's not a chance in hell you could run Windows 10 or 11 on that.
Fuck, even getting 7 to work with 2 GB of RAM would be quite a challenge... if not impossible. I think 7 required 4 GB of RAM minimum?
The closest AMD chip to the PS5 Pro that I see, in terms of TFLOP output... is the Radeon 7600 Mobile.
(This is probably why Cyberpunk 2077 did not (and will never) get a 'performance patch' for the PS5 Pro: CP77 can only pull both high (by console standards) framerates at high resolutions and raytracing/path tracing on Nvidia mobile-class hardware, which the PS5 Pro doesn't use.)
But let's use the PS5 Pro's ability to run CP77 at 2K 60fps, on what PC players recognize as a mix of medium and high settings, as our benchmark for a comparable standard PC build. Let's be nice and just say it's the high preset.
(a bunch of web searching and performance comparisons later...)
Well... actually, the problem is that basically, nobody makes or sells desktop GPUs that are so underpowered anymore, you'd have to go to the used market or find some old unpurchased stock someone has had lying around for years.
The RX 6600 in the partpicker list is fairly close in terms of GPU performance.
Maybe pair it with an AMD 5600X processor, if you... can find one? Or a 4800S, which supposedly really were just rejects/run-off from the PS5 and Xbox Series X and S chips, rofl?
Yeah, legitimately, the problem with trying to build a PC in 2025 to the performance specs of a PS5 Pro is that the bare-minimum models of current- and last-gen standard PC architecture just aren't made that weak anymore.
EDIT:
Oh, final addendum: if your TV has an HDMI port, kablamo, that's your monitor; you don't strictly need a new one.
And there are also many ways to get a wireless or wired console style controller to work in a couch pc setup.
Is it Moores law failing or have we finally reached the point where capitalists are not even pretending to advance technology in order to charge higher prices? Like are we actually not able to make things faster and cheaper anymore or is the market controlled by a monopoly that sees no benefit in significantly improving their products? My opinion has been leaning more and more towards the latter since the pandemic.
This has little to do with "capitalists" and everything to do with the fact that we've basically reached the limit of silicon.
Moore's law started faltering around 2000, when single-core speeds peaked, leading to the multi-core processors we've had since. Memory and storage still had a ways to go. Now, the current 5nm-class processes are very close to the limits imposed by the laws of physics, both in how finely light can be focused and how small a controlled chemical reaction can be done. Unless someone can figure out a way to do the whole chip fabrication process in fewer steps, or with higher yield, or with cheaper machines or materials, even at 50nm or larger, don't expect prices to drop.
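To illustrate how little runway is left, here's a sketch that treats the marketing "5nm" as a literal feature size (which, as noted elsewhere in the thread, it isn't) and keeps halving it:

```python
# Sketch: keep halving a nominal "5 nm" feature and count how many
# halvings fit before the next one would dip below silicon's lattice
# constant (~0.543 nm), i.e. the scale of the atoms themselves.

SILICON_LATTICE_NM = 0.543  # spacing of silicon's crystal lattice

feature_nm = 5.0
halvings = 0
while feature_nm / 2 > SILICON_LATTICE_NM:
    feature_nm /= 2
    halvings += 1

print(halvings)    # -> 3: only three halvings remain
print(feature_nm)  # -> 0.625, barely one lattice constant wide
```

Even granting the marketing numbers, only a handful of shrink generations are arithmetically possible before you're counting individual atoms.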
Granted, if TSMC stopped operating in Taiwan, we'd be looking at roughly 70% of all production going poof, so that can be considered a monopoly. (It's also Taiwan's main defense against China, the "Silicon Shield", so there's more than just capitalist greed at play.)
https://www.youtube.com/watch?v=po-nlRUQkbI - How are Microchips Made? 🖥️🛠️ CPU Manufacturing Process Steps | Branch Education
Very interesting! I was aware of the 5nm advancements and of feature sizes approaching the physical limits of the material, but I had been assuming that, since we worked around the single-core issue, a similar innovation would appear for this bottleneck. It seems the focus instead turned toward integrating AI into GPU architectures and cranking up power consumption for marginal performance gains rather than working toward a paradigm shift. Thanks for the in-depth explanation though; I always appreciate an opportunity to learn more about this kind of stuff!
No, it turns out that lying to the consumer about old tech is profitable.
Hatebait. Adds nothing informative to the thread.