[–] ICastFist@programming.dev 2 points 5 hours ago

> What those leaps do result in, however, is major performance gains.

Which many devs will make sure you never feel, by "optimizing" the game only for the most bleeding-edge hardware.

> Then there’s efficiency. What if you could run Monster Hunter Wilds at max graphics, on battery, for hours? The first gen M1 Max MacBook Pro can comfortably run Baldur’s Gate III. Reducing power draw would have immense benefits on top of graphical improvements.

See, if games were made with a performance-first mindset, that'd be possible already. Not to dunk on performance gains, but there's a saying (Wirth's law) that every time hardware gets faster, programmers make their code slower. I mean, you can totally play emulated SNES games with minimal impact compared to leaving the computer idling.
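
For what it's worth, you can eyeball that "minimal impact" claim yourself. This is just a rough sketch, assuming you have Python with psutil installed and an emulator (snes9x, RetroArch, whatever) already running; take one reading while idle and one while playing, then compare:

```python
# Rough sanity check: sample overall CPU utilization for ~30 seconds and
# print the average. Run once while the machine is idle and once with an
# SNES emulator running, then compare the two numbers.
import psutil

SAMPLES = 30  # one-second samples

readings = [psutil.cpu_percent(interval=1) for _ in range(SAMPLES)]
print(f"average CPU utilization: {sum(readings) / len(readings):.1f}%")
```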

> Saying “diminishing returns” is like saying that fire burns you when you touch it.

Unless chip fabrication figures out a way to "stack" transistors on top of one another, effectively making 3D chips, they'll continue to be "flat" sheets that can only increase core count horizontally. Single-core frequency peaked in the mid-2000s; since then it's been about adding more cores. Even the gains from an RTX 5090 vs an RTX 4090 aren't that big. Now compare that with the gains from a GTX 980 vs a GTX 1080.