"No duh" -Most humans, since ever
8k is a little high. I feel like 4k is a significant change from 1080p, especially if you use your screen as a computer monitor.
Yes, as a monitor, if it's over 30" or so, 4K makes sense. If it's a TV, 4K doesn't make much difference given how far most people sit from their TV. Maybe if it's a massive set, something like 80 inches...
Me getting 480p videos for my video projector: "Oh... no, really?" ¯\_(ツ)_/¯
PS: FWIW I do have a Vision Pro (for work; I didn't pay for it personally), so I technically could enjoy high-res content... but honestly I can't be bothered to use it to watch videos. I'm fine with just my desktop screen or video projector. I just don't get the appeal of high res.
Anecdotally, at average viewing distances on my 55" TV I can't really tell a difference. If I had an enormous TV, maybe I could. The 1080 → 2160 jump is definitely not the leap that 720 → 1080 or 480 → 720 was in the average environment.
If you're sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish. The scientists made it crystal clear: once your setup hits that threshold, any further increase in pixel count, like moving from 4K to an 8K model at the same size and viewing distance, runs into diminishing returns because your eye simply can't detect the added detail.
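Out of curiosity I ran the numbers. A minimal pixels-per-degree (PPD) sketch, assuming a flat 16:9 panel viewed on-axis; `pixels_per_degree` is a throwaway helper I made up, not anything from the study:

```python
import math

def pixels_per_degree(diag_inches, horiz_px, distance_m, aspect=(16, 9)):
    """Horizontal pixels per degree of visual angle for a flat panel
    viewed on-axis. Assumed geometry, not taken from the study."""
    a, b = aspect
    width_m = diag_inches * 0.0254 * a / math.hypot(a, b)  # panel width in metres
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return horiz_px / fov_deg

# The article's scenario: a 44-inch set viewed from 2.5 m
for name, px in [("1080p", 1920), ("QHD", 2560), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: ~{pixels_per_degree(44, px, 2.5):.0f} PPD")
```

That prints roughly 87, 116, 174, and 348 PPD. The classic 20/20-vision rule of thumb is about 60 PPD, and the study reportedly measured limits somewhat above that, so at this size and distance even QHD already clears the ceiling, which is consistent with the article's claim.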
I commend them on their study of the human eye's pixels-per-degree resolution limit, but there are some caveats to the article's title and their findings.
First of all, nobody recommends a 44-inch TV for 2.5 metres. I watch from that distance, and I believe the minimum recommended 4K TV size for it is 55 inches.
Second, I'm not sure many QHD TVs are even offered; the market mostly sells 4K or 1080p sets, so QHD would be a small percentage.
And QHD is already a pretty noticeable quality jump over 1080p, as I've seen on my gaming rig. So basically, if you make the jump from 1080p to 4K, watch actual 4K content, and sit at the right distance, most people are absolutely gonna notice the difference (quick numbers below).
For 8K, I don't know; you probably do hit diminishing returns there unless you have a wall-sized TV or watch from very close.
But yeah, clickbaity article title, mostly.
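For the record, plugging a 55-inch set at 2.5 m into the same PPD sketch above gives roughly 70 PPD for 1080p and 140 PPD for 4K. If the eye's limit sits anywhere between the classic 60 PPD and the higher figures this study reports, 1080p lands near or below it while 4K is comfortably above, which fits the claim that the 1080p-to-4K jump is visible at that size and distance.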
Really depends on the size of the screen, the viewing distance, and your age/eye condition. For most people 720 or 1080 is just fine. With 4K you will get somewhat better detail in fabrics and environments, but not a huge difference.
8k is gonna be a huge waste and will fail.
Given how much time I spend actually looking at the screen while the show/movie is on, it might as well be in circa-2000 RealVideo 160x120 resolution.
It depends on how far away you sit. But streaming has taken over everything, and even a little compression ruins the perceived image quality of a higher-resolution display.
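To put a rough number on the compression point, here's a quick bits-per-pixel comparison with ballpark bitrates I'm assuming for illustration (not any service's published spec):

```python
# Bits spent per pixel per frame for a few typical-ish sources.
# Bitrates are assumed ballpark figures, not official specs.
SOURCES = [
    ("1080p Blu-ray", 1920, 1080, 24, 30e6),
    ("1080p stream",  1920, 1080, 24,  5e6),
    ("4K stream",     3840, 2160, 24, 15e6),
]

for name, w, h, fps, bitrate in SOURCES:
    bpp = bitrate / (w * h * fps)  # bits per pixel per frame
    print(f"{name}: {bpp:.2f} bits/pixel")
```

That comes out to about 0.60, 0.10, and 0.08 bits per pixel. Newer codecs narrow the gap since they squeeze more quality out of each bit, but a streamed 4K frame still gets far fewer bits per pixel than disc 1080p, so a lot of the extra resolution just goes to encoding artifacts.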