Some people still say that 1080p is plenty, whether for reading text or watching videos (including gaming), but anyone who has used a 4K monitor knows that text looks far clearer and games look far more realistic and detailed at 4K. And the same debate will probably repeat once 4K becomes truly mainstream on desktops.
> but anyone who has used a 4K monitor knows that text looks far clearer and games look far more realistic and detailed at 4K
It depends on the distance really.
Text on a desktop monitor at arm's length or less, especially at 24" and up, will be noticeably better.
I have a 34" ultrawide 1440p and I would definitely love higher pixel density.
But at 24" or less, 1440p vs 4K is borderline marketing.
People swear they can see the difference, yet I remember a randomized test of 120+ gamers who were shown the same TV at different output resolutions: the distribution of guesses showed only a very slight advantage for 4K, well within the margin of error, and it dropped to nothing with just a few centimeters more viewing distance.
It also heavily depends on the content being displayed. You might not be able to tell any difference in a typical game frame or a movie, but a 1px black on white line still activates your cones while being way below the minimum angular resolution you can see. You can see stars for the same reason.
I can easily see a difference in text rendering between a macOS Retina display (3K at 14") and a 27" 4K OLED. 1080p is horribly obvious on basically every display size I own.
And my smartphone and my DSLR take pictures at more than 4K; you can see the difference in image sharpness immediately as well.
What content is left? Games, but they also have text. Higher resolution, less antialiasing needed.
The most frustrating thing about this type of discussion: it's 2025! How long have 4K displays existed in the consumer space? About 10 years by now?
8K is also nice. It might be something I wouldn't pay extra for, but supporting 8K opens up so many possibilities, like driving four 4K displays without issues (demo walls, event setups, etc.) or just two or three 24/27" displays on your desk.
People forget this. Our eyes' resolution is infinite; it's our focal length that differs.
Resolution allows you to pick up different image features, it's not infinite. A zebra pattern will eventually merge into a solid gray color upon shrinking, that's your actual resolution limit. But a single dot or a line surrounded by enough background will still be visible way below this limit because it has enough energy difference to activate a photoreceptor, even though you won't be able to tell its true size or thickness.
There is also overlap between these ranges. At some distance you can choose whether you want to perceive a zebra pattern as gray or as a pattern.
You seem to not understand those terms at all.
Our resolution varies across our FOV, but is best measured in the center of vision. (Ironically, we can potentially see smaller objects away from the center, but... biology is analog, not digital, and weird.)
Focal length is merely the conversion factor from physical resolution at the receiver side to angular resolution in the real world. It varies, but not by much, eye to eye, with vision aids (glasses, etc.), and (very trivially) with distance to the object. So, it's basically a constant.
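As a rough illustration of that conversion (my own back-of-the-envelope, using approximate textbook values for foveal cone spacing and the eye's effective focal length, not figures from the study):

```python
import math

# Receptor spacing -> angular resolution, via the eye's focal length.
# Both values below are rough textbook figures, not taken from the paper.
cone_spacing_m = 2.5e-6   # approximate foveal cone spacing (~2-3 um)
focal_length_m = 17e-3    # approximate effective focal length of the eye

angle_rad = cone_spacing_m / focal_length_m   # small-angle approximation
angle_arcmin = math.degrees(angle_rad) * 60

print(f"one cone subtends ~{angle_arcmin:.2f} arcmin")
# ~0.5 arcmin per cone; with Nyquist sampling (two cones per cycle) the finest
# resolvable grating lands near 1 arcmin per line pair, the classic 20/20 figure.
```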
I’m pretty sure you just confirmed that focal length varies. So I stand by my statement. Our rods and cones enable us to see great detail and yes, it’s more concentrated in the center of the field of view, but if we’re talking pixels… 567 MP is what people claim. I know it’s a lot higher. Pilots are able to spot specks at 4 nautical miles. Astronauts can see objects that are just a speck at 50 nautical miles. Our actual eye resolution is insane when paired with the right focal length.
I wrote a longer post, but I'll keep it short: you can't compare 100 small res cameras + an inference engine that combines the samples together (our brain with low res foveal samples) on the same metrics you'd judge a single camera with 100x more res. They simply see, resolve and process things differently.
To stick with your example: the pilots who "see" aircraft at 4 nautical miles don't "see" them because their foveas have superhuman pixel density, but because high-contrast, very low-res foveal samples + contextual inference + motion + time give the brain more time to infer the presence of the aircraft. But if you could stop the motion and image that very same aircraft in a single frozen glance, it would "disappear".
I don't think it's actually infinite, but it's probably very high! We do have little receptors (cones) in our eyes (retina) that react chemically to light. So the density is the resolution (and different cone types react to different wavelengths, which is why some people are colour blind, for example)!
Well, technically speaking:
- we are all colorblind, as we only see a very small part of the light spectrum
- no two individuals in the world perceive colors in the same way, as receptor distribution varies from person to person (actually, even day to day for a single individual)
Maybe for gaming, but at 24 inches on a desktop monitor I see a sizeable difference in text rendering.
Looks like the good anti-aliasing that makes text look better on lower-DPI displays is slowly getting the bitrot treatment...
I personally struggle to tell 1440p from 4K as soon as you hit the 27" mark, but I'm generally 70-80 centimeters (2.5 feet in American units) away from my screen.
This is hard to gauge though because it's rarely only a change from 1080p to 4k. The screen tech often changes from LCD to LED (or microLED, or OLED), there's all manner of motion smoothing added, sometimes higher refresh rates, and so on.
Anecdotally, I was at a family Christmas where relatives were watching some home videos encoded at 480p, played from a flash drive on a new TV, and they all said the quality of the 4K picture was amazing, despite the fact they were all watching 480p video (without upscaling, because I'd turned it off). To be fair it did look better than an old TV, but not because of the resolution.
I have a 34" 4K TV and from the couchthere is a difference between 1080p and 4K on it. 4K is crisper whether on Netflix or YouTube. Only potential variable, I think is the codec and compression used on each resolution.
There is less blending at the edges of high contrast, so some edges wil likely seem sharper.
I find it fascinating that the same is true for frame rate. Some people think 60Hz is OK, while anyone who has tried a 120Hz screen will agree it is infinitely smoother. The same is true again for a 240Hz screen. I have yet to try a 480Hz screen but imagine the jump will be equally impressive.
I was going to say it likely would not be as impressive because the frame times would decrease by a lesser fraction, but then I found this article, interesting: https://blurbusters.com/massive-upgrade-with-120-vs-480-hz-o...
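For context, the "lesser fraction" point is just frame-time arithmetic; a quick sketch (my own numbers, nothing taken from the linked article):

```python
# Frame time (ms) and the absolute time saved by each refresh-rate doubling.
rates_hz = [60, 120, 240, 480]
frame_ms = {hz: 1000 / hz for hz in rates_hz}

for prev, cur in zip(rates_hz, rates_hz[1:]):
    saved = frame_ms[prev] - frame_ms[cur]
    print(f"{prev:>3} Hz -> {cur:>3} Hz: "
          f"{frame_ms[prev]:.2f} ms -> {frame_ms[cur]:.2f} ms "
          f"(saves {saved:.2f} ms per frame)")
# 60->120 Hz saves ~8.3 ms per frame, 120->240 ~4.2 ms, 240->480 ~2.1 ms:
# each doubling halves the absolute gain, which is the intuition above;
# the linked article argues the jump is nonetheless very noticeable.
```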
Actually, just going from 1080 to QHD (1200?? I don't remember) has made my work so much more pleasant.
Yeah, I think diminishing returns kick in at some point.
Going from 1080p to 1440p feels like a huge improvement. Going from 1440p to 4k (aka 2160p) is a little bit sharper. I don't think the jump from 4k to 8k will improve things that much.
I can tell the difference between 1080p (or upscaled 1080p) and 4k on a 50" screen at "living room" distances, but it's nowhere near as obvious as SD to DVD was.
At "laptop" screen distances the difference between my Retina display and non-retina external monitors is quite noticeable; so much so that I run 4k in 1080p mode more and more.
8k is going to require those curved monitors because you'll have to be that close to it to get the advantage.
> I can tell the difference between 1080p (or upscaled 1080p) and 4k
Are you talking about the resolution of the video or of the screen itself? Lower-resolution video also looks worse because of the compression. I've seen a bigger difference from video compression than from screen resolution, e.g. well-encoded 1080p video looked better than badly compressed 1080p video on any screen at any resolution.
I have a 43" 4k monitor at ~1m distance and when I use the computer set up with 100% scaling it looks bad, I see pixelation... and "subpixelation". On a different computer connected to the same screen but something like 150% scaling it's like night and day difference. Everything looks smooth, perfect antialiasing.
This is the money picture [0]. Above a certain distance any improvement is imperceptible. But don't compare compressed video on a screen, it will add quality issues that influence your perception.
[0] https://media.springernature.com/full/springer-static/image/...
It’s most noticeable with animation that has text-like lines, in my experience.
For plain video, 1080p at high enough bitrate is fine.
DVDs are SD, with 480 lines of vertical resolution for NTSC.
Yeah I meant VHS → DVD (which was noticeable), and also SD → HD (1080p) which was again noticeable.
Probably not for a 32” monitor, but I think 8K would be noticeably better on a 43”.
Vision Pro — for me — differentiates itself from all other screens in its ability to render closeups of the human face. The 6k detail is staggering.
This is helpful for estimating how much higher display resolutions are still perceivable and when we enter the territory of "marketing bullshit".
For example:
- 40 cm view distance (e.g. smartphone): 300 ppi is roughly the maximum that's useful
- 100 cm (e.g. desktop monitor): about 200 ppi
https://www.nature.com/articles/s41467-025-64679-2/figures/2
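For what it's worth, here is the conversion I would use to sanity-check such figures (my own arithmetic; the pixels-per-degree values below are assumptions, the classic 1-arcminute figure and a higher modern estimate, so the outputs only roughly bracket the numbers above rather than reproduce them):

```python
import math

def max_useful_ppi(viewing_distance_cm: float, pixels_per_degree: float) -> float:
    """Pixel density beyond which individual pixels are finer than the given
    angular resolution at the given viewing distance."""
    distance_in = viewing_distance_cm / 2.54
    inches_per_degree = distance_in * math.tan(math.radians(1))
    return pixels_per_degree / inches_per_degree

# Classic ~60 px/deg (1 arcmin) vs. a higher ~90 px/deg estimate (assumed).
for ppd in (60, 90):
    for d_cm in (40, 100):
        print(f"{d_cm:>3} cm, {ppd} px/deg: ~{max_useful_ppi(d_cm, ppd):.0f} ppi")
```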
I remember those numbers being a third of that 20 years ago. Either we have evolved brand new eyes without noticing, or you are just talking about the current state of the art as if it were the limit of human vision.
Put another way: look at 300 ppi prints and 1200 ppi prints. The difference is night and day at 30 cm viewing distance.
ppi pixels ≠ dpi dots
You don't need 1200ppi for a nice 1200dpi print; even 300ppi may be enough.
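A rough illustration of the distinction, using standard halftoning arithmetic (my own sketch, not anything specified in the comment above):

```python
# A 1200 dpi printer lays down binary dots; continuous tones come from grouping
# dots into halftone cells. At 300 ppi image resolution, each image pixel can
# use a 4x4 cell of printer dots.
printer_dpi = 1200
image_ppi = 300

cell = printer_dpi // image_ppi       # dots per halftone-cell edge
tone_levels = cell * cell + 1         # distinct tone levels per cell

print(f"{cell}x{cell} dot cell -> {tone_levels} tone levels per image pixel")
# So the extra printer resolution mostly buys tonal depth for photos, while
# solid black text can still exploit the full 1200 dpi dot grid for sharp edges.
```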
For printing photos, you are right that a 300 ppi printer is better than a 1200 dpi printer.
On the other hand, for printing text, a 1200 dpi printer has the quality of a 1200 ppi printer.
Many well-designed traditional typefaces rely on optical effects produced by details that require, to be printed, a resolution higher than the one at which the human eye can distinguish a set of bars from a uniform background (which is the subject of TFA). For instance, in some typefaces the edges of the strokes are concave or convex rather than straight, which could be rendered on a computer display only by a much higher resolution or by more sophisticated pixel preprocessing (to simulate the effect on the eye). Whenever such typefaces are displayed at a lower resolution, i.e. on computer monitors, they are very noticeably uglier than when printed on paper by traditional metal type or even by a high-resolution laser printer.
Higher ppi on mobile is still useful if it enables “manual zoom” (i.e. move your head closer). I do this with Google Sheets on mobile all the time, as I like to have a lot of the sheet displayed at once to see the overall structure, and then peer closer to read the text.
You mean "the minimum that is useful", because you want a resolution at least equal to that in order not to see the pixel structure.
A 27" monitor has a height around 17", i.e. about 43 cm, and for watching a movie or anything else where you look at the screen as a whole the recommended viewing distance is twice the screen height, i.e. about 86 cm.
At this distance, the resolution needed to match the human vision is provided by a height of slightly less than 3000 pixels by this study, but by about 3300 pixels by older studies. In these conditions you are right, the minimum acceptable resolution is around 200 ppi.
This means that a 27 inch 5k monitor, with a resolution of 2880 by 5120 pixels, when viewed from a distance twice its height, i.e. about 86 cm (34 inch), provides a resolution close, but slightly less than that of typical human vision. (That viewing distance that is double the height corresponds to the viewing angle of camera lenses with normal focal length, which has been based on studies about the maximum viewing angles where humans are able to perceive a correct perspective when looking at an image as a whole.)
However, when not watching movies, but working with text documents, you normally stay closer to the monitor than that, so even a 5k monitor is not good enough (but an 8k monitor may be enough, so that might be the final monitor resolution, beyond which an increase is useless).
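For anyone who wants to redo that arithmetic, here is a small sketch of the geometry (my own code; the pixels-per-degree values are back-solved assumptions chosen to land near the ~3000 and ~3300 pixel figures above, not numbers taken from the studies):

```python
import math

def vertical_pixels(distance_to_height_ratio: float, pixels_per_degree: float) -> float:
    """Vertical pixel count needed to match an assumed angular resolution
    when sitting at the given multiple of the screen height."""
    fov_deg = 2 * math.degrees(math.atan(0.5 / distance_to_height_ratio))
    return fov_deg * pixels_per_degree

for ppd in (105, 118):  # assumed acuity values bracketing the studies cited above
    px = vertical_pixels(distance_to_height_ratio=2.0, pixels_per_degree=ppd)
    print(f"{ppd} px/deg -> ~{px:.0f} vertical pixels (vs. 2880 on a 5K panel)")
```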
I use my phone way closer than 40cm all the time...
You may have untreated myopia, or need to use a bigger font size (HN is guilty of this!)
No I just sometimes use my phone while lying down lol
About 15 cm view distance for smartphones is pretty normal for me (shortsighted with glasses) on websites where the text is very small, e.g. Hacker News.
I’ve started using text scaling on HN (Cmd++ on Mac, available on mobile, too) and it’s much easier to read and pleasant to the eye this way.
Oh, that's a great idea. I see Firefox also has this in the settings under Accessibility. I guess I never looked there because I don't think of myself as disabled. (Maybe I should.)
More and more - especially on Apple products - Accessibility is the "customization" menu, not the "disabled support" it used to be.
Size increases, animation decreases (when your iPhone is getting old, turn off animation and behold it operating super fast!), etc can all be found there.
I zoom websites until they "feel right" which is usually something close to "they are as wide as the window I have them in" - HN is a few taps up from "actual size".
given enough time we all become disabled
I would argue that color accuracy is undervalued.
Having a high-resolution monitor is great; it's even become semi-affordable. But color accuracy can be a pain point.
This doesn't matter if you're just coding, unless you really care how realistic that code looks... but to anyone doing video or photo manipulation or front-end design it can be a deal breaker.
A while back I bought a calibration device and calibrated my displays.
I always buy good displays (IPS, high resolution, good test results) and the calibration was very good out of the box. I sold the calibration device afterwards.
Unless you have to provide pinpoint accuracy from the display to an industry-level printer with a color profile, you won't need one, as long as you don't buy cheap.
There's this classic by VSauce about the same topic: https://www.youtube.com/watch?v=4I5Q3UXkGd0
I hate the fact that my very first reaction is: "That just got bulldozed."
That gives 3 pixels per resolution element (assuming diffraction limit for 1cm diameter pupil). That sounds about right (see Nyquist frequency).
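For anyone who wants to redo that estimate, here is the Rayleigh-criterion arithmetic (my own sketch; the 1 cm pupil is the parent's assumption, and how it maps onto "pixels per resolution element" depends on which pixels-per-degree figure you take from the study):

```python
import math

# Rayleigh criterion: smallest resolvable angle for a circular aperture.
wavelength_m = 550e-9      # green light, near peak visual sensitivity (assumed)
pupil_diameter_m = 0.01    # the 1 cm figure from the parent comment

theta_rad = 1.22 * wavelength_m / pupil_diameter_m
theta_arcmin = math.degrees(theta_rad) * 60
elements_per_degree = 1 / math.degrees(theta_rad)

print(f"diffraction limit: ~{theta_arcmin:.2f} arcmin per resolution element")
print(f"-> ~{elements_per_degree:.0f} resolution elements per degree")
# Divide the display's pixels per degree by this figure (or vice versa) to get
# the pixels-per-resolution-element ratio the parent is estimating.
```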
One would expect the results to be highly correlated with corrected vision, which is all over the place... but they get suspiciously tightly grouped results.
Did they maybe not measure how many pixels we can see, but rather how laughably bad COTS IPS panels are at contrast as the examined pattern approaches their resolution? I wonder what happens if you repeat that with a reasonably bright 16K OLED.