Are you commenting on the fact that the chart changes as the TV size changes? If so, I think I understand what you're getting hung up on.

Strazdas said:
Sigh. I think you are misunderstanding me. I'll try to use Paint; maybe that will be clearer.

Lightknight said:
No, it doesn't. The areas outside of your focal point are blurry in real life too. There's a blur outside of the focal point because we lose focus the further away we get from it. That's literally why it's called the focal point: it's where you're focusing your sight, and everything else is out of focus (aka blurry).
The focal point is the point that resolves images the best. So... who gives a shit about everything outside of the focal point? If you succeed at catering to the focal point, everything else is as accommodated as it's going to get.
Again, here is how it goes:
1. If you can't resolve pixels with your focal point, you can't resolve pixels with non-focal vision.
2. If you can't resolve pixels with your non-focal vision, you may STILL be able to resolve pixels with your focal point.
So again, why do you think non-focal vision is relevant to this discussion? Focal vision is the only thing that matters when discussing the resolution of a screen. In what way do you think people would do anything differently to account for the parts of your vision that you're not focusing on?
[image]
Do you see now why this graph is unrealistic? Do you see why I mentioned peripheral vision, which is where the majority of the screen remains?
The chart isn't claiming that you actually have the entire screen in focus at once. As the size of the TV changes, the pixel density changes. So a 1080p 100" TV wouldn't look as crisp as a 1080p 55" TV if you sat the same distance away from both. This is because the larger the TV, the more loosely packed those 1920 by 1080 pixels are, and the easier they are for the human eye to resolve.
http://teknosrc.com/resolution-vs-pixel-density-in-displays-all-you-need-to-know/
Pixel Density (PPI) = sqrt((horizontal pixels)^2 + (vertical pixels)^2) / diagonal screen size (inches)
Basically, as the screen gets larger while the resolution stays the same, the pixel density decreases. That's why the chart is a sliding scale as the TV gets larger rather than the straight line you may have expected. Smaller TVs with higher resolutions have SOOOO many pixels crammed in that you have to get a LOT closer to be able to tell them apart.
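To put rough numbers on that, here's a quick Python sketch of the PPI formula above (the panel sizes and the 4K comparison are just example values I picked, not anything taken from the chart):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Same 1080p resolution spread over increasingly large panels.
for size in (32, 55, 100):
    print(f'1080p at {size}": {ppi(1920, 1080, size):.1f} PPI')

# A smaller 4K panel for comparison: far more pixels packed into each inch.
print(f'4K at 40":    {ppi(3840, 2160, 40):.1f} PPI')
```

For 1080p that works out to roughly 69 PPI at 32", 40 PPI at 55", and 22 PPI at 100", which is exactly the falloff the chart is tracking.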
PPI (pixels per inch, i.e. pixel density) changes directly with resolution and TV size because of what resolution is: the number of pixels across by the number of pixels high.
A 1920 by 1080 resolution is 1920 by 1080 pixels regardless of how many inches the TV is. So imagine the difference in PPI between a 55" TV with that many pixels across and high and a 32" TV with that same number of pixels: the 32" TV has them packed tighter together, and each pixel is physically smaller.
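If you want to see how that pixel pitch turns into a viewing distance, here's a rough sketch. It assumes the commonly cited ~1 arcminute of visual acuity for 20/20 vision, which is the kind of assumption charts like this are usually built on; the real threshold varies from person to person, so treat the distances as ballpark figures.

```python
import math

ACUITY_ARCMIN = 1.0  # assumed ~1 arcminute acuity (20/20 vision); a ballpark figure

def pixel_pitch_in(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Physical size of one pixel in inches (diagonal size / diagonal pixel count)."""
    return diagonal_in / math.hypot(width_px, height_px)

def resolvable_distance_ft(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Distance beyond which one pixel subtends less than the assumed acuity angle."""
    pitch = pixel_pitch_in(width_px, height_px, diagonal_in)
    angle_rad = math.radians(ACUITY_ARCMIN / 60.0)
    return pitch / math.tan(angle_rad) / 12.0  # inches -> feet

for size in (32, 55):
    print(f'1080p at {size}": pixels blend together beyond roughly '
          f'{resolvable_distance_ft(1920, 1080, size):.1f} ft')
```

Under that acuity assumption, the 32" 1080p set's pixels stop being distinguishable at around 4 ft, while the 55" set's pixels remain resolvable out to roughly 7 ft, which is the general shape these viewing-distance charts follow.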
Does that, perhaps, clear up our disagreement? The chart uses resolution and TV size instead of PPI because those qualifiers are more meaningful to us as consumers. Calculating PPI isn't something everyone knows how to do, and PPI isn't listed on TVs being sold, as far as I've seen. So resolution and screen size are good enough even though what we're secretly discussing is PPI.