[citation][nom]randomoneh[/nom]Both of you have very bad understanding of how this works.

Your example - 40 degrees horizontally. When this TV is occupying 40 degrees of your field of view horizontally, you are standing 7.48 ft away from it. At that point, it has angular resolution (what really matters) of 46 pixels per degree.

At usual distance of 9 ft, it has angular resolution of 55.34 pixels per degree. Which isn't bad, but could be better. For example, when angular resolution (number of pixels per degree of person's field of view) is ~35 or higher, viewer usually can't see pixel grid / pixelation on images. On the other side, angular resolution of ~200 pixels per degree is limit for most of healthy individuals (NHK study), meaning image quality is matching the limit of viewer's eye and is as good as it can be concerning resolution and viewing distance.

Benefit is, of course, non-linear. You'll benefit most with angular resolution up to ~100 pixels per degree. And then up to 200 ppd, quality will rise slowly.

Anything more clear to you two?[/citation]
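For what it's worth, the quoted figures can be reproduced with a small calculation. The numbers only line up if the set is roughly a 75-inch 16:9 1080p panel (that size is my inference, not something stated in the quote); under that assumption, the central angular resolution in pixels per degree comes out as:

```python
import math

def ppd_central(h_pixels, screen_width_in, distance_in):
    """Central angular resolution in pixels per degree:
    pixels per inch times the arc length one degree subtends at the eye."""
    ppi = h_pixels / screen_width_in
    return ppi * distance_in * math.pi / 180.0

# Assumed example: 75-inch 16:9 1080p panel (width from the diagonal)
width_in = 75 * 16 / math.hypot(16, 9)
print(round(width_in, 1))                                # 65.4 inches wide
print(round(ppd_central(1920, width_in, 7.48 * 12), 1))  # 46.0 ppd at 7.48 ft
print(round(ppd_central(1920, width_in, 9 * 12), 1))     # 55.4 ppd at 9 ft
```

Those match the 46 and ~55.34 ppd figures in the quote, which suggests this is the geometry being described.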
Actually, you're going over my head a bit with that.

Here's what I understand:
The human eye can only resolve about 300 dpi of detail. Yes, I know that's a general statement and it depends on viewing distance and such, but it's a good starting point.
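The "about 300 dpi" figure is really a statement about a reading distance of roughly 12 inches. A common rule of thumb (an assumption here, not something established above) is that a 20/20 eye resolves detail down to about 1 arcminute, which ties the resolvable dpi directly to distance:

```python
import math

def resolvable_dpi(distance_in, acuity_arcmin=1.0):
    """Finest pixel pitch a viewer can resolve, as dots per inch,
    assuming the classic ~1 arcminute limit for 20/20 vision."""
    pitch_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pitch_in

print(round(resolvable_dpi(12)))   # ~286 dpi at 12 in (the "300 dpi" rule)
print(round(resolvable_dpi(108)))  # ~32 dpi at 9 ft
```

So at 9 feet the same eye that "sees 300 dpi" up close only needs a few dozen dpi on the screen, which is why distance matters so much.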
At a reasonable viewing distance, say a living room, I'm guessing the TV is against one wall and where you sit is against another, or close to it. I got out a measuring tape to check these distances, based on my bedroom rather than a living room. My head is a good 33 inches from the screen on average when I'm sitting at the computer; when I'm watching something, I'm 7 to 9 feet away, depending on how I'm lying on my bed. And that's my fairly small bedroom. In most people's living rooms you're a good 10 feet away minimum; in ours it's anywhere from 10 to 20 feet.
From what I know (and this is from me looking this stuff up a long, long time ago), on our 48-inch TV at almost any comfortable viewing distance you would be hard pressed to see the difference between 1080p and 720p. Now, if I want to sit 5 feet away from it, then yeah, I can see the difference fairly easily. But that's the problem again: look how close you have to sit to a TV even that size to see anything above 1080p.
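The 48-inch claim can be sanity-checked with the same 1-arcminute rule of thumb (again an assumption about the eye, not a measured fact from this thread): past the distance where one pixel subtends less than an arcminute, the grid blends together.

```python
import math

def blend_distance_ft(h_pixels, diagonal_in, aspect=(16, 9), acuity_arcmin=1.0):
    """Viewing distance (feet) beyond which individual pixels are
    unresolvable, assuming a ~1 arcminute (20/20) acuity limit."""
    width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
    pitch_in = width_in / h_pixels
    return pitch_in / math.tan(math.radians(acuity_arcmin / 60.0)) / 12.0

print(round(blend_distance_ft(1920, 48), 1))  # ~6.2 ft for 1080p on a 48" set
print(round(blend_distance_ft(1280, 48), 1))  # ~9.4 ft for 720p
```

By this estimate, past roughly 9 feet even 720p is at the acuity limit on a 48-inch screen, which is consistent with the bet below.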
I'm almost willing to bet money that if you had two identical-looking TVs, one set to 720p and one set to 1080p, and you sat people down at a reasonable in-home distance and asked them which was which, they would be hard pressed to say.
The point I'm making is that TV quality has hit an area where you do not need better for most circumstances. Sure, a projector, because you can get that thing to display a massive image; sure, a ridiculously large TV, because then you can see the advantage of more pixels; or a computer screen, because we sit less than 3 feet away from it when seated comfortably.
What you wrote went over my head partly because it was all text. Do you know a site I can go to where it explains what you're talking about better? I would really love to learn more about this.