Hi! I've run into a somewhat peculiar issue with my TV that requires some explanation, so please bear with me! I've hit a brick wall here and need some insight.
I bought this Thomson 55FU4243 HDTV for cheap some time ago - and while it's mostly been a nicely performing 1080p TV, I've always thought that it had some serious trouble correctly displaying warm colors in the red/yellow spectrum (faces look like flat, red blobs. Yuck).
Recently, however, I discovered by accident that connecting it to my PC at a resolution of 1366x768 somehow bypassed the TV's image processing entirely. I could no longer adjust sharpness or saturation, but the picture was drastically improved. Finally, normally colored faces! So what to do? The scientific approach!
I started diving into custom resolutions to see what triggers the image processing. It turns out that anything even a single pixel above 1366x768 activates the processing and ruins the picture. So I opened the advanced settings in my NVIDIA drivers and started toying with different timings, and it turns out that running 1920x1080 at 60 Hz triggers image processing on every timing preset except CVT (not reduced blanking).
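For reference, here's what the CVT timing for 1920x1080 at 60 Hz actually looks like - this is a sketch generated with the cvt utility on Linux, and I'm assuming the NVIDIA control panel's CVT preset produces essentially the same numbers. Note how different it is from the standard CEA-861 TV timing (148.5 MHz pixel clock, 2200x1125 total), which is presumably how the TV tells the two apart:

    $ cvt 1920 1080 60
    # 1920x1080 59.96 Hz (CVT 2.07M9) hsync: 67.16 kHz; pclk: 173.00 MHz
    Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync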
So in short: 1920x1080 at 60 Hz with CVT timing gives me an excellent picture that far surpasses the TV's default mode, simply because it somehow makes the TV bypass its internal image processing. Great news for my PC content - but not so much for everything else I use the TV for, like all of my gaming consoles.
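In case anyone wants to reproduce the PC side of this without the NVIDIA control panel, here's a rough sketch for Linux/X11 using xrandr - I'm assuming the TV hangs off an output named HDMI-1, so check yours with plain xrandr first:

    $ # define a new mode using the CVT timing from above
    $ xrandr --newmode "1920x1080_CVT" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
    $ # attach it to the TV's output and switch to it
    $ xrandr --addmode HDMI-1 1920x1080_CVT
    $ xrandr --output HDMI-1 --mode 1920x1080_CVT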
So my question is as follows: any ideas on how I can convert a normal 1080p HDMI signal from any source into a 1920x1080 60 Hz signal specifically with CVT timing? I'm willing to try almost anything that doesn't require buying an entirely new TV.
EDIT: Adding some pictures to show you the difference I'm talking about.
This is a regular 1080p signal over HDMI from my computer:
And this is over HDMI from my computer, but with forced CVT timing:
I'd say that's a pretty drastic improvement - I just need help getting that quality across all my HDMI devices, not solely from my computer!