Can 10 bit / 12 bit HDR be applied to 1080p resolution?


old_rager

Honorable
Jan 8, 2018
My son and I have been having a bit of a discussion re: HDR (primarily 10-bit/12-bit) and whether it can be applied at 1080p resolution. He tends to believe that on gaming consoles (Xbox One X and PS4 Pro) you can get a game to run in 1080p with 10-bit / 12-bit HDR being effective (i.e., a noticeable difference compared to standard 1080p).

Thoughts anyone?
 
Solution


There are a few misconceptions that I have seen presented in this thread. First, dynamic range and bit depth are not related. For example, theoretically, you could have a 2-bit image (four tone values) that was "HDR" if the correct gamma curve was applied to it. You would have only four shades to work with, but the image would technically be "high dynamic range". The...
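To make that 2-bit example concrete, here is a small sketch with made-up luminance values (mine, not from the quoted answer): dynamic range is set by how far apart the darkest and brightest mapped levels are, while bit depth only controls how many steps sit in between.

```python
# Made-up luminance mappings, purely illustrative (not from the quoted answer).
# A 2-bit image has only four code values, but if they are mapped across a very
# wide luminance span it is still "high dynamic range"; an 8-bit image mapped
# across a narrow span is not, despite having 256 steps.
two_bit_hdr = {0: 0.005, 1: 1.0, 2: 100.0, 3: 1000.0}   # nits per code value
sdr_black, sdr_peak = 0.1, 100.0                        # typical-ish SDR span, in nits

print("2-bit 'HDR' contrast:", two_bit_hdr[3] / two_bit_hdr[0])   # 200000:1 from 4 levels
print("8-bit SDR contrast:  ", sdr_peak / sdr_black)              # 1000:1 from 256 levels
```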
Sep 28, 2018


Hi Robert, HDR = High Dynamic Range (a bad name) means that each pixel can be displayed over a wider range of luminosity (brightness in layman's terms, though not the same as the "brightness" control used in the display industry). So HDR can produce deeper blacks and much brighter whites and colours (much higher nits), but also many more levels in between than the previous SDR range. So while SDR could, say, produce blue (or any colour within the display's gamut) in increments of 20 nits (50, 70, 90, 110 nits and so on), HDR can produce the same colours in increments of 5 nits. (I'm making these numbers up; I have no idea what is specified within each standard. They're just meant as an example to help understand the difference between SDR and HDR.)
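To put some less made-up numbers on that idea, here is a small Python sketch using the PQ transfer curve from SMPTE ST 2084 (the curve HDR10 uses); the constants come from that standard, but the mid-scale comparison is just my own illustration:

```python
# Rough sketch of why bit depth matters inside an HDR signal: the PQ curve from
# SMPTE ST 2084 maps a normalized code value to absolute luminance in nits, and
# more bits per channel mean finer luminance steps between adjacent code values.

def pq_eotf(v, peak=10000.0):
    """SMPTE ST 2084 EOTF: normalized code value v (0..1) -> luminance in nits."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    p = v ** (1.0 / m2)
    return peak * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)

for bits in (8, 10, 12):
    levels = 2 ** bits                      # code values per channel
    mid = levels // 2                       # a code value around mid-scale
    step = pq_eotf((mid + 1) / (levels - 1)) - pq_eotf(mid / (levels - 1))
    print(f"{bits}-bit: {levels} levels, step near mid-scale is about {step:.2f} nits")
```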

I encourage anyone who knows more than I do to correct or improve on my above understanding,

Matt

 
Nov 30, 2018
https://www.sony.co.uk/electronics/televisions/we750-we752-we753-we754-we755-series

Here is my TV, which supports HDR and is 1080p. I chose a 1080p TV over 4K. Most content is still 1080p, so I get a perfect 1:1 pixel picture with no downscaling or upscaling. Especially for games, video cards are not ready to deliver a 4K experience, so 1080p looks sharp.

There is a strange thing about HDR on this TV that I recently discovered. I always played HDR games in full RGB at 8-bit or 12-bit because this TV supports full RGB; I could see how the colour differed from SDR and the image was brighter. But a few days ago I did a test with different colour formats, and it seems the whole time I was not seeing the image the way it is supposed to look. Choosing YCbCr 4:2:2 and 10-bit made a huge difference. Now I see colours in the sky and everywhere that simply were not shown in any other colour format. All this time I thought it made no difference which colour format I chose, because it is an 8-bit TV and choosing 10-bit wouldn't matter. But it does, and it's huge. For some reason I can only choose 10-bit with YCbCr 4:2:2, but with HDR content it looks ten times better than full RGB.

Update: in the specs I see "HEVC support: 10bit", so I guess this HDTV is 10-bit. There was no such info when I bought it, so I had assumed it was an 8-bit panel.
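For what it's worth, a rough way to see why 10-bit sometimes only shows up together with chroma subsampling is to count raw sample bits per pixel. This back-of-the-envelope sketch is mine (not from Sony's specs) and ignores HDMI's actual container formats and blanking overhead:

```python
# Raw sample bits per pixel for a few pixel formats, ignoring link-level details.
# 10-bit 4:2:2 carries fewer raw bits per pixel than even 8-bit RGB, which is
# one reason a source/TV combination may offer 10-bit only with 4:2:2.

def bits_per_pixel(bit_depth, chroma):
    # Samples per pixel: RGB and 4:4:4 = 3, 4:2:2 = 2, 4:2:0 = 1.5
    samples = {"RGB": 3.0, "4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return bit_depth * samples

for depth, chroma in [(8, "RGB"), (10, "RGB"), (12, "RGB"), (10, "4:2:2"), (10, "4:2:0")]:
    print(f"{depth}-bit {chroma}: {bits_per_pixel(depth, chroma):.1f} bits per pixel")
```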
 

InvalidError

Distinguished
Moderator

HEVC is H.265, and 10-bit is an extension/option in that standard. Having firmware/hardware that supports 10-bit H.265 decoding doesn't mean anything as far as the panel is concerned; it can still be 8-bit + FRC or whatever else.
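As a toy illustration of the 8-bit + FRC point (my own sketch of temporal dithering in general, not of how any particular panel implements it):

```python
# An 8-bit panel with FRC approximates a 10-bit level by alternating between
# two adjacent 8-bit levels, so the average over a few frames lands near the
# finer 10-bit target.

def frc_frames(code_10bit, cycle=4):
    """Return a short sequence of 8-bit levels whose average ~= code_10bit / 4."""
    base, remainder = divmod(code_10bit, 4)   # 4 ten-bit steps per 8-bit step
    # Show the higher level on `remainder` out of every `cycle` frames.
    return [base + (1 if frame < remainder else 0) for frame in range(cycle)]

frames = frc_frames(514)                  # 10-bit level 514 sits between 8-bit 128 and 129
print(frames)                             # [129, 129, 128, 128]
print(sum(frames) / len(frames) * 4)      # time-averaged value on the 10-bit scale: 514.0
```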
 
Status
Not open for further replies.