Can 10 bit / 12 bit HDR be applied to 1080p resolution?

Status
Not open for further replies.

old_rager

Honorable
Jan 8, 2018
15
0
10,560
My son and I have been having a bit of a discussion about HDR (primarily 10-bit/12-bit) and whether it can be applied at 1080p resolution. He tends to believe that on the gaming consoles (Xbox One X and PS4 Pro) you can get a game to run at 1080p with 10-bit/12-bit HDR being effective (i.e., a noticeable difference compared to standard 1080p).

Thoughts anyone?
 

InvalidError

Distinguished
Moderator
Simple: HDR came about at roughly the same time as 4k. Since you need new equipment for 4k and new equipment for HDR, most companies are focusing on 4k HDR instead of promoting multiple resolutions and fragmenting the market.

At the rate 4k HDR TV prices are plummeting, even 1080p soon won't make any sense.
 

FranticPonE

Commendable
Jan 11, 2017
1
0
1,510
The lowest-resolution HDR screens currently available are 1440p monitors. Well, one monitor: the Samsung CHG70, and it's not the greatest. More should be coming at the beginning of this year, but there's no word on exactly when. Regardless, the PS4 and Xbox One S also support HDR output, for what it's worth, though oddly HDR game support can be more spotty on them despite the actual power required not being much more than standard output. Just be careful when buying any HDR display today: they have a large range in quality, and the only "future proof" ones, or even close, are multi-thousand-dollar laser projectors.
 

old_rager

Honorable
Jan 8, 2018
15
0
10,560
Not sure what point FranticPonE was trying to make (it's one thing for the XB1/PS4 Pro to be HDR compliant, and another to actually see it working on a display where you can turn it on and off), but with respect to this post's original intent:

1. I have never seen a 1080p video file that is actually HDR compliant (i.e., where I would see a notable difference in the colour depth of a 1080p movie played on an HDR-compliant display as opposed to a non-HDR-compliant display), and
2. I have never seen a 1080p display advertised anywhere that says it is HDR 10/12 compliant.

Care to send me a link to either a 1080p HDR file or a 1080p display that is HDR compliant?!? ;-)

 

robert600

Distinguished
I have yet to figure out what HDR exactly is.

"1. I have never seen a 1080p video file that is actually HDR compliant (I.E., I would see a notable difference in the colour depth of a 1080p movie being played on a HDR compliant display as opposed to it being played on a non-HDR compliant display"


The HEVC codec allows for 10-bit color depth encodes at 1080p (I'm not sure that this is HDR though). In any event, I am quite sure you would notice a difference playing these files through a device that supports 10-bit colour depth (is this HDR?) as opposed to one that only supports 8-bit.

"2. I have never seen a 1080p display advertised anywhere that says it is HDR 10/12 compliant.

Care to send me a link to either a 1080p HDR file or 1080p display that is HDR compliant?!? ;-)"

My 1080p projector makes no claim to be HDR compliant. But what it does claim is 1080p with 1.07 billion colors. Doing the math: 2 to the 10th power is 1024 shades per channel, and 1024 to the 3rd power (three color channels) = 1,073,741,824. This implies it is fully processing 10-bit color depth. Does this mean it is 'HDR compliant'? ... I have no idea.
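If anyone wants to double-check that arithmetic, here's a quick sketch (plain Python, nothing projector-specific assumed):

# Sanity check of the "1.07 billion colors" figure: 10 bits per channel,
# three channels (R, G, B).
shades_per_channel = 2 ** 10           # 1024
total_colors = shades_per_channel ** 3
print(f"{total_colors:,}")             # 1,073,741,824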
 

InvalidError

Distinguished
Moderator

Go download some anime episodes from the same source and resolution in 10-bit h264 and compare them to the same episodes in 8-bit h264. The 10-bit version will usually be smaller and have much less visible banding and macro-blocking, even when played on an 8-bit monitor, simply because letting the encoder work with 10 bits of precision reduces cumulative error between key frames, so the encoder does not need to waste as much bandwidth trying to correct encoding errors/noise.
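If you want to see the banding part of that in isolation, here's a toy illustration (just NumPy quantisation of a dark gradient; it is not how the encoder itself works, but it shows how quickly 8 bits runs out of distinct steps):

import numpy as np

# A smooth ramp covering only the darkest 5% of the brightness range,
# like a dim scene or a sky gradient.
ramp = np.linspace(0.0, 0.05, 1920)

# Count how many distinct code values the ramp collapses into at each bit depth.
levels_8bit = np.unique(np.round(ramp * 255)).size
levels_10bit = np.unique(np.round(ramp * 1023)).size

print("8-bit: ", levels_8bit, "distinct steps")   # ~14 -> visible banding
print("10-bit:", levels_10bit, "distinct steps")  # ~52 -> much smoother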
 

old_rager

Honorable
Jan 8, 2018
15
0
10,560
Hey Robert600, did a bit more research and can confirm that I've been able to get Windows 10 to transmit HDR at 1080p, which I guess is something - but it is not 1080p HDR content. I thought I had found a monitor that does 1080p as well as HDR (as that is what it claims - https://www.benq.com/en/monitor/video-enjoyment/ew277hdr/specifications.html), but when looking closely at the specs, it only supports 8-bit and has a brightness rating of 300 to 400 nits. From another article I read from Nov 2017 (https://dgit.com/4k-hdr-guide-45905/), it states that 8-bit can still be HDR - but I guess for me, when viewing 8-bit, it really does look bland compared to 10-bit HDR.

IMO, HDR is just simply amazing! I recently purchased a Samsung 65" Q7F QLED TV that runs at 1500 nits and displays 100% of the colour gamut (the above BenQ monitor only displays 93% of the colour gamut). The difference for me, in layman's terms, is the richness of the colour - especially on the Xbox One X. The colours we see now that weren't there before when displaying the same content on a non-HDR TV are amazing - a massively different playing experience altogether. It's the same when watching movies. When it comes to downloading 4K movies, 8-bit is always used in reference to SDR (Standard Dynamic Range) and 10/12-bit is always referred to as HDR.

I guess that one of the things possibly clouding my judgment is that our new TV is 1500 nits (there are not many TVs out there that do 1500 nits), it also displays 100% of the DCI-P3 colour gamut (again, very few TVs display this), and it supports an elevated HDR10 standard which Samsung refers to as HDR+ or HDR10+ (which has something to do with glare reduction - thereby ensuring that what you see reaches your eyes without any additional environmental image degradation). This might sound like a sales pitch for Samsung, but it's not - there are some issues with our TV that are handled better by, for instance, Sony's XE90 at a similar price.

I guess the reason why I am so intent on finding 10-bit HDR 1080p movie content is that there is still a very limited amount of 4K content. So if I've got to get stuff in 1080p, I'd love for it to be HDR10 compliant - but as yet, I have found no 10-bit HDR 1080p content. Hope this makes some sense...
 

robert600

Distinguished
"So if I've got to get stuff in 1080p, I'd love for it to be HDR10 compliant - but as yet, I have found no 10bit HDR 1080p content. Hope this makes some sense... "

Well ... I really do not know what HDR10 compliant truly means. As I mentioned, the HEVC codec (h265) allows for 10-bit color depth encoding - apparently, according to InvalidError, so does the h264 codec, but I haven't used h264 in years so I don't have any personal experience with it. What I can't seem to work out is whether or not the 10-bit color depth video files generated by these codecs are HDR files. I guess what I'm wondering is ... if a video file has 10-bit color depth, does that mean it is an HDR file? Or is HDR something more than 10-bit color depth?

If you're interested here's the video portion description (from mediainfo) of one of my HEVC files encoded at 10 bit (you can readily see that it's 1080p with a 10 bit color depth):

Video
ID : 1
Format : HEVC
Format/Info : High Efficiency Video Coding
Format profile : Main 10@L4@Main
Codec ID : V_MPEGH/ISO/HEVC
Duration : 1 h 55 min
Bit rate : 4 001 kb/s
Width : 1 920 pixels
Height : 804 pixels
Display aspect ratio : 2.40:1
Frame rate mode : Constant
Frame rate : 23.976 (24000/1001) FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 10 bits
Bits/(Pixel*Frame) : 0.108
Stream size : 3.22 GiB (89%)
Writing library : x265 2.5+66-dae558b40d99:[Windows][GCC 7.2.0][64 bit] 10bit
Encoding settings : cpuid=1173503 / frame-threads=3 / numa-pools=8 / wpp / no-pmode / no-pme / no-psnr / no-ssim / log-level=2 / input-csp=1 / input-res=1920x804 / interlace=0 / total-frames=165859 / level-idc=0 / high-tier=1 / uhd-bd=0 / ref=4 / no-allow-non-conformance / no-repeat-headers / annexb / no-aud / no-hrd / info / hash=0 / no-temporal-layers / open-gop / min-keyint=23 / keyint=250 / bframes=4 / b-adapt=2 / b-pyramid / bframe-bias=0 / rc-lookahead=25 / lookahead-slices=4 / scenecut=40 / no-intra-refresh / ctu=64 / min-cu-size=8 / rect / no-amp / max-tu-size=32 / tu-inter-depth=1 / tu-intra-depth=1 / limit-tu=0 / rdoq-level=2 / dynamic-rd=0.00 / no-ssim-rd / signhide / no-tskip / nr-intra=0 / nr-inter=0 / no-constrained-intra / strong-intra-smoothing / max-merge=3 / limit-refs=3 / limit-modes / me=3 / subme=3 / merange=57 / temporal-mvp / weightp / no-weightb / no-analyze-src-pics / deblock=0:0 / sao / no-sao-non-deblock / rd=4 / no-early-skip / rskip / no-fast-intra / no-tskip-fast / no-cu-lossless / no-b-intra / no-splitrd-skip / rdpenalty=0 / psy-rd=2.00 / psy-rdoq=1.00 / no-rd-refine / analysis-reuse-mode=0 / no-lossless / cbqpoffs=0 / crqpoffs=0 / rc=abr / bitrate=4000 / qcomp=0.60 / qpstep=4 / stats-write=0 / stats-read=2 / cplxblur=20.0 / qblur=0.5 / ipratio=1.40 / pbratio=1.30 / aq-mode=3 / aq-strength=1.00 / cutree / zone-count=0 / no-strict-cbr / qg-size=32 / no-rc-grain / qpmax=69 / qpmin=0 / no-const-vbv / sar=0 / overscan=0 / videoformat=5 / range=0 / colorprim=2 / transfer=2 / colormatrix=2 / chromaloc=0 / display-window=0 / max-cll=0,0 / min-luma=0 / max-luma=1023 / log2-max-poc-lsb=8 / vui-timing-info / vui-hrd-info / slices=1 / no-opt-qp-pps / no-opt-ref-list-length-pps / no-multi-pass-opt-rps / scenecut-bias=0.05 / no-opt-cu-delta-qp / no-aq-motion / no-hdr / no-hdr-opt / no-dhdr10-opt / analysis-reuse-level=5 / scale-factor=0 / refine-intra=0 / refine-inter=0 / refine-mv=0 / no-limit-sao / ctu-info=0 / no-lowpass-dct / refine-mv-type=0 / copy-pic=1
Default : Yes
Forced : No
 

robert600

Distinguished
Actually that one has a display aspect ratio of 2.40:1 so the res is 1920 x 804 (technically not 1080p I guess). Here's a better example with a 16:9 aspect ratio and thus fully 1080p:

Video
ID : 1
Format : HEVC
Format/Info : High Efficiency Video Coding
Format profile : Main 10@L4@Main
Codec ID : V_MPEGH/ISO/HEVC
Duration : 1 h 46 min
Bit rate : 4 749 kb/s
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 23.976 (24000/1001) FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 10 bits
Bits/(Pixel*Frame) : 0.096
Stream size : 3.54 GiB (90%)
Writing library : x265 2.5+66-dae558b40d99:[Windows][GCC 7.2.0][64 bit] 10bit
Encoding settings : cpuid=1173503 / frame-threads=3 / numa-pools=8 / wpp / no-pmode / no-pme / no-psnr / no-ssim / log-level=2 / input-csp=1 / input-res=1920x1080 / interlace=0 / total-frames=153405 / level-idc=0 / high-tier=1 / uhd-bd=0 / ref=4 / no-allow-non-conformance / no-repeat-headers / annexb / no-aud / no-hrd / info / hash=0 / no-temporal-layers / open-gop / min-keyint=23 / keyint=250 / bframes=4 / b-adapt=2 / b-pyramid / bframe-bias=0 / rc-lookahead=25 / lookahead-slices=4 / scenecut=40 / no-intra-refresh / ctu=64 / min-cu-size=8 / rect / no-amp / max-tu-size=32 / tu-inter-depth=1 / tu-intra-depth=1 / limit-tu=0 / rdoq-level=2 / dynamic-rd=0.00 / no-ssim-rd / signhide / no-tskip / nr-intra=0 / nr-inter=0 / no-constrained-intra / strong-intra-smoothing / max-merge=3 / limit-refs=3 / limit-modes / me=3 / subme=3 / merange=57 / temporal-mvp / weightp / no-weightb / no-analyze-src-pics / deblock=0:0 / sao / no-sao-non-deblock / rd=4 / no-early-skip / rskip / no-fast-intra / no-tskip-fast / no-cu-lossless / no-b-intra / no-splitrd-skip / rdpenalty=0 / psy-rd=2.00 / psy-rdoq=1.00 / no-rd-refine / analysis-reuse-mode=0 / no-lossless / cbqpoffs=0 / crqpoffs=0 / rc=abr / bitrate=4750 / qcomp=0.60 / qpstep=4 / stats-write=0 / stats-read=2 / cplxblur=20.0 / qblur=0.5 / ipratio=1.40 / pbratio=1.30 / aq-mode=3 / aq-strength=1.00 / cutree / zone-count=0 / no-strict-cbr / qg-size=32 / no-rc-grain / qpmax=69 / qpmin=0 / no-const-vbv / sar=0 / overscan=0 / videoformat=5 / range=0 / colorprim=2 / transfer=2 / colormatrix=2 / chromaloc=0 / display-window=0 / max-cll=0,0 / min-luma=0 / max-luma=1023 / log2-max-poc-lsb=8 / vui-timing-info / vui-hrd-info / slices=1 / no-opt-qp-pps / no-opt-ref-list-length-pps / no-multi-pass-opt-rps / scenecut-bias=0.05 / no-opt-cu-delta-qp / no-aq-motion / no-hdr / no-hdr-opt / no-dhdr10-opt / analysis-reuse-level=5 / scale-factor=0 / refine-intra=0 / refine-inter=0 / refine-mv=0 / no-limit-sao / ctu-info=0 / no-lowpass-dct / refine-mv-type=0 / copy-pic=1
Default : Yes
Forced : No
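For what it's worth, the thing that would distinguish a genuine HDR encode from a plain 10-bit SDR one in reports like the two above is not the "Bit depth" line but the transfer characteristics and mastering metadata (the x265 settings above even say no-hdr and transfer=2, i.e. unspecified). Here's a rough sketch that pulls just those fields out of a saved MediaInfo text report; field names are assumed to match MediaInfo's usual text output:

# Rough sketch: scan a saved MediaInfo text report (like the two pasted above)
# for the fields that separate an HDR10 encode from a plain 10-bit SDR one.
fields = ("Bit depth", "Transfer characteristics",
          "Color primaries", "Mastering display", "Maximum Content Light Level")

with open("report.txt", encoding="utf-8") as f:   # hypothetical saved report
    for line in f:
        key = line.split(":", 1)[0].strip()
        if key.startswith(fields):
            print(line.rstrip())

# A 10-bit SDR file typically shows only "Bit depth : 10 bits".
# An HDR10 file would also report PQ transfer characteristics, BT.2020
# primaries and mastering-display / content light level metadata.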


 

old_rager

Honorable
Jan 8, 2018
15
0
10,560


You are correct - but I'm also with you in wondering where, in the source file (i.e., a Blu-ray rip where the files are in m2ts format), it actually says whether the m2ts file is 8-bit, 10-bit or even 12-bit for that matter - I'm not sure. I am still trying to get my head around all of this, but I can definitely say that there is a big difference between an 8-bit and a 10-bit colour depth display - and the cost of upgrading to a new display has been worth every cent (or many dollars in my case... ;)
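On the "where does it say the m2ts is 8-bit or 10-bit" question: MediaInfo reads .m2ts files too, and if you'd rather not eyeball the report you can pull the relevant fields programmatically. A minimal sketch, assuming the pymediainfo package is installed and exposes the usual MediaInfo fields (the file path is hypothetical):

# Minimal sketch using pymediainfo (assumed installed) to read bit depth and
# transfer characteristics straight from a Blu-ray rip.
from pymediainfo import MediaInfo

info = MediaInfo.parse("00001.m2ts")   # hypothetical path
for track in info.tracks:
    if track.track_type == "Video":
        print("Bit depth:               ", track.bit_depth)
        print("Transfer characteristics:", track.transfer_characteristics)
        # 8 or 10 answers the bit-depth question; "PQ" or "HLG" in the
        # transfer characteristics is what would make it an HDR file.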
 
Jul 21, 2018
1
0
20


There are a few misconceptions that I have seen presented in this thread. First, dynamic range and bit depth are not related. For example, theoretically, you could have a 2-bit image (four tone values) that was "HDR" if the correct gamma curve was applied to it. You would have only four shades to work with, but the image would technically be "high dynamic range". The reason HDR standards are 10-bit or higher is because the additional values are needed to prevent the banding which would occur if 8-bit depth was used with an HDR gamma curve. Currently, all HDR content is 10-bit, with black mapped at the 64 value, middle gray at 512 and peak luminance at 940/960.
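The 64 and 940/960 code values quoted there line up with standard "limited/legal range" video levels carried in 10 bits (the familiar 8-bit 16-235 range multiplied by 4); a quick worked check:

# Quick check of the code values mentioned above, assuming standard
# limited/legal-range video levels scaled from 8 bits to 10 bits.
black_8bit, peak_8bit = 16, 235

black_10bit = black_8bit * 4    # 64  -> black level
peak_10bit = peak_8bit * 4      # 940 -> nominal peak (chroma tops out at 960)
print(black_10bit, peak_10bit)

# 512 sits near the midpoint of the full 10-bit code range (0-1023), which is
# where the mapping described above places middle gray on the HDR curve.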


 
Solution

bdg2

Honorable
Jun 27, 2014
1
0
10,510
There seems to be at least one HDR TV with less than 1080p resolution but none with 1080p which is what I want.
So I'll be forced to pay more for 4K or forget HDR.
Damn.
 

InvalidError

Distinguished
Moderator

I don't see the problem here, considering how quickly 4k HDR TVs are coming down in price and how they're about to reach price parity with plain 1080p.

Also, due to 4k being the prevalent resolution for HDR-capable TVs, the vast majority of HDR content will be produced in 4k. My next "PC monitor" is probably going to be a 43-50" 4k HDR TV to replace my triple monitor setup.
 
Sep 8, 2018
1
0
10
My TV (Sony Bravia 52z5100) accepts 0-255 and is a 10-bit panel from 2009, and it's a 240 Hz 1080p set. It was over $3k new at the time, but it accepts the 10-bit feed with full-range color from my Xbox One S without issues.
 
Sep 18, 2018
1
0
10
Just stumbled across this thread and want to chime in that my ~6-year-old Sony KDL-47W705 reports "1920x1080 12bit" when it receives a signal from a 4k HDR-capable Android TV box (any S912 device, for those who want to know).

That left me puzzling over the question at hand: whether there is HDR at Full HD, and whether this old (still really good) W7 can display HDR content.

E.g. when I play a 4k HDR stream with this HDR-capable TV box, or when I render a 4k HDR source down to 12-bit Full HD (with DVD-Fab or something similar).
 