This is the sort of question that depends heavily on the display, the movie or TV show, and the actual master. Some of the Dolby Vision specs are just stupid, like supporting up to 10,000 nits. What display can get that bright in even a small window?
Most movies are mastered at 1,000 nits, so even if your TV gets brighter (in a smaller window), you probably won't gain much. Nobody is changing values on a frame-by-frame basis; scene-by-scene at most. Until recently the cheapest 4,000-nit mastering monitor cost about $100K. Sony recently released one for around $25K, which is probably why their flagship this year is mini-LED with new backlight controllers the size of a sesame seed. I wouldn't be surprised if you start seeing movies labeled "mastered at 4,000 nits" in the near future.
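To put rough numbers on why the extra headroom doesn't buy much, here's a toy Python sketch (a simple knee/rolloff, not any real TV's tone-mapping math): if the display's peak is at or above the content's mastered peak, there's nothing to compress, so the extra brightness just goes unused.

```python
# Toy illustration (not any TV's actual algorithm): a simple knee/rolloff tone map.
# When the display's peak is at or above the content's mastered peak,
# nothing needs to be compressed, so extra headroom goes unused.

def tone_map(nits: float, content_peak: float, display_peak: float) -> float:
    """Map a pixel's luminance (in nits) onto the display's range."""
    if display_peak >= content_peak:
        return nits                      # display can show the whole master as-is
    knee = 0.75 * display_peak           # start compressing at 75% of display peak (arbitrary choice)
    if nits <= knee:
        return nits                      # shadows and midtones pass through untouched
    # Roll off the highlights from the knee up to the display's peak.
    span_in = content_peak - knee
    span_out = display_peak - knee
    return knee + (nits - knee) / span_in * span_out

# 1,000-nit master on a 1,300-nit TV: nothing to do, identical output.
print(tone_map(1000, content_peak=1000, display_peak=1300))   # 1000.0
# Same master on a 700-nit TV: highlights get compressed.
print(tone_map(1000, content_peak=1000, display_peak=700))    # 700.0
```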
The second factor is the source. To my eyes, LLDV (player-led) Dolby Vision, which streaming services use, is pretty much indistinguishable from HDR10. A UHD disc using Dolby Vision profile 7 FEL (Full Enhancement Layer) with display-led processing, on the other hand, is easily noticeable compared to HDR10 on the same disc. Unfortunately UHD is slowly dying, but a FEL disc has two layers: the main movie in HDR10 and a second 1080p layer carrying the Dolby Vision enhancement information, so it can always fall back to HDR10. When the display does the dynamic tone mapping on a good OLED, the difference is easy to spot.
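If it helps, here's a loose Python sketch of the FEL idea (this is not the real Dolby composer math, which applies a mapping curve plus a residual; it just shows how a 10-bit base layer plus an enhancement residual can carry extra precision while the base layer still plays on its own as HDR10):

```python
import numpy as np

# Loose sketch of the FEL concept (not the real Dolby composer math): a 10-bit
# HDR10 base layer plus an enhancement layer carrying a residual, combined to
# recover a higher-precision 12-bit signal. Ignore the enhancement layer and
# the base layer still plays as ordinary HDR10.

original_12bit = np.array([100, 1523, 2047, 4095], dtype=np.int32)  # "true" 12-bit code values

base_10bit = original_12bit >> 2                  # what the HDR10-compatible base layer stores
residual   = original_12bit - (base_10bit << 2)   # what the enhancement layer carries

hdr10_fallback   = base_10bit << 2                # HDR10-only playback: precision is lost
dv_reconstructed = (base_10bit << 2) + residual   # DV playback: full 12-bit values restored

print(hdr10_fallback)     # [ 100 1520 2044 4092]
print(dv_reconstructed)   # [ 100 1523 2047 4095]
```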
With LLDV the player does the dynamic tone mapping; with display-led Dolby Vision the display does it. Which do you think will handle it better: a $200 streamer or a $2K Sony/LG OLED? I'm pretty sure the Sony or LG will win in every scenario, because the display knows exactly what it's capable of, not to mention Sony's excellent video processing.
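Here's what "dynamic" tone mapping means in a toy Python sketch (made-up numbers and a simple linear squeeze, nothing like the actual algorithms): with per-scene metadata, a dark scene isn't dimmed just because the movie's overall peak is 1,000 nits. The same math runs whether the player or the display does the work; the difference is how well whoever runs it knows the panel's real peak, and a player doing LLDV only knows what gets advertised to it over HDMI.

```python
# Toy sketch of dynamic (per-scene) vs static (movie-wide) tone mapping.
# Made-up numbers and a plain linear squeeze instead of any real algorithm.

def compress(nits: float, source_peak: float, display_peak: float) -> float:
    """Squeeze a source range into the display's range (toy linear model)."""
    if source_peak <= display_peak:
        return nits
    return nits * display_peak / source_peak

display_peak = 650  # what the panel can actually hit
scenes = [("dark interior", 300), ("daylight exterior", 900), ("sun glint", 1000)]

for name, scene_peak in scenes:
    dynamic = compress(scene_peak, scene_peak, display_peak)   # uses per-scene metadata
    static  = compress(scene_peak, 1000, display_peak)         # uses the movie-wide 1,000-nit peak
    print(f"{name}: dynamic {dynamic:.0f} nits, static {static:.0f} nits")

# dark interior: dynamic 300 nits, static 195 nits  <- static dims a scene that already fit
```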
LLDV, to my knowledge, was meant for games, but streaming services use it. Since the player/game console does the dynamic tone mapping, I think there is no lag; with display-led, different TVs take different amounts of time to do it. Why streaming services use LLDV is beyond me. It's like Dolby MAT, which sends everything out as PCM with the Atmos metadata embedded. Personally, I would rather my AV receiver do the decoding, because that's what it's for.
Technically you can get LLDV on any HDR display with a device like an HDFury (or the cheaper alternatives out now). They just change the HDR metadata values on the fly to match what's being sent via LLDV, so the display treats the signal as regular HDR.
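Very roughly, the trick looks something like this Python sketch (hypothetical field names, nothing like the real EDID or InfoFrame byte layouts): tell the player the display supports LLDV so the player does the tone mapping, then re-tag the outgoing stream as plain HDR10 so an ordinary HDR panel accepts it.

```python
# Conceptual sketch only: hypothetical field names, not the real EDID/InfoFrame formats.

def spoofed_edid(panel_peak_nits: int) -> dict:
    """What the device tells the player, instead of the TV's own EDID."""
    return {
        "dolby_vision": "LLDV",                  # claim player-led Dolby Vision support
        "target_max_luminance": panel_peak_nits, # so the player tone-maps to the panel's range
    }

def retag_as_hdr10(frame_metadata: dict, panel_peak_nits: int) -> dict:
    """What the device sends onward, so the panel just sees HDR10."""
    out = dict(frame_metadata)
    out["eotf"] = "PQ"                                        # the standard HDR10 transfer function
    out["max_display_mastering_luminance"] = panel_peak_nits  # the value the player already mapped to
    out.pop("dolby_vision_metadata", None)                    # strip anything DV-specific
    return out

player_sees = spoofed_edid(panel_peak_nits=750)
tv_sees = retag_as_hdr10({"eotf": "DV-LLDV", "dolby_vision_metadata": "..."}, 750)
```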
At the end of the day, I think the master and the display matter the most. Sony's demo of a beach at sunset had the sun at over 3,000 nits in a 10% window while the rest of the screen sat around 800 nits. Obviously that was demo material mastered at 4,000 nits, and right now such content is rare, but movies will slowly start migrating there now that the mastering monitors are cheaper; even a 1,000-nit mastering monitor is still $10K plus.