OK, I'm using an HDMI 1.3b cable (a spec from around 2007) on a 4K TV hooked to a device that supports 4K, and according to both the TV and the device it's hooked to, I'm getting 2160p at 60 Hz. The device also shows it's receiving an HDR RGB signal (instead of YUV) and that the link supports HDCP 2.2.
How is that possible? Shouldn't an HDMI 1.3b or 1.4 cable top out at 2160p at 30 Hz, and probably carry no HDR signal at all?
I could be wrong, but I thought the differences between HDMI cables were spec/certification only, and that older cables can often carry higher resolutions and refresh rates.
This article may shed some light.
Well, that for a fact isn't true. While there IS no difference between Monster cables that cost hundreds and normal cables, there is a difference between Standard/High Speed and Premium High Speed cables: High Speed is only rated for about 10.2 Gbps, while Premium High Speed is rated for 18 Gbps, which is what 4K at 60 Hz needs. Or so I've read.
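The bandwidth numbers above can be sanity-checked with some quick arithmetic. This is a rough sketch assuming CTA-861 blanking totals (3840x2160 sits inside a 4400x2250 total raster), 24-bit RGB, and TMDS 8b/10b encoding overhead; it's an estimate of why 4K30 fits within a High Speed cable's ~10.2 Gbps rating while 4K60 needs an 18 Gbps link:

```python
# Rough TMDS bandwidth estimate for HDMI video modes.
# Assumptions (not from the thread): CTA-861 total raster sizes,
# 24-bit RGB color, and 8b/10b TMDS encoding (10 bits on the wire
# per 8 bits of video data).

def tmds_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Total TMDS bit rate in Gbit/s across the three data channels."""
    pixel_clock = h_total * v_total * refresh_hz        # pixels/sec
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9  # 8b/10b overhead

# 3840x2160 video uses a 4400x2250 total raster (CTA-861 timing).
print(f"4K30: {tmds_gbps(4400, 2250, 30):.2f} Gbit/s")  # ~8.91, under 10.2
print(f"4K60: {tmds_gbps(4400, 2250, 60):.2f} Gbit/s")  # ~17.82, needs 18 Gbps
```

This also hints at why an old cable can sometimes still work: the rating is a certification floor, not a hard cap, and a short, well-made "High Speed" cable may physically pass an 18 Gbps signal even though it was never certified for it.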