Issue with 4K/UHD Remux: poor playback quality + 0% GPU usage

euphoria4949

Hi everyone, I didn't know where to post this.
I don't know a huge amount about video tech and playback, so please explain like I'm 5.

My issue is: every 2160p Blu-ray rip, whether an 80GB HEVC remux or a 10GB compressed encode, looks the same, and that quality isn't great.
I've played two different copies of the same movie side by side: one a 1080p .mkv (5GB, 5,000kbps bitrate) and the other a 2160p HEVC remux .mkv (80GB+, 65,000kbps), yet both look pretty much the same. A 1080p YouTube video looks just as sharp and clear.
I've had this issue for a while now, so over recent months I've tried a few of the usual suggestions: tweaking settings, VLC and MPC-HC, Windows 7 and Windows 10 (both x64), .mkv and other containers. Still, nothing looks that good, and it certainly doesn't look as good as actual 1080p Blu-ray disc playback.

This is a screenshot of a movie: 2160p HEVC Remux BDrip, 75GB file size, 65,000kbps bitrate - https://i.imgur.com/Tsrxdcd.jpg
I had another copy of the same movie in much lower quality, as my son's laptop is a bit old now and struggles with high bitrates. His copy was around 8GB, 1080p, 7,000kbps, and I could barely tell the difference between them. I don't have a screenshot, as he watched it and deleted it.
During playback on both VLC and MPC-HC, I've noticed my CPU is pretty much pinned at 100% usage while my GPUs are unused: literally 0-3% load, sitting at 200MHz idle clock speeds.
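
For reference, this is a quick way to check what the video stream actually is (a minimal sketch, assuming Python and ffprobe from FFmpeg are installed and on the PATH; movie.mkv is a placeholder for the actual file):

# Inspect the first video stream with ffprobe (ships with FFmpeg).
# If pix_fmt comes back as yuv420p10le, the file is 10-bit HEVC;
# when a player can't hardware-decode that format, decoding falls
# back to the CPU, which would match the 100% CPU / 0% GPU pattern.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,profile,width,height,pix_fmt",
     "movie.mkv"],  # placeholder filename
    capture_output=True, text=True, check=True,
)
print(result.stdout)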

System specs:
i7-3770K @ 5GHz
32GB RAM
GTX 980 Tis (2-way SLI) - I've tried disabling one in case it was an SLI issue, but the GPUs still sit at 0% load.
Windows 10 (also tried Win7) x64
Everything on SSDs, no internal HDDs
VLC and MPC-HC both at default settings.
All drivers and players up to date, and no third-party codec packs installed

As I said, my knowledge isn't great, but shouldn't the players mostly be using my GPUs instead of the CPU? Could this be the reason, or am I way off?
Any help or advice would be greatly appreciated.
Thank you
 

euphoria4949



Both really. Sometimes if it's just me I play it through a monitor, but if the family are watching too, then through a TV.
 

gasaraki

If you are watching everything on a 1080p monitor and a 1080p TV, you will not see any difference between 1080p and 4K video, because both are being displayed at 1080p. Higher bitrates only help in highly complex scenes, like fires or explosions with lots of color gradients, where the extra bits reduce macroblocking.
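
To put rough numbers on it (a back-of-the-envelope sketch in Python; the bitrates come from the figures quoted above, and 24fps is assumed for both files):

# Average bits spent per pixel per frame for the two files above.
def bits_per_pixel(bitrate_kbps, width, height, fps=24.0):
    return bitrate_kbps * 1000 / (width * height * fps)

print(bits_per_pixel(65_000, 3840, 2160))  # 2160p remux  -> ~0.33 bpp
print(bits_per_pixel(5_000, 1920, 1080))   # 1080p encode -> ~0.10 bpp

So the remux spends roughly three times the bits on every pixel, but that headroom mostly shows up in the hardest scenes.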
 

euphoria4949



Wait, hold on, that's not strictly true. (Firstly, I'm not using a 1080p monitor or TV.) But the main point here is that downsampling an image, like 2160p onto a 1080p display, does make a noticeable improvement in image quality. Yes, the display would still only be 1920x1080, but the image being processed carries much more detail and is effectively just shrunk down (so to speak). Downsampling is common precisely because it makes a visible difference; it's not as good as true 2160p playback on a 2160p monitor/TV, of course, but it's still very noticeable.
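
If anyone wants to see the effect for themselves, here's a rough sketch (assuming Python with Pillow installed; frame_4k.png and frame_1080p.png are hypothetical screenshots of the same frame from each copy):

# Downsample a 2160p frame to 1080p and compare it with a native
# 1080p frame of the same shot (filenames are placeholders).
from PIL import Image

uhd = Image.open("frame_4k.png")  # 3840x2160 source frame
downsampled = uhd.resize((1920, 1080), Image.Resampling.LANCZOS)
downsampled.save("frame_4k_to_1080p.png")  # compare with frame_1080p.png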

Anyway, my TV is 2160p and my monitor is 3440x1440.
 

gasaraki



I know downsampling and its relationship with rendered graphics. But you are saying that, between the best-quality 1080p Blu-ray movie displayed on a 1080p TV and a 4K UHD Blu-ray displayed on the same 1080p TV, the 4K video would look better...
 

euphoria4949

Sorry, I read that a few times and I think I confused myself, haha (it's late). Is this what you meant?

Scenario 1 - An FHD (1080p) Blu-ray played on a TV/monitor at its native resolution of 1920x1080.
Scenario 2 - A UHD (2160p) Blu-ray played on the same TV/monitor, downsampled from UHD.

Scenario 2 would be noticeably better quality: sharper, clearer... just better, as the image being rendered is 4x the resolution and most likely 3-4x the bitrate, simply shrunk to fit the TV/monitor's resolution.
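
Quick sanity check on the "4x the resolution" figure (trivial Python, using the resolutions above):

fhd = 1920 * 1080  # 2,073,600 pixels
uhd = 3840 * 2160  # 8,294,400 pixels
print(uhd / fhd)   # 4.0 -> the UHD frame carries 4x the pixels before downscaling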