Uncompressed video is absolutely massive, so any streamed content has to be heavily compressed (encoded) to make it workable even on high-speed internet services.
The compression/encode process uses really complex algorithms to reduce the video size while maintaining as high a quality as possible.
These algorithms are designed to be as simple as possible to decode. That makes sense because every time you watch the content you have to decode it first, so a simpler decode process makes it possible to watch the content on older and slower computers, and uses less power (meaning longer battery life) on portable devices.
BUT, they all take massive computational resources to ENcode the content in the first place. That's why you can watch (decode) 4K content live, but you can't get anywhere near streaming (encoding) 4K content live.
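To put numbers on just how much compression is involved, here's a rough back-of-the-envelope sketch. It assumes 8-bit 4:2:0 video (1.5 bytes per pixel, typical for consumer content) and the ~40 Mbps 4K@60 streaming figure quoted later in this thread; both are ballpark assumptions, not measurements:

```python
# Rough compression-ratio estimate for 4K@60fps video.
# Assumes 8-bit 4:2:0 chroma subsampling (1.5 bytes/pixel) and an
# assumed ~40 Mbps streamed bitrate.
width, height, fps = 3840, 2160, 60
bytes_per_pixel = 1.5                  # 8-bit 4:2:0
raw_mbps = width * height * bytes_per_pixel * 8 * fps / 1e6
streamed_mbps = 40                     # assumed YouTube 4K@60 average
ratio = raw_mbps / streamed_mbps
print(f"raw: {raw_mbps:.0f} Mbps, streamed: {streamed_mbps} Mbps, "
      f"compression ~{ratio:.0f}:1")
```

That's roughly a 150:1 reduction, which is why the encoder needs so much horsepower.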
Most graphics cards (and integrated GPUs) have dedicated video decode hardware built in. That means well-programmed software can offload key stages of the video decode process from the CPU to that hardware, which is tailor-made to decode video.
The issue is that the software you're using to play the videos needs to be aware of your hardware and know how to use the dedicated decode hardware. Some software is better at this than others.
Have you tried other web browsers? I get tons of dropped frames trying to play 1080p@60 content on my Surface using Google Chrome, plus it melts the battery and the device gets crazy hot. But if I play the same content in IE or Edge it plays seamlessly, stays cool, and lasts 3 times as long.
The Intel HD 4400 graphics has dedicated decode hardware. Chrome doesn't use it effectively but IE does... which makes all the difference in the world.
What's your graphics card? Have you tried different browsers yet? Got the latest drivers installed?
Sorry for the confusion with my initial post - I thought you meant streaming as in live game-streaming.
Why would you say your 30mbps connection isn't the problem? If you were trying to watch 4k@60fps then that's definitely not enough. I just tested on our gigabit connection and, according to Task Manager, 4k@60fps from YouTube is averaging about 40mbps with spikes north of 60mbps.
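A quick sanity check on that math. Treating the ~40 Mbps average from my test as a ballpark figure, a 30 Mbps link simply can't keep the buffer filled:

```python
# If the stream averages 40 Mbps but the link only delivers 30 Mbps,
# the playback buffer drains faster than it fills. Both figures are
# the ballpark numbers from the test above, not exact measurements.
stream_mbps = 40   # assumed average 4K@60 bitrate
link_mbps = 30     # the connection in question
# Seconds of video received per second of wall-clock time:
fill_rate = link_mbps / stream_mbps
# So every 60 s of playback needs this many seconds of downloading:
download_secs_per_minute = 60 / fill_rate
print(f"each 60 s of video needs {download_secs_per_minute:.0f} s to download")
```

In other words, no matter how far ahead the buffer looks at first, playback eventually has to stall or drop quality.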
Also, as far as I know, YouTube isn't using any formats for 4k that can be hardware decoded. It's completely software decoded on the CPU, so you'd better have a beefy CPU. Again, in my test, I'm at about 80% CPU usage with a Xeon E3-1231 v3 turboing all the way up to 3.8 GHz.
Why would you say your 30mbps connection isn't the problem?
Because I can clearly see that the video is downloading (I don't know if that's the right term) faster than I'm playing it (the light grey bar is ahead of the red one...).
Have you tried other web browsers?
I have tried Microsoft Edge like you told me to, but the video was like in slow motion, and after 5 seconds it would skip all the frames to catch up with the audio. Btw, I was using Firefox when I posted this thread.
What's your graphics cards?
My graphics card is an Asus HD 6570.
Got the latest drivers installed?
I have tried updating my drivers, but the 4K streaming from YouTube was pretty much the same.
YouTube is known to have performance issues with Firefox. What is your CPU utilization while watching the video? What CPU do you have?
Since you're using Firefox with HTML5 video, I can guarantee YouTube is serving the VP9 format. Your GPU doesn't support hardware acceleration for VP9 (in fact, I don't know of any GPU currently on the market that does). That means your CPU is doing all the decoding.
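To put that software-decode load in perspective, here's the frame budget a CPU decoder is up against at 4K@60 (pure arithmetic, no codec specifics):

```python
# At 4K@60fps, a software decoder has under 17 ms to fully decode
# each frame, and must produce roughly half a billion pixels of
# output every second.
width, height, fps = 3840, 2160, 60
frame_budget_ms = 1000 / fps
pixels_per_sec = width * height * fps
print(f"per-frame budget: {frame_budget_ms:.1f} ms")
print(f"throughput: {pixels_per_sec / 1e6:.0f} megapixels/s")
```

Miss that per-frame budget even occasionally and you get exactly the dropped frames and "slow motion" playback described above.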