I made a quick test with a 1080p H.264 video I have on hand. I'm running mpv on GNOME Wayland on a Skylake i7-6700HQ (45 W TDP) laptop. It draws 21 W at idle; 25 W and ~10% CPU with hardware decoding (iGPU); and 31 W and ~70% CPU with software decoding.
I also tried a 10-bit 4K HEVC video, and it's unwatchable with both software and hardware decoding, at 58 W (powersave governor) / 65 W (performance governor). I haven't tested on Windows.
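For anyone wanting to reproduce this kind of measurement: mpv can be forced onto either decode path per run, and CPU package power read with turbostat. A minimal sketch, assuming VA-API on the iGPU; the file name and sampling interval are placeholders:

Code:
$ mpv --hwdec=vaapi "1080p-h264-sample.mkv"   # iGPU decode; mpv logs "Using hardware decoding (vaapi)" when it engages
$ mpv --hwdec=no "1080p-h264-sample.mkv"      # force software decoding
$ sudo turbostat --Summary --quiet --show PkgWatt --interval 5   # CPU package watts while playing

Note that turbostat reports CPU package power only; whole-system figures like the 21-31 W above are better matched with powertop or a wall meter.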
Firefox 71 Landing Wayland DMA-BUF Textures Support
-
Originally posted by atomsymbol: The meaning is obvious.
If it's the 6700k with HTT, it would likely mean the physical cores have barely more than a quarter of their computing power left unutilized, which completely backs my point.
-
Originally posted by aufkrawall: What is "300%" supposed to mean? Total CPU usage is also questionable with hyperthreading enabled.
Originally posted by aufkrawall: Anyhow, it's the peak bitrate that's crucial, and that YouTube 4k 60fps VP9 8 bit video caused dropped frames for me with my previous 2500k OC in Firefox.
Originally posted by aufkrawall: Not to speak of mobile devices, noise, heat and limited background capabilities due to expensive software decoding.

- It appears CPUs, including mobile ones, are going to have matrix instructions due to the rising popularity of NN AI algorithms
- Per-core power consumption will be smaller by a factor of 2 in a couple of years
-
Originally posted by atomsymbol: CPU utilization is below 300% in both cases.
Anyhow, it's the peak bitrate that's crucial, and that YouTube 4k 60fps VP9 8 bit video caused dropped frames for me with my previous 2500k OC in Firefox.
Not to speak of mobile devices, noise, heat and limited background capabilities due to expensive software decoding.
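(As a sketch of how such frame drops can be checked outside Firefox: mpv's terminal status line shows a "Dropped:" counter once decoding falls behind, and the bundled stats.lua overlay breaks it down; in Firefox itself, YouTube's "Stats for nerds" overlay reports dropped frames. The file name below is a placeholder.)

Code:
$ mpv --hwdec=no "4k60-vp9-sample.webm"
# status line shows e.g. "Dropped: 37" when software decode can't keep up;
# press i for mpv's built-in stats page with a frame-drop breakdown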
-
Originally posted by aufkrawall: My 6700k is well overchallenged with 4k 60fps HEVC 10 bit high bitrate.

Code:
$ mplayer -vo yuv4mpeg:file=/dev/null "4K HEVC 59.940 Broadcast Capture Sample.mkv"
$ mplayer -vo yuv4mpeg:file=/dev/null "Big Buck Bunny 60fps 4K - Official Blender Foundation Short Film-aqz-KE-bpKQ.mkv"
HEVC 10-bit 59.940fps source: https://kodi.wiki/view/Samples#4K_.28UltraHD.29 (Korean ATSC 3.0 satellite TV capture sample)
VP9 8-bit 4K 60Hz source: https://www.youtube.com/watch?v=aqz-KE-bpKQ (per the Big Buck Bunny file name above)
CPU utilization is below 300% in both cases.
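To ground a figure like "300%": top-style meters count 100% per logical core, so 300% is roughly three busy logical cores, which with hyperthreading is not the same as three busy physical cores (the point argued above). Per-thread usage can be sampled directly; a minimal sketch with sysstat's pidstat, the file name again a placeholder:

Code:
$ mplayer -vo yuv4mpeg:file=/dev/null "sample.mkv" &
$ pidstat -u -t -p "$(pgrep -x mplayer)" 1   # per-thread %CPU, sampled every second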
-
Originally posted by starshipeleven: Power consumption of CPU decoding is still orders of magnitude higher than using a dedicated hardware decoder, which is significant for most modern computing usage (laptop and mobile devices), while still mostly irrelevant for a desktop system.
Mobile devices (smartphones) have much slower CPUs (unable to decode 1080p and 4K videos) but the same display resolutions (1080p, 4K) as notebooks and desktops. 4K is the maximum resolution a smartphone will ever need, because the benefit of going to 8K+ on a phone-sized screen is imperceptible. Smartphone CPU performance is going to steadily improve while the maximum display resolution stagnates at 4K, which implies there is going to be a point in the future where a smartphone CPU can decode 4K video without requiring assistance from the GPU.
Originally posted by cl333r: What? You say it like it's common otherwise.
AFAIK no CPU, even with AVX2, comes anywhere near the power efficiency of a GPU's dedicated video decode block (if both are from the same year and vendor).
-
Originally posted by starshipeleven: Yes. Disagreeing does not make you right.
And it's so loud that you hear it from inside a wardrobe too? It's worse than I thought. What is that, a rack server?
-
Originally posted by atomsymbol:
Some notes:
- Nowadays 3 CPU cores with AVX2 are able to decode any video type up to 4K 10-bit HDR 60Hz; the CPU might only have issues handling 8K videos
- GPU-assisted decoding is preferable when it results in smaller system power consumption compared to CPU-only video decoding, or when the CPU is busy handling other tasks in addition to video decoding
- Decoded 4K 10-bit HDR 60Hz requires about 1-2 GiB/s of memory bandwidth; main memory bandwidth and PCI Express bandwidth are greater than 2 GiB/s
- From a historical perspective, HW acceleration of video decoding in x86 CPUs started with the Pentium MMX (released in January 1997)
For me it's that, plus it's a pain if you're compiling and want to watch a video while you wait. People who use Blender or TensorFlow on their workstations might have the same problem. I agree with others that it's also a waste, noise- and power-wise, if you're on a mobile device.
It hasn't driven me to switch to Chromium; however, I have gone out of my way to watch videos in my Windows VM (VFIO) or on other devices.
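(For reference, once VA-API decoding did land in later Firefox releases, enabling it on Wayland looks roughly like this; a sketch only, since the pref name is from those later versions and Firefox 71 merely lands the DMA-BUF textures groundwork:)

Code:
$ MOZ_ENABLE_WAYLAND=1 firefox
# then set media.ffmpeg.vaapi.enabled = true in about:config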
-
Originally posted by cl333r: No.
There's this thing where, when the CPU is under heavier load, the CPU fan spins faster and makes more noise (that I can actually hear); I thought it was obvious.