Firefox 71 Landing Wayland DMA-BUF Textures Support


  • phoronix
    started a topic Firefox 71 Landing Wayland DMA-BUF Textures Support


    Phoronix: Firefox 71 Landing Wayland DMA-BUF Textures Support

    Firefox 71 is bringing another important Wayland improvement...

    http://www.phoronix.com/scan.php?pag...ayland-DMA-BUF

  • GrayShade
    replied
    I made a quick test with a 1080p H.264 video I have on hand. I'm running mpv on GNOME Wayland on a Skylake i7-6700HQ (45 W TDP) laptop. It draws 21 W at idle; with hardware decoding (iGPU) it draws 25 W at 10% CPU, and with software decoding 31 W at 70% CPU.

    I also tried a 10-bit 4K HEVC video, and it's unwatchable with both software and hardware decoding, drawing 58 W (powersave governor) / 65 W (performance). I haven't tested on Windows.
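    For anyone wanting to reproduce this comparison, a minimal sketch (the filename is a placeholder; it assumes an Intel iGPU with working VA-API and the intel-rapl powercap interface, and note that RAPL reports CPU package power rather than the whole-system draw measured above):
    Code:
    # Hardware decoding via VA-API:
    $ mpv --hwdec=vaapi --no-audio video.mkv
    # Forced software decoding:
    $ mpv --hwdec=no --no-audio video.mkv
    # CPU package power: sample the RAPL energy counter (microjoules) 1 s apart
    $ e1=$(cat /sys/class/powercap/intel-rapl:0/energy_uj); sleep 1
    $ e2=$(cat /sys/class/powercap/intel-rapl:0/energy_uj)
    $ echo "package power: $(( (e2 - e1) / 1000000 )) W"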
    Last edited by GrayShade; 10-11-2019, 04:49 PM.



  • pal666
    replied
    Originally posted by atomsymbol
    Nowadays 3 CPU cores with AVX2 …
    Not every CPU sold nowadays supports AVX2.



  • aufkrawall
    replied
    Originally posted by atomsymbol
    The meaning is obvious.
    Ah, yes? Did you even specify the CPU you used?
    If it's the 6700K with HTT, that would likely mean the physical cores have barely more than a quarter of their computing power unutilized, which completely backs my point.



  • atomsymbol
    replied
    Originally posted by aufkrawall
    What is "300%" supposed to mean? CPU total usage is also questionable with hyperthreading enabled.
    The meaning is obvious.

    Originally posted by aufkrawall
    Anyhow, it's the peak bitrate that's crucial, and that YouTube 4K 60fps VP9 8-bit video caused dropped frames for me with my previous 2500K OC in Firefox.
    The 2500K was released in 2011 and does not support AVX2. I don't have a non-AVX2 CPU at the moment, so I am unable to test that case on short notice.

    Originally posted by aufkrawall
    Not to speak of mobile devices, noise, heat and limited background capabilities due to expensive software decoding.
    • It appears CPUs, including mobile ones, are going to gain matrix instructions due to the rising popularity of neural-network AI algorithms
    • Per-core power consumption will be smaller by a factor of 2 in a couple of years



  • aufkrawall
    replied
    Originally posted by atomsymbol
    CPU utilization is below 300% in both cases.
    What is "300%" supposed to mean? CPU total usage is also questionable with hyperthreading enabled.
    Anyhow, it's the peak bitrate that's crucial, and that YouTube 4K 60fps VP9 8-bit video caused dropped frames for me with my previous 2500K OC in Firefox.
    Not to speak of mobile devices, noise, heat and limited background capabilities due to expensive software decoding.



  • atomsymbol
    replied
    Originally posted by aufkrawall
    My 6700K is thoroughly overwhelmed by high-bitrate 4K 60fps HEVC 10-bit content.
    Code:
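    # Decode-only benchmark: mplayer writes the raw YUV4MPEG frames to /dev/null,
    # so the results reflect pure software decoding with no display or GPU work.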
    $ mplayer -vo yuv4mpeg:file=/dev/null "4K HEVC 59.940 Broadcast Capture Sample.mkv"
    $ mplayer -vo yuv4mpeg:file=/dev/null "Big Buck Bunny 60fps 4K - Official Blender Foundation Short Film-aqz-KE-bpKQ.mkv"
    HEVC 10-bit 4K 60Hz source:
    https://kodi.wiki/view/Samples#4K_.28UltraHD.29 (HEVC 10-bit 59.940fps, Korean ATSC 3.0 satellite TV capture sample)

    VP9 8-bit 4K 60Hz source: the Big Buck Bunny 4K 60fps file named in the second command above.

    CPU utilization is below 300% in both cases.
    Last edited by atomsymbol; 10-10-2019, 06:57 AM.



  • atomsymbol
    replied
    Originally posted by starshipeleven
    Power consumption of CPU decoding is still orders of magnitude higher than using a dedicated hardware decoder, which is significant for most modern computing usage (laptop and mobile devices), while still mostly irrelevant for a desktop system.
    An 8-core 35-watt notebook CPU consumes about 35/8 ≈ 4.4 watts per core under load. If a single core is able to decode a video (without boosting too much), the decoding process consumes roughly 4-5 watts of power.

    Mobile devices (smartphones) have much slower CPUs (unable to software-decode 1080p and 4K video) but the same display resolutions (1080p, 4K) as notebooks and desktops. 4K is the maximum resolution a smartphone will ever need, because the benefit of going to 8K or beyond on a phone-sized screen is imperceptible. Smartphone CPU performance will steadily improve while the maximum display resolution stagnates at 4K, which implies there will be a point in the future where a smartphone CPU can decode 4K video without any assistance from the GPU.

    Originally posted by cl333r
    What? You say it like it's common otherwise.
    AFAIK no CPU, even with AVX2, comes anywhere near the GPU's specialized video-decode hardware in power consumption (if both are from the same year and vendor).
    Please provide computations and/or measurements (wall-outlet readings or your notebook's battery readings) so that we can see what the actual difference in power consumption or battery time is.
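    One easy data point: on most Linux laptops the battery's discharge rate can be read from sysfs while running each decoder. A minimal sketch (BAT0 and the power_now attribute are assumptions; some batteries expose current_now/voltage_now instead):
    Code:
    # Discharge rate in microwatts, converted to watts (run while on battery)
    $ p=$(cat /sys/class/power_supply/BAT0/power_now)
    $ echo "system draw: $(( p / 1000000 )) W"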



  • cl333r
    replied
    Originally posted by starshipeleven
    Yes. Disagreeing does not make you right.

    And it's so loud that you hear it from inside a wardrobe too? It's worse than I thought. What is that, a rack server?
    If your hearing isn't bad, you don't need much noise to hear it, do you?



  • Jabberwocky
    replied
    Originally posted by atomsymbol

    Some notes:
    • Nowadays 3 CPU cores with AVX2 are able to decode any video type up to 4K 10-bit HDR 60Hz; the CPU might only have issues handling 8K videos
      • GPU-assisted decoding is preferable when it results in lower system power consumption than CPU-only video decoding, or when the CPU is busy handling other tasks in addition to video decoding
      • Decoded 4K 10-bit HDR 60Hz video requires about 1-2 GiB/s of memory bandwidth (a quick arithmetic check follows below this quote). Main memory bandwidth and PCI Express bandwidth are both greater than 2 GiB/s.
    • From historical perspective, HW acceleration of video decoding in x86 CPUs started with the Pentium MMX (released in January 1997)
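    As a rough sanity check of the 1-2 GiB/s figure quoted above, assuming 4:2:0 chroma subsampling (1.5 samples per pixel) and 2 bytes per 10-bit sample:
    Code:
    # 3840x2160 pixels x 1.5 samples/pixel x 2 bytes/sample x 60 frames/s, in GiB/s
    $ echo '3840*2160*1.5*2*60/(1024^3)' | bc -l
    # ≈ 1.39 GiB/s, consistent with the quoted 1-2 GiB/s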
    I would not be surprised if that's the reason why "GPU-assisted decoding" has not been implemented. It's clearly not a show-stopper, but it is still a pain for some users.

    For me it's that, and it's a pain if you're compiling and want to watch a video while you wait. People who use Blender or TensorFlow on their workstations might have the same problem. I agree with others that it's also a waste, noise- and power-wise, on a mobile device.

    It has not driven me to switch to Chromium; however, I have gone out of my way to watch videos in my Windows VM (VFIO) or on other devices.

