Firefox 71 Landing Wayland DMA-BUF Textures Support


  • GrayShade
    replied
    I made a quick test with a 1080p H.264 video I have on hand. I'm running mpv on GNOME Wayland on a Skylake i7-6700HQ (45 W TDP) laptop. It draws 21 W at idle; 25 W at 10% CPU with hardware decoding (iGPU); and 31 W at 70% CPU with software decoding.

    I also tried with a 10-bit 4K HEVC video and it's unwatchable with both software and hardware decoding, at 58 W (powersave governor) / 65 W (performance). I haven't tested on Windows.
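    In case anyone wants to reproduce this: a minimal sketch in Python, assuming an Intel CPU that exposes RAPL under /sys/class/powercap (often readable only as root on recent kernels) and a placeholder clip.mkv. It samples package power while mpv plays the clip, once with VA-API hardware decoding and once with software decoding:

        # Sketch: average CPU package power during mpv playback, via Intel RAPL.
        # Assumptions: Linux, Intel CPU with the intel_rapl driver loaded, and
        # permission to read energy_uj. The counter wraps around eventually,
        # which is fine for short samples like this.
        import subprocess, time

        RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

        def read_uj():
            with open(RAPL) as f:
                return int(f.read())

        def avg_watts(mpv_args, seconds=30):
            proc = subprocess.Popen(["mpv", "--no-audio", *mpv_args],
                                    stdout=subprocess.DEVNULL,
                                    stderr=subprocess.DEVNULL)
            start_uj, t0 = read_uj(), time.time()
            time.sleep(seconds)
            watts = (read_uj() - start_uj) / 1e6 / (time.time() - t0)
            proc.terminate()
            return watts

        # clip.mkv is a placeholder; substitute any test video.
        print("hwdec:", avg_watts(["--hwdec=vaapi", "clip.mkv"]), "W")
        print("swdec:", avg_watts(["--hwdec=no", "clip.mkv"]), "W")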
    Last edited by GrayShade; 11 October 2019, 04:49 PM.



  • pal666
    replied
    Originally posted by atomsymbol
    • Nowadays 3 CPU cores with AVX2
    Not every CPU sold nowadays has AVX2.
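    If you're not sure about your own machine, a quick sketch (Linux/x86 only; it just reads /proc/cpuinfo) to check whether the running CPU advertises AVX2:

        # Check whether the running CPU reports the avx2 flag (Linux, x86).
        with open("/proc/cpuinfo") as f:
            flags = next(line for line in f if line.startswith("flags")).split()
        print("AVX2 supported" if "avx2" in flags else "no AVX2")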



  • aufkrawall
    replied
    Originally posted by atomsymbol
    The meaning is obvious.
    Ah, yes? Did you even specify which CPU you used?
    If it's the 6700K with HTT, it would likely mean the physical cores have barely more than a quarter of their computing power left unutilized, which completely backs my point.
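    For illustration, a back-of-the-envelope of that "quarter" figure: a sketch assuming a 4-core/8-thread part and that the scheduler fills idle physical cores before HT siblings:

        # Back-of-the-envelope for a "300%" reading on a 4C/8T CPU like the 6700K.
        # Assumption: the scheduler prefers idle physical cores, so 300% of
        # logical utilization roughly equals 3 physical cores fully busy.
        logical_pct = 300                # as reported by top (100% per thread)
        physical_cores = 4
        busy_cores = logical_pct / 100   # ~3 physical cores saturated
        free = 1 - busy_cores / physical_cores
        print(f"~{free:.0%} of physical-core capacity left")  # ~25%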



  • aufkrawall
    replied
    Originally posted by atomsymbol
    CPU utilization is below 300% in both cases.
    What is "300%" supposed to mean? Total CPU usage is also questionable with hyperthreading enabled.
    Anyhow, it's the peak bitrate that's crucial, and that YouTube 4K 60 fps VP9 8-bit video caused dropped frames for me in Firefox with my previous overclocked 2500K.
    Not to speak of mobile devices: noise, heat and limited background capabilities due to expensive software decoding.



  • cl333r
    replied
    Originally posted by starshipeleven View Post
    Yes. Disagreeing does not make you right.


    And it's so loud that you hear it from inside a wardrobe too? It's worse than I thought. What is that, a rack server?
    If you don't have bad hearing, you don't need much noise to hear it, do you?



  • Jabberwocky
    replied
    Originally posted by atomsymbol

    Some notes:
    • Nowadays 3 CPU cores with AVX2 are able to decode any video type up to 4K 10-bit HDR 60 Hz; the CPU might only have issues handling 8K videos
      • GPU-assisted decoding is preferable when it results in smaller system power consumption compared to CPU-only video decoding or when the CPU is busy handling other tasks in addition to video decoding
      • Decoded 4K 10-bit HDR 60Hz requires about 1-2 GiB/s of memory bandwidth. Main memory bandwidth and PCI Express bandwidth are greater than 2 GiB/s.
    • From a historical perspective, HW acceleration of video decoding in x86 CPUs started with the Pentium MMX (released in January 1997)
    I would not be surprised if that's the reason why "GPU-assisted decoding" has not been implemented. Clearly not a show-stopper, but it's still a pain for some users.

    For me it's exactly that: it's a pain if you're compiling and want to watch a video while you wait. People who use Blender or TensorFlow on their workstations might have the same problem. I agree with others that it's also a waste, noise- and power-draw-wise, if you're on a mobile device.

    It hasn't made me switch to Chromium, but I have gone out of my way to watch videos in my Windows VM (VFIO) or on other devices.
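    As a side note, the 1-2 GiB/s bandwidth figure quoted above checks out with a quick back-of-the-envelope, assuming 4:2:0 chroma subsampling stored in a 16-bits-per-sample layout such as P010:

        # Memory bandwidth of decoded 4K 10-bit 60 Hz video frames.
        # Assumption: 4:2:0 subsampling, P010-style storage (10 bits padded to
        # 16, i.e. 2 bytes per sample). Writing frames alone is ~1.4 GiB/s;
        # a matching read by the compositor roughly doubles that.
        width, height, fps = 3840, 2160, 60
        samples_per_pixel = 1.5          # Y + U/4 + V/4
        bytes_per_sample = 2
        frame_bytes = width * height * samples_per_pixel * bytes_per_sample
        print(f"{frame_bytes * fps / 2**30:.2f} GiB/s")  # ~1.39 GiB/s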



  • starshipeleven
    replied
    Originally posted by cl333r View Post
    No.
    Yes. Disagreeing does not make you right.

    There's this thing where, when the CPU is under heavier load, the CPU fan spins faster and makes more noise (which I can actually hear); I thought that was obvious.
    And it's so loud that you hear it from inside a wardrobe too? It's worse than I thought. What is that, a rack server?



  • cl333r
    replied
    Originally posted by starshipeleven View Post
    Anything not 4K is basic for modern desktop hardware.
    No. We wouldn't even agree on what exactly is "modern" and what is not; hell, I can't even decide for myself, never mind agreeing with your opinion.
    Originally posted by starshipeleven View Post
    Then I don't understand wtf you were saying a few posts above: "I hate it when my CPU starts making more noise."
    There's this thing where, when the CPU is under heavier load, the CPU fan spins faster and makes more noise (which I can actually hear); I thought that was obvious.



  • starshipeleven
    replied
    Originally posted by cl333r View Post
    10-bit 1080p HEVC video isn't an easy thing
    Anything not 4K is basic for modern desktop hardware. You could software-decode 4K with an i5 three years ago or more.

    And I don't trust heatsinks
    The heatsink is the metal component between the fan and the CPU. You always have a heatsink. Some people (like me) call the metal piece AND the fan together a "heatsink", because that's an "active heatsink".

    the case itself is inside a wardrobe, so normally I don't hear the CPU fan at all.
    Then I don't understand wtf you were saying a few posts above: "I hate it when my CPU starts making more noise."
    Is this noise in your head only?

    Meanwhile, I have the case under the desk; it's compiling stuff while I watch YouTube and whatever, and I don't hear it.
    Last edited by starshipeleven; 08 October 2019, 06:56 PM.



  • cl333r
    replied
    Originally posted by starshipeleven View Post
    I'm just saying that if you hear your desktop PC fan ramping up when it's doing something as basic as media decoding, you are using your desktop very wrong; that heatsink is seriously underpowered.
    10-bit 1080p HEVC video isn't an easy thing; I'm not watching old MPEG-2 movies, mind you.
    And I don't trust heatsinks; I prefer a quiet CPU fan because it can actually speed up when the CPU is fully loaded, so it's not overheating. Only the monitor with the mouse/keyboard are in the room; the case itself is inside a wardrobe, so normally I don't hear the CPU fan at all.

