Firefox 71 Landing Wayland DMA-BUF Textures Support

  • #31
    Originally posted by cl333r View Post
    I agree, but I don't agree at all with the "mostly irrelevant on desktop" part, because I hate it when my CPU starts making more noise.
    Your heatsink sucks, get a decent one. I buy heatsinks to be silent enough while under load.



    • #32
      Originally posted by starshipeleven View Post
      Your heatsink sucks, get a decent one. I buy heatsinks to be silent enough while under load.
      I won't camouflage the problem by buying a heatsink and hoping it will actually work as advertised. I just don't have that mindset; I prefer to use the right tool for the job, and the right one for video decoding and presentation is the video card, I'm sorry to break it to you.



      • #33
        Funny how this "Why no hardware video decoding?" question/comment comes up EVERY time a new feature lands in Firefox or Chrome. I wholeheartedly agree that it is a much-needed feature; look at even current-year SBCs that can barely handle 8-bit 1080p H.265 in software (with intrinsics).

        As discussed before in the many other threads here, it used to boil down to the fact that the Linux hardware video decoding standards were/are very loose and the route you took was largely hardware-vendor dependent (barring any wrappers). Now they're trying to push V4L2 as the standard, especially since it has mem2mem and is more modular and vendor-agnostic.
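
        Just to make the mem2mem idea concrete, here is a minimal sketch (my own, not from any browser's code) of how an application could probe a V4L2 node for mem2mem support. The /dev/video0 path is only an assumed example; real code would enumerate /dev/video*:

        Code:
        #include <fcntl.h>
        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>
        #include <sys/ioctl.h>
        #include <linux/videodev2.h>

        int main(void)
        {
            const char *path = "/dev/video0";   /* assumed example node */
            int fd = open(path, O_RDWR | O_NONBLOCK);
            if (fd < 0) { perror("open"); return 1; }

            struct v4l2_capability cap;
            memset(&cap, 0, sizeof(cap));
            if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {
                perror("VIDIOC_QUERYCAP");
                close(fd);
                return 1;
            }

            /* Prefer device_caps when the driver reports per-node capabilities. */
            unsigned int caps = (cap.capabilities & V4L2_CAP_DEVICE_CAPS)
                                    ? cap.device_caps : cap.capabilities;

            /* Mem2mem decoders take compressed bitstream buffers on the OUTPUT
             * queue and hand back decoded frames on the CAPTURE queue. */
            if (caps & (V4L2_CAP_VIDEO_M2M | V4L2_CAP_VIDEO_M2M_MPLANE))
                printf("%s (%s): mem2mem capable\n", path, (const char *)cap.card);
            else
                printf("%s (%s): not a mem2mem device\n", path, (const char *)cap.card);

            close(fd);
            return 0;
        }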

        If it were at the bottom of their priority list, we wouldn't have the progress we have now. I don't think we're far off from a working and reliable hardware video decoding implementation in Firefox and Chrome, especially since V4L2 work has improved drastically in the past few years.

        More on topic, this Wayland feature is a VERY good one to have for the future; it alone means less power consumption by not needing to make extra copies to/from the GPU to change/display each frame. DMA-BUF is an amazing concept, and since they started implementing it all through the Linux world behind the scenes (especially in the kernel), memory efficiency has skyrocketed. Why copy if you can reuse the same buffers?
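
        For anyone curious what "reuse the same buffers" looks like in practice, below is a rough sketch of the usual zero-copy path on Linux: wrapping an existing dmabuf file descriptor in an EGLImage and binding it as a GL texture, so the GPU samples the producer's buffer directly instead of a CPU-side copy. This is only an illustration assuming a single-plane XRGB8888 buffer and the EGL_EXT_image_dma_buf_import / GL_OES_EGL_image extensions; it is not Firefox's actual implementation.

        Code:
        /* Sketch: import an existing dmabuf as a GL texture via EGL (zero-copy).
         * All inputs (fd, size, stride) are hypothetical; error handling is minimal. */
        #include <EGL/egl.h>
        #include <EGL/eglext.h>
        #include <GLES2/gl2.h>
        #include <GLES2/gl2ext.h>
        #include <drm/drm_fourcc.h>   /* DRM_FORMAT_XRGB8888 */

        GLuint import_dmabuf_texture(EGLDisplay dpy, int dmabuf_fd,
                                     EGLint width, EGLint height, EGLint stride)
        {
            /* Extension entry points are looked up at runtime. */
            PFNEGLCREATEIMAGEKHRPROC create_image =
                (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
            PFNGLEGLIMAGETARGETTEXTURE2DOESPROC image_target_texture =
                (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)eglGetProcAddress("glEGLImageTargetTexture2DOES");

            const EGLint attribs[] = {
                EGL_WIDTH,                     width,
                EGL_HEIGHT,                    height,
                EGL_LINUX_DRM_FOURCC_EXT,      DRM_FORMAT_XRGB8888,
                EGL_DMA_BUF_PLANE0_FD_EXT,     dmabuf_fd,
                EGL_DMA_BUF_PLANE0_OFFSET_EXT, 0,
                EGL_DMA_BUF_PLANE0_PITCH_EXT,  stride,
                EGL_NONE
            };

            /* Wrap the dmabuf in an EGLImage; the pixel data itself is never copied. */
            EGLImageKHR image = create_image(dpy, EGL_NO_CONTEXT,
                                             EGL_LINUX_DMA_BUF_EXT, NULL, attribs);
            if (image == EGL_NO_IMAGE_KHR)
                return 0;

            /* Use the EGLImage as the backing store of a regular GL texture. */
            GLuint tex = 0;
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            image_target_texture(GL_TEXTURE_2D, (GLeglImageOES)image);

            return tex;
        }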



        • #34
          Originally posted by cl333r View Post
          I won't camouflage the problem
          I'm not saying a bigger heatsink is the solution to media decoding problems (for the most common form of modern PC, i.e. laptops, it's not even possible).
          I'm just saying that if you hear your desktop PC fan ramping up when it's doing something as basic as media decoding, you are using your desktop very wrong; that heatsink is seriously underpowered.

          One of the last remaining points in favor of losing portability and getting a static PC in the modern era is that you can give it quiet (and bulkier) heatsinks so you can keep hammering it with heavy loads without hearing screeching noises. Why aren't you doing that? Just to prove a point on internet forums?
          Last edited by starshipeleven; 08 October 2019, 05:54 PM.



          • #35
            Originally posted by starshipeleven View Post
            I'm just saying that if you hear your desktop PC fan ramping up when it's doing something as basic as media decoding you are using your desktop very wrong, that heatsink is seriously underpowered.
            10-bit 1080p HEVC video isn't an easy thing; I'm not watching old MPEG-2 movies, mind you.
            And I don't trust heatsinks; I prefer a quiet CPU fan because it can actually ramp up when the CPU is fully loaded, so it doesn't overheat. Only the monitor, mouse and keyboard are in the room; the case itself is inside a wardrobe, so normally I don't hear the CPU fan at all.



            • #36
              Originally posted by cl333r View Post
              10-bit 1080p HEVC video isn't an easy thing
              Anything not 4K is basic for modern desktop hardware. You could software-decode 4K with an i5 like 3 years ago or more.

              And I don't trust heatsinks
              The heatsink is the metal component between the fan and the CPU. You always have a heatsink. Some people (like me) call the metal piece AND the fan a "heatsink", because together they're an "active heatsink".

              the case itself is inside a wardrobe so normally I don't hear the CPU fan at all.
              Then I don't understand wtf you were saying a few posts above: "I hate it when my CPU starts making more noise".
              Is this noise in your head only?

              Meanwhile, I have the case under the desk, and it's compiling stuff while I watch YouTube and whatever, and I don't hear it.
              Last edited by starshipeleven; 08 October 2019, 06:56 PM.



              • #37
                Originally posted by starshipeleven View Post
                Anything not 4k is basic for modern desktop hardware.
                No. We wouldn't even agree on what exactly is "modern" and what is not; hell, I can't even decide for myself, never mind agreeing with your opinion.
                Originally posted by starshipeleven View Post
                Then I don't understand wtf you were saying a few posts above. " I hate it when my CPU starts making more noise"
                There's this thing where, when the CPU is under heavier load, the CPU fan spins faster and makes more noise (that I can actually hear); I thought that was obvious.



                • #38
                  Originally posted by cl333r View Post
                  No.
                  Yes. Disagreeing does not make you right.

                  There's this thing where, when the CPU is under heavier load, the CPU fan spins faster and makes more noise (that I can actually hear); I thought that was obvious.
                  And it's so loud that you hear it from inside a wardrobe too? It's worse than I thought. What is that, a rack server?



                  • #39
                    Originally posted by atomsymbol

                    Some notes:
                    • Nowadays 3 CPU cores with AVX2 are able to decode any video type up to 4K 10-bit HDR 60 Hz; the CPU might only have issues handling 8K videos
                    • GPU-assisted decoding is preferable when it results in lower system power consumption compared to CPU-only video decoding, or when the CPU is busy handling other tasks in addition to video decoding
                    • Decoded 4K 10-bit HDR 60 Hz video requires about 1-2 GiB/s of memory bandwidth. Main memory bandwidth and PCI Express bandwidth are greater than 2 GiB/s.
                    • From a historical perspective, HW acceleration of video decoding in x86 CPUs started with the Pentium MMX (released in January 1997)
                    I would not be surprised if that's the reason why "GPU-assisted decoding" has not been implemented. Clearly not a show-stopper, but it's still a pain for some users.

                    For me it's that, plus it's a pain if you're compiling and want to watch a video while you wait. People who use Blender or TensorFlow on their workstations might have the same problem. I agree with others that it's also a waste, noise- or power-draw-wise, if you're on a mobile device.

                    It has not prevented me from switching to Chromium; however, I have gone out of my way to watch videos in my Windows VM (VFIO) or on other devices.
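
                    As a rough sanity check of the ~1-2 GiB/s figure quoted above (my own back-of-the-envelope math, assuming 4:2:0 chroma subsampling with 10-bit samples padded to 16 bits, as in P010-style frame layouts):

                    Code:
                    /* Back-of-the-envelope check of the quoted 1-2 GiB/s estimate.
                     * Assumes 3840x2160 @ 60 fps, 4:2:0, 10-bit samples stored in 16 bits. */
                    #include <stdio.h>

                    int main(void)
                    {
                        const double pixels = 3840.0 * 2160.0;
                        const double samples_per_pixel = 1.5;  /* luma + quarter-res Cb and Cr */
                        const double bytes_per_sample = 2.0;   /* 10 bits padded to 16 (P010-like) */
                        const double fps = 60.0;

                        double bytes_per_frame = pixels * samples_per_pixel * bytes_per_sample;
                        double gib_per_sec = bytes_per_frame * fps / (1024.0 * 1024.0 * 1024.0);

                        /* Prints about 1.4 GiB/s just to write the decoded frames out, which
                         * lands in the quoted 1-2 GiB/s range once reads are counted too. */
                        printf("%.2f GiB/s\n", gib_per_sec);
                        return 0;
                    }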



                    • #40
                      Originally posted by starshipeleven View Post
                      yes. Disagreeing does not make you right.


                      And it's so loud that you hear it from inside a wardrobe too? It's worse than I thought. What is that, a rack server?
                      If you don't have bad hearing, you don't need much noise to hear it, do you?
