Chrome 94 Beta Released With WebCodecs API Promoted, WebGPU Origin Trial

  • #11
    Just bear in mind that while HW acceleration is usually fine for regular video content, it's usually abysmal for anything related to realtime video. Encoders and decoders have bad latency, give poor feedback, and don't always obey the encoding parameters they are asked for. And nowadays most people do a lot of video conferencing on their computers, so it's very important to vet each driver and hardware combination for quality before enabling it in a browser.

    It's all very tricky, and it doesn't help that HW vendors rarely focus on RTC scenarios, so they sometimes end up shipping broken hardware to their customers that can't be worked around in software.
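    The WebCodecs API at least lets a page state its realtime intent and probe support up front. A minimal sketch, assuming a browser that ships WebCodecs; `handleChunk` is a hypothetical callback, not part of the API:

    ```typescript
    // Ask for realtime behaviour and check support before encoding.
    // bitrate and latencyMode are hints: a bad HW encoder can still ignore
    // them, which is exactly the problem described above.
    async function makeRealtimeEncoder(
      handleChunk: (c: EncodedVideoChunk, m?: EncodedVideoChunkMetadata) => void,
    ): Promise<VideoEncoder | null> {
      const config: VideoEncoderConfig = {
        codec: "vp8",
        width: 1280,
        height: 720,
        bitrate: 1_000_000,
        framerate: 30,
        latencyMode: "realtime", // hint only
      };
      const { supported } = await VideoEncoder.isConfigSupported(config);
      if (!supported) return null;

      const encoder = new VideoEncoder({
        output: handleChunk,
        error: (e) => console.error("encoder error:", e),
      });
      encoder.configure(config);
      return encoder;
    }
    ```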



    • #12
      Originally posted by s_j_newbury View Post


      One of the main reasons I use Firefox. It actually treats Linux (incl. Wayland) as a first-class platform, despite the problems with the Mozilla organization.
      Firefox maybe, but Mozilla itself doesn't treat Linux as a first-class citizen. Just look at Mozilla's VPN service: it's not available on Linux, and they don't know when it will be.



      • #13
        Originally posted by Orphis View Post
        Just bear in mind that while HW acceleration is usually fine for regular video content, it's usually abysmal for anything related to realtime video. Encoders and decoders have bad latency, give poor feedback, and don't always obey the encoding parameters they are asked for. And nowadays most people do a lot of video conferencing on their computers, so it's very important to vet each driver and hardware combination for quality before enabling it in a browser.

        It's all very tricky, and it doesn't help that HW vendors rarely focus on RTC scenarios, so they sometimes end up shipping broken hardware to their customers that can't be worked around in software.
        Well, that sucks hard. I would have expected it to be much more efficient, and thus faster, than doing it in software, since that's basically the point. But thanks for the warning.
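        If you do try the hardware path, WebCodecs has a preference knob you can use to probe it while keeping a software escape hatch. A rough sketch, assuming WebCodecs is available; the fallback policy here is illustrative, not a recommendation:

        ```typescript
        // Prefer a hardware encoder, but fall back to software when the
        // browser won't provide a working hardware configuration.
        async function pickConfig(
          base: VideoEncoderConfig,
        ): Promise<VideoEncoderConfig | null> {
          const hw = await VideoEncoder.isConfigSupported({
            ...base,
            hardwareAcceleration: "prefer-hardware",
          });
          if (hw.supported && hw.config) return hw.config;

          const sw = await VideoEncoder.isConfigSupported({
            ...base,
            hardwareAcceleration: "prefer-software",
          });
          return sw.supported && sw.config ? sw.config : null;
        }
        ```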



        • #14
          Ah yes, because we need websites able to mine crypto via the GPU now.

          Also, we need to expose information about GPU/GPU-compute capabilities to Google.



          • #15
            Originally posted by piotrj3 View Post
            Ah yes, because we need websites able to mine crypto via the GPU now.

            Also, we need to expose information about GPU/GPU-compute capabilities to Google.
            Sure, progress is always evil. As is the use of more advanced and efficient APIs...



            • #16
              Originally posted by piotrj3 View Post
              Ah yes, because we need websites able to mine crypto via the GPU now.
              You say that like whether compute shaders are available matters when it comes to implementing crypto mining on the GPU. You can already do pretty much everything compute shaders do with plain old fragment shaders, if you're willing to rewire your algorithm a little; a sketch follows below. It's not like there's a way to rate-limit fragment shader execution in WebGL, either. Plus, the compute shaders in WebGPU or Vulkan don't have anywhere near the level of control and flexibility of something like CUDA or OpenCL anyway.
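              A rough sketch of that rewiring, assuming WebGL2: a full-screen triangle turns every pixel into one invocation of a toy integer "hash round" (a stand-in, not any real mining algorithm), and results come back via readPixels:

              ```typescript
              // GPGPU with a plain fragment shader: each covered pixel runs
              // one invocation of a toy "kernel", no compute shaders needed.
              const gl = document.createElement("canvas").getContext("webgl2");
              if (!gl) throw new Error("WebGL2 not available");

              const vs = `#version 300 es
              in vec2 pos;
              void main() { gl_Position = vec4(pos, 0.0, 1.0); }`;

              const fs = `#version 300 es
              precision highp float;
              precision highp int;
              out vec4 outColor;
              void main() {
                // Toy integer mix standing in for a hash round.
                uint h = uint(gl_FragCoord.x) * 374761393u + uint(gl_FragCoord.y) * 668265263u;
                for (int i = 0; i < 64; i++) h = (h ^ (h >> 13)) * 1274126177u;
                outColor = vec4(float(h & 255u) / 255.0, 0.0, 0.0, 1.0);
              }`;

              function compile(type: number, src: string): WebGLShader {
                const s = gl.createShader(type)!;
                gl.shaderSource(s, src);
                gl.compileShader(s);
                if (!gl.getShaderParameter(s, gl.COMPILE_STATUS))
                  throw new Error(gl.getShaderInfoLog(s) ?? "shader compile failed");
                return s;
              }

              const prog = gl.createProgram()!;
              gl.attachShader(prog, compile(gl.VERTEX_SHADER, vs));
              gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fs));
              gl.linkProgram(prog);
              gl.useProgram(prog);

              // One full-screen triangle covers the whole framebuffer.
              gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
              gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
              const loc = gl.getAttribLocation(prog, "pos");
              gl.enableVertexAttribArray(loc);
              gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
              gl.drawArrays(gl.TRIANGLES, 0, 3);

              // Reading results back is the "rewiring": pack outputs into pixels.
              const out = new Uint8Array(gl.drawingBufferWidth * gl.drawingBufferHeight * 4);
              gl.readPixels(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight, gl.RGBA, gl.UNSIGNED_BYTE, out);
              ```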

              Originally posted by piotrj3 View Post
              Also, we need to expose information about GPU/GPU-compute capabilities to Google.
              There's nothing in WebGPU that gives out more hardware information than WebGL already does: most if not all of the parameters that could describe the hardware in detail are hidden away and managed by the browser, just as in WebGL. There's also no reason to believe browsers would go out of their way to expose more detailed information for no gain at all.
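              For comparison, roughly what each API hands a page; a sketch assuming WebGPU type definitions (e.g. @webgpu/types) are present:

              ```typescript
              // What a page can learn about the GPU from each API. Both are
              // curated by the browser; WebGPU's limits are bucketed values,
              // not raw hardware capabilities.
              async function describeGpuExposure(): Promise<void> {
                // WebGL: vendor/renderer strings sit behind an extension.
                const gl = document.createElement("canvas").getContext("webgl");
                const dbg = gl?.getExtension("WEBGL_debug_renderer_info");
                if (gl && dbg) {
                  console.log("WebGL renderer:", gl.getParameter(dbg.UNMASKED_RENDERER_WEBGL));
                }

                // WebGPU: an adapter with browser-managed limits and features.
                const adapter = await navigator.gpu?.requestAdapter();
                if (adapter) {
                  console.log("maxBindGroups:", adapter.limits.maxBindGroups);
                  console.log("features:", [...adapter.features]);
                }
              }
              ```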



              • #17
                Originally posted by piotrj3 View Post
                Ah yes, because we need websites able to mine crypto via the GPU now.

                Also, we need to expose information about GPU/GPU-compute capabilities to Google.
                Honestly, I'd prefer crypto mining over ads; it would be a good way to support my favourite websites without annoying ads, assuming it's opt-in. But we all know how likely that is.



                • #18
                  Originally posted by Artim View Post
                  Sure, progress is always evil. As is the use of more advanced and efficient APIs...
                  This is unironically correct. Most web devs are evil. Therefore, these APIs will be mostly used for more advanced squandering of the user's limited battery energy, and more efficient fingerprinting and ad targeting.



                  • #19
                    Originally posted by yump View Post

                    This is unironically correct. Most web devs are evil. Therefore, these APIs will be mostly used for more advanced squandering of the user's limited battery energy, and more efficient fingerprinting and ad targeting.
                    Yeah... no. That's just BS. Fingerprinting is already more than good enough, and only Brave is supposed to be able to actually counter it; every other browser fails miserably. And why would anyone run something via WebGPU just to waste users' battery life? Leaving aside that it's probably even more efficient than running the same thing through WebGL: they don't do it now, so why would they start in the future? At least your notion of "squandering of the user's limited battery energy" isn't as much nonsense as the rest of your comment. Though even that is probably very unlikely.



                    • #20
                      Originally posted by Artim View Post

                      Yeah... no. That's just BS. Fingerprinting is already more than good enough, and only Brave is supposed to be able to actually counter it; every other browser fails miserably. And why would anyone run something via WebGPU just to waste users' battery life? Leaving aside that it's probably even more efficient than running the same thing through WebGL: they don't do it now, so why would they start in the future? At least your notion of "squandering of the user's limited battery energy" isn't as much nonsense as the rest of your comment. Though even that is probably very unlikely.
                      It's not that they are intentionally wasting battery energy. It's that wasting battery energy doesn't affect clickthroughs or conversion rates, so they don't care.

