Intel Preparing For SVT-AV1 1.0 Video Encoder With More AVX2 Optimizations

  • Intel Preparing For SVT-AV1 1.0 Video Encoder With More AVX2 Optimizations

    Phoronix: Intel Preparing For SVT-AV1 1.0 Video Encoder With More AVX2 Optimizations

    Back in January, Intel engineers released SVT-AV1 0.9 with significant speed-ups to this open-source AV1 encoder. Now, as we roll into Q2, SVT-AV1 v1.0 is being readied for launch...


  • #2
    Is it possible to write code in C/C++ so that it is properly vectorized by gcc/clang/MSVC?
    SVT-AV1 uses intrinsics for its optimizations, but I don't want to write code with intrinsics, as it is really hard to read and understand.

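    In many simple cases the answer is yes, provided the loop is written so the compiler can prove vectorization is safe. Below is a minimal sketch using a sum-of-absolute-differences kernel, a common building block in video encoders; the function name and shape are illustrative, not taken from SVT-AV1. With gcc or clang at -O3 -mavx2, or MSVC at /O2 /arch:AVX2, a loop like this is usually auto-vectorized, and gcc's -fopt-info-vec or clang's -Rpass=loop-vectorize will report whether that actually happened.

        #include <cstddef>
        #include <cstdint>

        // 64-byte sum of absolute differences.
        // Fixed trip count, no aliasing (__restrict), no data-dependent
        // branches: this is the shape of loop that gcc/clang/MSVC can
        // usually auto-vectorize without any intrinsics.
        std::uint32_t sad_64(const std::uint8_t* __restrict a,
                             const std::uint8_t* __restrict b)
        {
            std::uint32_t sum = 0;
            for (std::size_t i = 0; i < 64; ++i) {
                int d = static_cast<int>(a[i]) - static_cast<int>(b[i]);
                sum += static_cast<std::uint32_t>(d < 0 ? -d : d);
            }
            return sum;
        }

    Whether the generated code matches hand-tuned intrinsics is a separate question, which is the point made in the next reply.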


    • #3
      Originally posted by miskol View Post
      Is it possible to write code in C/C++ so that it is properly vectorized by gcc/clang/MSVC?
      SVT-AV1 uses intrinsics for its optimizations, but I don't want to write code with intrinsics, as it is really hard to read and understand.
      They use intrinsics because the C/C++ compiler's optimizer either cannot vectorize the code, cannot deliver good enough performance, or is too unpredictable about when it does.

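      For contrast, here is roughly what the same 64-byte SAD looks like when written directly against AVX2 intrinsics, the style SVT-AV1 uses in its hot paths (an illustrative sketch, not code taken from SVT-AV1). The compiler no longer has to figure anything out, but the code is much harder to read:

          #include <cstdint>
          #include <immintrin.h>  // AVX2 intrinsics

          // _mm256_sad_epu8 computes absolute differences of unsigned bytes
          // and accumulates them into four 64-bit partial sums per register.
          std::uint32_t sad_64_avx2(const std::uint8_t* a, const std::uint8_t* b)
          {
              __m256i va0 = _mm256_loadu_si256(reinterpret_cast<const __m256i*>(a));
              __m256i vb0 = _mm256_loadu_si256(reinterpret_cast<const __m256i*>(b));
              __m256i va1 = _mm256_loadu_si256(reinterpret_cast<const __m256i*>(a + 32));
              __m256i vb1 = _mm256_loadu_si256(reinterpret_cast<const __m256i*>(b + 32));

              __m256i sad = _mm256_add_epi64(_mm256_sad_epu8(va0, vb0),
                                             _mm256_sad_epu8(va1, vb1));

              // Horizontal reduction of the four 64-bit partial sums.
              __m128i lo = _mm256_castsi256_si128(sad);
              __m128i hi = _mm256_extracti128_si256(sad, 1);
              __m128i s  = _mm_add_epi64(lo, hi);
              s = _mm_add_epi64(s, _mm_unpackhi_epi64(s, s));
              return static_cast<std::uint32_t>(_mm_cvtsi128_si32(s));
          }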


      • #4
        Originally posted by NobodyXu View Post

        They use intrinsics because the C/C++ compiler's optimizer either cannot vectorize the code, cannot deliver good enough performance, or is too unpredictable about when it does.
        I know; I've used intrinsics in one project myself. But it would still be nice to have something we can write that the C/C++ compiler then turns into the best possible code,
        for example DirectXMath.h or some other building block.



        • #5
          Originally posted by miskol View Post
          I know; I've used intrinsics in one project myself. But it would still be nice to have something we can write that the C/C++ compiler then turns into the best possible code,
          for example DirectXMath.h or some other building block.
          Yeah, I agree: intrinsics are indeed hard to read, ugly, and sometimes not portable.

          After googling, I found https://github.com/xtensor-stack/xsimd for C++.

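          As a rough sketch of what that wrapper style looks like, here is a simple element-wise kernel written with xsimd's batch type. This assumes a recent xsimd release (8.x or later), where xsimd::batch<T> picks the widest instruction set enabled at compile time; the names follow the project's README, so check the current documentation before relying on them.

              #include <cstddef>
              #include "xsimd/xsimd.hpp"

              // out[i] = a[i] + b[i], vectorized with xsimd batches.
              // The tail that does not fill a whole batch falls back to scalar code.
              void add_floats(const float* a, const float* b, float* out, std::size_t n)
              {
                  using batch = xsimd::batch<float>;          // width follows the build's arch flags
                  constexpr std::size_t lanes = batch::size;  // e.g. 8 floats with AVX2

                  std::size_t vec_end = n - n % lanes;
                  for (std::size_t i = 0; i < vec_end; i += lanes) {
                      batch va = batch::load_unaligned(a + i);
                      batch vb = batch::load_unaligned(b + i);
                      (va + vb).store_unaligned(out + i);
                  }
                  for (std::size_t i = vec_end; i < n; ++i)   // scalar tail
                      out[i] = a[i] + b[i];
              }

          The intrinsics stay hidden behind operators and load/store calls, so the kernel reads close to scalar code while still compiling down to SIMD.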


          • #6
            Even at its best settings it produces horribly blurry output and kills detail.

            It could work for you if you need to compress 4K content down to something manageable and small, but for 1080p and good image quality? Use VP9/H.265/H.264. libaom (the reference AV1 encoder) is insanely slow and isn't much better.



            • #7
              Originally posted by birdie View Post
              Even at its best settings it produces horribly blurry output and kills detail.

              It could work for you if you need to compress 4K content down to something manageable and small, but for 1080p and good image quality? Use VP9/H.265/H.264. libaom (the reference AV1 encoder) is insanely slow and isn't much better.
              Which versions did you try that showed these problems, exactly?



              • #8
                Originally posted by birdie View Post
                Even at its best settings it produces horribly blurry output and kills detail.

                It could work for you if you need to compress 4K content down to something manageable and small, but for 1080p and good image quality? Use VP9/H.265/H.264. libaom (the reference AV1 encoder) is insanely slow and isn't much better.
                At the same bitrate?



                • #9
                  I tried to compress a 1080p source with a quality setting that produces a near-transparent result, i.e. no visible loss in quality. That was maybe a year ago or even earlier, so I'm not sure it still holds true; the MSU Video Codecs Comparison 2021 begs to differ.



                  • #10
                    Does grain synthesis work yet?

                    I've noticed everyone dismissing that, which is stupid. Creatives put grain in films; hell, they artificially add it even when shooting on 35mm. We need to be able to preserve the grain's overall appearance without driving the bitrate through the roof.

