Using NVIDIA's NVENC On Linux With FFmpeg

  • phoronix
    Administrator
    • Jan 2007
    • 67187


    Phoronix: Using NVIDIA's NVENC On Linux With FFmpeg

    With the new NVIDIA 346 Linux driver series, NVENC support was made available for accelerated video encoding under Linux...

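    For reference, once FFmpeg is built with NVENC support, a minimal hardware-accelerated encode looks roughly like this (a sketch; the encoder is exposed as "nvenc" in FFmpeg builds of this era and as "h264_nvenc" in later releases, and the file names and bitrate are only placeholders):

        # encode the video stream on the GPU via NVENC, copy the audio untouched
        ffmpeg -i input.mkv -c:v nvenc -b:v 5M -c:a copy output.mp4
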
  • stqn
    Senior Member
    • Jan 2011
    • 672

    #2
    Too bad that NVENC, at least on a GTX 680, seems to produce extremely low-quality encodes.
    (Based on this article in French: http://www.hardware.fr/focus/67/enco...quicksync.html, and more specifically this comparison: http://www.hardware.fr/marc/h264nven...?inception720a; you can click the buttons at the bottom to compare NVENC with x264 veryfast, for example.)
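    For anyone who wants to reproduce that kind of comparison locally, something along these lines would do it (a sketch with placeholder file names and bitrate; the NVENC encoder name depends on the FFmpeg version):

        # same source, same target bitrate, video only, two different encoders
        ffmpeg -i source.mkv -an -c:v nvenc -b:v 3M nvenc.mp4
        ffmpeg -i source.mkv -an -c:v libx264 -preset veryfast -b:v 3M x264.mp4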


    • xeekei
      Senior Member
      • Oct 2012
      • 870

      #3
      So with NVidia, it's two different APIs for encode and decode? NVENC and VDPAU? Why not just adopt VAAPI? I think AMD is doing that.
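      For what it's worth, the two APIs can already be combined in a single FFmpeg run, VDPAU on the decode side and NVENC on the encode side (a sketch, assuming a build configured with both --enable-vdpau and --enable-nvenc; file names are placeholders):

          # decode the input through VDPAU, re-encode it through NVENC
          ffmpeg -hwaccel vdpau -i input.mkv -c:v nvenc -b:v 5M -c:a copy output.mp4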


      • efikkan
        Senior Member
        • Jun 2010
        • 436

        #4
        Originally posted by xeekei
        So with NVidia, it's two different APIs for encode and decode? NVENC and VDPAU? Why not just adopt VAAPI? I think AMD is doing that.
        Because NVENC is designed to use dedicated encoding hardware in the GPU, unlike the previous encoder which utilizes CUDA.


        • Szzz
          Phoronix Member
          • May 2013
          • 92

          #5
          I will still use x264


          • r1348
            Senior Member
            • Jul 2007
            • 636

            #6
            Originally posted by xeekei
            So with NVidia, it's two different APIs for encode and decode? NVENC and VDPAU? Why not just adopt VAAPI? I think AMD is doing that.
            AFAIK AMD uses OpenMAX for encode/decode and VDPAU for decode only, and VAAPI is Intel only, or did I miss some news about AMD?


            • DarkFoss
              Senior Member
              • Mar 2007
              • 404

              #7
              Originally posted by r1348
              AFAIK AMD uses OpenMAX for encode/decode and VDPAU for decode only, and VAAPI is Intel only, or did I miss some news about AMD?
              He's probably referring to this article and the earlier one linked to it:
              http://www.phoronix.com/scan.php?pag...tem&px=MTgwMjA
              Those who would give up Essential Liberty to purchase a little Temporary Safety, deserve neither Liberty nor Safety.
              Ben Franklin 1755


              • pinguinpc
                Senior Member
                • Jun 2009
                • 918

                #8
                In my case it works, but it needs nvEncodeAPI.h from the NVENC Windows SDK, and the CUDA SDK has to be installed too.

                For more information, see this:

                Without NVENC: [example video]

                With NVENC: [example video]

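                A rough outline of such a build (a sketch based on the description above; the header location is just an example, and the configure flags vary between FFmpeg versions, with --enable-nonfree being required for NVENC in builds of this era):

                    # copy the NVENC SDK header somewhere the compiler can find it (example path)
                    sudo cp nvEncodeAPI.h /usr/local/include/
                    # configure and build FFmpeg with NVENC enabled
                    ./configure --enable-nonfree --enable-nvenc
                    make -j"$(nproc)"

                If the build picked everything up, ffmpeg -encoders should list an nvenc entry.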

                • xeekei
                  Senior Member
                  • Oct 2012
                  • 870

                  #9
                  Originally posted by DarkFoss
                  He's probably referring to this article and the earlier one linked to it.
                  http://www.phoronix.com/scan.php?pag...tem&px=MTgwMjA
                  Yes. More cooperation with Intel could be key to counter Nvidia's dominance. Using the same API for both decode and encode only makes sense, and VAAPI is already very established.


                  • drSeehas
                    Senior Member
                    • May 2014
                    • 699

                    #10
                    Originally posted by efikkan
                    Because NVENC is designed to use dedicated encoding hardware in the GPU, ...
                    Aren't VA-API and OpenMAX also designed to use dedicated encoding hardware in GPUs?
