Using NVIDIA's NVENC On Linux With FFmpeg


  • #31
    Originally posted by tuke81
    Have you tried different presets? e.g. -c:v nvenc -preset llhq. The presets are:
    hq = high quality (I think libnvenc uses this)
    hp = high performance
    bd = Blu-ray Disc
    ll = low latency
    llhq = low latency high quality
    llhp = low latency high performance
    default = between llhq and llhp, I suppose it's the same as ll

    You can also use 2-pass encoding with the low latency presets, via the option -2pass 1. I'm not sure whether all presets are supported by Kepler cards, though.
    I tested llhp and hp and the quality is still very similar; however, NVENC pushes the bitrate up as time passes. For example, it starts at around 3000k, but some 10 to 15 minutes in the bitrate rises considerably and normally settles around 5000k to 6000k.

    Quality stays similar to the current 5750k bitrate used for the latest videos.

    Maybe later I'll upload some videos about this.
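    If the bitrate drift is a problem, pinning the rate with the -cbr and -b:v switches (both show up in the option listings later in this thread) might be worth a try; a rough sketch, where the file names are placeholders and 5750k is just the target mentioned above:

    Code:
    # force constant bitrate at the 5750k target; audio is passed through untouched
    ffmpeg -i input.mp4 -c:v nvenc -preset hq -cbr 1 -b:v 5750k -c:a copy output-nvenc-cbr.mp4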



    • #32
      As mentioned before, here are some tests using the low latency high quality and high quality presets, compared to the currently used bitrate of 5750k:


      NVENC 5750k

      NVENC High Quality Preset

      NVENC Low Latency High Quality Preset




      • #33
        Does anyone have a 2nd generation Maxwell card (GTX 980/970/960)? The NVENC 5.0 SDK now has H.265 support. I'm quite sure it won't work with ffmpeg yet, but I'm a bit curious how it performs with the samples bundled with the SDK.



        • #34
          Originally posted by tuke81
          Does anyone have a 2nd generation Maxwell card (GTX 980/970/960)? The NVENC 5.0 SDK now has H.265 support. I'm quite sure it won't work with ffmpeg yet, but I'm a bit curious how it performs with the samples bundled with the SDK.

          I've tested the latest ffmpeg version (git) with a Maxwell Gen 2 GTX 980M, and NVENC for both H.264 and H.265/HEVC is working just fine.
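          For a quick sanity check on such a git build, something along these lines should exercise both encoders (file names are placeholders):

          Code:
          # H.264 via NVENC
          ffmpeg -i input.mp4 -c:v nvenc -c:a copy test-h264.mp4
          # H.265/HEVC via NVENC
          ffmpeg -i input.mp4 -c:v nvenc_hevc -c:a copy test-h265.mp4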



          • #35
            This thread was helpful to me in getting FFmpeg compiled on Fedora 22. I posted initial results showing the CPU offload and FPS increase from using nvenc instead of libx264 on a 2.7K source video. Impressive.
            Last edited by cacasodo; 25 May 2016, 02:02 PM.
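            For anyone compiling along the same lines: as far as I remember, the build of that era boiled down to putting the NVENC SDK header where the compiler can find it and enabling the encoder at configure time; a rough sketch (flag names from memory, check ./configure --help on your version):

            Code:
            # libx264 for comparison encodes, plus the nvenc encoder itself
            ./configure --enable-gpl --enable-libx264 --enable-nonfree --enable-nvenc
            make -j$(nproc) && sudo make install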



            • #36
              I get much, much higher FPS with my 960. I encoded a 1080p video at over 200 fps.

              Code:
              ffmpeg -i "input(1080p).mp4" -map 0 -loglevel repeat+verbose -c copy -c:v nvenc -preset slow -movflags +faststart "output-nvenc.mp4"
              And the same file with HEVC/H.265 runs at around 90 fps:

              Code:
              ffmpeg -i "input(1080p).mp4" -map 0 -loglevel repeat+verbose -c copy -c:v nvenc_hevc -preset slow -movflags +faststart "output-nvenc_hevc.mp4"
              Anyone interested in an ffmpeg build for Windows with this can use this script to build it: https://github.com/jb-alvarado/media-autobuild_suite It downloads everything needed and builds some A/V tools from git.

              I would provide my builds, but due to all the included stuff they are not distributable.
              Last edited by Nille; 25 May 2016, 01:56 PM.



              • #37
                Hi Nille,
                I should have made it clearer that my source is a much larger 2.7K video from a GoPro, not the 1080P content that you're transcoding.

                That being said, I'll try a 1080P render to see what I get.
                'sodo
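                If it helps the comparison, one rough way to get that 1080P render from the 2.7K GoPro source is to drop a scale filter in front of nvenc; a sketch with placeholder file names:

                Code:
                # downscale to 1920 wide (height follows the aspect ratio, kept even) before handing off to NVENC
                ffmpeg -i gopro-2.7k.mp4 -vf scale=1920:-2 -c:v nvenc -preset hq -c:a copy output-1080p.mp4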



                • #38
                  I'm downloading a UHD sample right now to see how big the impact is (of course there is an impact).

                  EDIT: So UHD 3840x2160 encodes with HEVC at around 27 fps and with H.264 at 33 fps. I can also test it with Quick Sync.
                  Uh, but the quality is low. I tested it with another preset; slow = horrible.
                  EDIT2: hq looks better but is still low. OK, the source file is over 1 GB and the output is only 30 MB. It defaults to 2000 kbit for both. Without a preset the encoding is a bit faster, around 40 fps with UHD.

                  PS: qsv does not start, but eats up one core without making any progress.
                  Last edited by Nille; 25 May 2016, 04:13 PM.
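                  The low quality is presumably just that 2000 kbit default; for UHD it should improve considerably once the target bitrate is raised, roughly like this (the 20M figure is only an illustrative guess, file names are placeholders):

                  Code:
                  # raise the HEVC target bitrate for the UHD encode
                  ffmpeg -i input-uhd.mp4 -c:v nvenc_hevc -preset hq -b:v 20M -c:a copy output-uhd.mp4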



                  • #39
                    Interesting. I'll play around with some of the options later on as well. Here is what the ffmpeg help states about the switches we have access to:

                    Code:
                    [sodo@box ~]$ ffmpeg -h encoder=nvenc
                    ffmpeg version 2.6.8 Copyright (c) 2000-2016 the FFmpeg developers
                      built with gcc 5.3.1 (GCC) 20151207 (Red Hat 5.3.1-2)
                    ..
                    Encoder nvenc [Nvidia NVENC h264 encoder]:
                        Threading capabilities: no
                        Supported pixel formats: nv12
                    nvenc AVOptions:
                      -preset            <string>     E..V.... Set the encoding preset (one of hq, hp, bd, ll, llhq, llhp, default) (default "hq")
                      -cbr               <int>        E..V.... Use cbr encoding mode (from 0 to 1) (default 0)
                      -2pass             <int>        E..V.... Use 2pass cbr encoding mode (low latency mode only) (from -1 to 1) (default -1)
                      -gpu               <int>        E..V.... Selects which NVENC capable GPU to use. First GPU is 0, second is 1, and so on. (from 0 to INT_MAX) (default 0)

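                    Putting those switches together, a 2-pass CBR run with a low latency preset on the second GPU would presumably look something like this (bitrate and file names are placeholders):

                    Code:
                    # llhq preset, 2-pass CBR at 4000k, encoded on GPU index 1
                    ffmpeg -i input.mp4 -c:v nvenc -preset llhq -2pass 1 -cbr 1 -b:v 4000k -gpu 1 -c:a copy output-llhq.mp4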


                    • #40
                      You can also try nvenc_hevc for H.265. But I do see a difference: on Windows I have more pixel formats [Supported pixel formats: yuv420p nv12 yuv444p] and I can set a profile, etc. I'm using the latest git code.

                      Code:
                      Encoder nvenc [NVIDIA NVENC h264 encoder]:
                          General capabilities: delay
                          Threading capabilities: none
                          Supported pixel formats: yuv420p nv12 yuv444p
                      nvenc AVOptions:
                        -preset            <string>     E..V.... Set the encoding preset (one of slow = hq 2pass, medium = hq, fast = hp, hq, hp, bd, ll, llhq, llhp, lossless, losslesshp, default) (default "medium")
                        -profile           <string>     E..V.... Set the encoding profile (high, main, baseline or high444p) (default "main")
                        -level             <string>     E..V.... Set the encoding level restriction (auto, 1.0, 1.0b, 1.1, 1.2, ..., 4.2, 5.0, 5.1) (default "auto")
                        -tier              <string>     E..V.... Set the encoding tier (main or high) (default "main")
                        -cbr               <boolean>    E..V.... Use cbr encoding mode (default false)
                        -2pass             <boolean>    E..V.... Use 2pass encoding mode (default auto)
                        -gpu               <int>        E..V.... Selects which NVENC capable GPU to use. First GPU is 0, second is 1, and so on. (from 0 to INT_MAX) (default 0)
                        -delay             <int>        E..V.... Delays frame output by the given amount of frames. (from 0 to INT_MAX) (default INT_MAX)
                      Code:
                      Encoder nvenc_hevc [NVIDIA NVENC hevc encoder]:
                          General capabilities: delay
                          Threading capabilities: none
                          Supported pixel formats: yuv420p nv12 yuv444p
                      nvenc_hevc AVOptions:
                        -preset            <string>     E..V.... Set the encoding preset (one of slow = hq 2pass, medium = hq, fast = hp, hq, hp, bd, ll, llhq, llhp, lossless, losslesshp, default) (default "medium")
                        -profile           <string>     E..V.... Set the encoding profile (high, main, baseline or high444p) (default "main")
                        -level             <string>     E..V.... Set the encoding level restriction (auto, 1.0, 1.0b, 1.1, 1.2, ..., 4.2, 5.0, 5.1) (default "auto")
                        -tier              <string>     E..V.... Set the encoding tier (main or high) (default "main")
                        -cbr               <boolean>    E..V.... Use cbr encoding mode (default false)
                        -2pass             <boolean>    E..V.... Use 2pass encoding mode (default auto)
                        -gpu               <int>        E..V.... Selects which NVENC capable GPU to use. First GPU is 0, second is 1, and so on. (from 0 to INT_MAX) (default 0)
                        -delay             <int>        E..V.... Delays frame output by the given amount of frames. (from 0 to INT_MAX) (default INT_MAX)

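                      With the newer option names from that git build, a command exercising the profile switch would presumably look roughly like this (the profile, bitrate, and file names are picked only as an example; -level and -tier can be set the same way):

                      Code:
                      # "slow" maps to hq 2-pass; set an explicit H.264 profile and bitrate
                      ffmpeg -i input.mp4 -c:v nvenc -preset slow -profile:v high -b:v 8M -c:a copy output-high.mp4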
