XvMC support

  • #51
    Originally posted by lamikr View Post
    That's true. It would be nice to know what the situation is with NVIDIA/VDPAU drivers. I mean, what is the total power consumption of the computer when you use 90% of your CPU to watch H.264 material, compared to the situation where the CPU load is only 7% thanks to VDPAU but the GPU is doing much more work?
    Decoding video with NVIDIA's VP2/3 or AMD's UVD does not load the GPU much, since the decoding unit is special-purpose hardware separate from the GPU's stream processors. A passively cooled GeForce 8400 GS or Radeon HD 3450 decodes high-bitrate HD H.264 video without breaking a sweat. HD deinterlacing and other postprocessing may be a more stressful task.

    And is there any easy method or top-like tool for watching GPU utilization?
    I don't think there is for Linux, but monitoring the GPU temperature is a better way to estimate energy consumption.
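    For what it's worth, temperatures are easy to watch from a script. Here is a minimal sketch that walks the standard Linux sysfs hwmon tree (which sensors show up depends entirely on your loaded drivers; with NVIDIA's blob, `nvidia-settings -q GPUCoreTemp` should give a similar reading):

    ```shell
    # List every hwmon sensor's name and temperature.
    # sysfs reports temperatures in millidegrees Celsius; which sensors appear
    # (CPU, GPU, motherboard) depends on the drivers that are loaded.
    gpu_temps() {
        for dir in /sys/class/hwmon/hwmon*; do
            [ -d "$dir" ] || continue
            name=$(cat "$dir/name" 2>/dev/null || echo unknown)
            for t in "$dir"/temp*_input; do
                [ -f "$t" ] || continue
                printf '%s: %d C\n' "$name" "$(($(cat "$t") / 1000))"
            done
        done
        return 0
    }
    gpu_temps
    ```

    Run it in a loop with `watch` during playback and you get a crude utilization proxy.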

    Comment


    • #52
      As another data point, I have an Opteron 185, a Dvico Fusion HDTV tuner, a Radeon 9550, and 4 GB of DDR-400. It's pretty much impossible to watch 1080i HDTV material on this machine under Linux, but it works fine under Windows XP using DxVA. I don't know what you guys are viewing when you say "MPEG2 doesn't even use 5% of my CPU", but it totally swamps this box. 720p material is generally viewable without trouble, but it still keeps the CPU pretty busy.

      Anyway, the lack of XvMC on Radeon is one of the reasons I haven't switched my cable TV subscription from analog to HiDef yet (here in Los Angeles). It will be a long time before MPEG2 becomes irrelevant in the US...

      Comment


      • #53
        Originally posted by deneb View Post
        Decoding video with NVIDIA's VP2/3 or AMD's UVD does not load the GPU much, since the decoding unit is special-purpose hardware separate from the GPU's stream processors. A passively cooled GeForce 8400 GS or Radeon HD 3450 decodes high-bitrate HD H.264 video without breaking a sweat. HD deinterlacing and other postprocessing may be a more stressful task.


        I don't think there is for Linux, but monitoring the GPU temperature is a better way to estimate energy consumption.
        Well, the 3450 will not do any H.264 decoding on the GPU yet, so all of the decoding falls to your CPU. A GeForce 8 series card, though, can use VDPAU even at insanely high bitrates with CABAC enabled.

        This is what usage and temperature look like on a Sempron LE-1150 paired with an 8200 IGP board, playing Big Buck Bunny encoded from the raw 1920 HD PNGs at a constant bitrate of 48 Mbit/s for the full 10 minutes. As you can see, neither CPU nor GPU breaks a sweat during playback when VDPAU is used on the 8200, and there are 0 dropped frames. Attempting this on an ATI card with such a meager system is not possible right now. Usage for MPEG-2 and VC-1 is on par as well.
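        For reference, the sort of command line used for this test looks like the following (MPlayer VDPAU option names from that era; the filename here is just a placeholder). It is echoed as a dry run so you can see the exact invocation:

        ```shell
        # Build the MPlayer command line for VDPAU-accelerated H.264 playback:
        # -vo vdpau selects the VDPAU video output, -vc ffh264vdpau the hardware
        # decoder. Drop the 'echo' to actually play the file.
        vdpau_cmd() {
            echo mplayer -vo vdpau -vc ffh264vdpau "$1"
        }
        vdpau_cmd bbb_1080_48mbit.mkv
        ```

        If memory serves, the MPEG-2 and VC-1 equivalents just swap the codec: ffmpeg12vdpau and ffvc1vdpau.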


        Comment


        • #54
          Originally posted by highlandsun View Post
          As another data point, I have an Opteron 185 and a Dvico Fusion HDTV tuner and a Radeon 9550, and 4GB of DDR-400. It's pretty much impossible to watch 1080i HDTV material on this machine under Linux, but it works fine under WindowsXP using DxVA. I don't know what you guys are viewing when you say "MPEG2 doesn't even use 5% of my CPU" but it totally swamps this box.
          They are viewing SD MPEG-2. For me, playing 16 Mbps 1080i30 MPEG-2 video with MPlayer loads one core of a 2 GHz Core 2 Duo (T7200) to about 60%. I use NVIDIA's binary blob with the laptop's GeForce Go 7700 GPU.

          Deinterlacing the video with a good software filter is much harder than MPEG-2 decoding, and my CPU is not up to the task with the current single-threaded Yadif (-vf yadif=3) implementation in MPlayer. FFmpeg's half-rate deinterlacer (-vf pp=fd) or simple bobbing (-vf tfields) adds about 10% to the CPU load.
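          Spelled out as dry-run commands, the three filter setups above are (placeholder filename; drop the echo to run them for real):

          ```shell
          # The three MPlayer deinterlacing setups mentioned above, as dry runs.
          deint_cmds() {
              f="$1"
              echo mplayer -vf yadif=3 "$f"   # Yadif, mode 3 (single-threaded)
              echo mplayer -vf pp=fd "$f"     # FFmpeg half-rate deinterlacer
              echo mplayer -vf tfields "$f"   # simple bob
          }
          deint_cmds capture_1080i.ts
          ```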

          I also tried the same video on an Athlon XP 2400+ (GeForce 2 GTS, nv driver); it utilized the CPU at up to 80% with spikes to 95%. Deinterlacing was not possible without dropping frames.

          So, either you are dealing with bitrates much higher than 20 Mbps, your graphics driver has poor XVideo support, your video playback software sucks, or you are trying to use fancy software deinterlacing.

          Comment


          • #55
            About power consumption: even if the CPU uses less, the GPU uses more. While typical desktop CPUs top out around 95 W, there are GPUs that can draw 190 W on their own.

            So, are there any benchmarks on total power consumption with vdpau/some other accel and without? Does the increased gpu wattage offset the decreased cpu power usage?

            Comment


            • #56
              Originally posted by curaga View Post
              About power consumption: even if the CPU uses less, the GPU uses more. While typical desktop CPUs top out around 95 W, there are GPUs that can draw 190 W on their own.

              So, are there any benchmarks on total power consumption with vdpau/some other accel and without? Does the increased gpu wattage offset the decreased cpu power usage?
              The power consumption increase from idle to VDPAU playback is extremely small. At-the-wall measurements for the total system showed an increase of about 11 watts (55 W idle to 66 W during VDPAU playback), measured with a Fluke 83 V, with some of that no doubt going to the increased load on the hard drives. The CPU power state remained at its lowest level. Temperature readings on the GPU support those findings.

              EDIT: I should add that playback attempts without VDPAU increased power consumption on the same system to 89 watts and dropped frames like mad.
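              For the arithmetic-inclined, the deltas over idle from those at-the-wall numbers:

              ```shell
              # Deltas over idle for the measured at-the-wall figures (watts).
              idle=55; vdpau=66; software=89
              echo "VDPAU playback adds:  $((vdpau - idle)) W"     # 11 W
              echo "Software decode adds: $((software - idle)) W"  # 34 W
              echo "VDPAU saves:          $((software - vdpau)) W" # 23 W
              ```

              So the GPU's decode block costs about a third of what software decoding costs at the wall, on this system.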
              Last edited by deanjo; 04 January 2009, 03:52 PM.

              Comment


              • #57
                Originally posted by smitty3268 View Post
                I've heard several general complaints about this, but not a single hard number. I would have guessed a Pentium 3 would have enough juice for this - can you definitively state which CPU you have in your computer that isn't fast enough? If there are P4's that can't keep up that's one thing, but it's very different if it's just P2's that need to be upgraded. I don't really have an old computer lying around to test myself.
                My MythTV machine is a 1.6 GHz Duron with a GeForce 6200 using the nvidia drivers. I downloaded a 40-some-odd Mbps 1080p MPEG-2 HD test file, and this machine can play it back at 65-85% CPU usage with XvMC enabled and bob deinterlacing. It cannot play it back at all without XvMC (e.g. using Xvideo). My Athlon 64 X2 4200+ desktop can just barely play back the same file using Xvideo with an ATI X1900 GT: CPU usage was about 95%, and there was not enough CPU horsepower left to do deinterlacing. My month-old laptop's Core 2 Duo T7250 with the GM45 chipset, using Xvideo, sits at about 100% CPU utilization and occasionally drops frames playing the same test file. It is the bare minimum of watchable, and it's a new machine. I say we certainly do need MPEG-2 decode assist for HDTV, which is mostly MPEG-2.
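                For anyone wanting to reproduce this, MPlayer's XvMC path is selected roughly like so (historical option names; the filename is a placeholder, and the command is echoed as a dry run):

                ```shell
                # MPEG-2 playback through XvMC with MPlayer: -vo xvmc selects the
                # XvMC video output, -vc ffmpeg12mc the XvMC-capable MPEG-1/2
                # decoder. Drop the 'echo' to actually play the file.
                xvmc_cmd() {
                    echo mplayer -vo xvmc -vc ffmpeg12mc "$1"
                }
                xvmc_cmd hdtv_1080i_sample.mpg
                ```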

                Personally, I would love XvMC support. But not at the expense of just about any other feature I can think of. It's got to be right at the very bottom of almost everyone's list. Hopefully someone can build some generic support into Gallium3D and we won't have to always keep waiting for specific drivers to get supported.

                I'm sure it sucks to be among the few who need it, but as has been said: the specs are available and someone just has to step up and do the work. That's the beauty of open source.
                Here's my priority for features desired in any new driver:
                1. Recognizing the card and getting it to accelerate 2D.
                2. Getting the card to not break suspend-to-RAM/suspend-to-disk.
                3. Getting Xvideo playback to work.
                4. Getting 3D OpenGL acceleration/DRI to function.
                5. MPEG-2 acceleration
                6. Getting 3D OpenGL acceleration/DRI to work _well_.
                7. MPEG-4/H.264/XviD/Ogg Theora decode acceleration
                8. to 99. Bug fixes
                100. VC-1/WMV/BluRay acceleration. Who gives a crap about MS's proprietary formats, a super-expensive optical disc format that in almost all cases can't even legally be played on this OS, and the buttload of DRM you'd have to contend with in the process?

                Comment


                • #58
                  Using dedicated hardware for a specific task will always be more power-efficient than using general-purpose hardware.
                  In modern GPUs, dedicated bitstream-decoding silicon is more power-friendly than using the graphics engine or the CPU.

                  However, there is a disadvantage to dedicated hardware: it is nowhere near as programmable or flexible as the CPU. This essentially means that files have to be encoded with the right codec, in the right way. There are many x264-encoded video files that will not work on the GPU (depending on the encoding options used) but will work on the CPU. It is also why VC-1 cannot be added to the 8xxx parts: they do not do VLD (bitstream) decoding for it.
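                  One way to check whether a given encode is likely to fit a decoder's fixed-function limits is to inspect the stream parameters first. A sketch using a modern ffprobe (echoed as a dry run; the filename is a placeholder, and the actual limits vary per chip):

                  ```shell
                  # Print the H.264 stream parameters that most often trip
                  # fixed-function decoders: profile, level, and the
                  # reference-frame count.
                  probe_cmd() {
                      echo ffprobe -v error -select_streams v:0 \
                          -show_entries stream=codec_name,profile,level,refs "$1"
                  }
                  probe_cmd some_x264_encode.mkv
                  ```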

                  Comment


                  • #59
                    Originally posted by _txf_ View Post
                    It is also why VC-1 cannot be added to the 8xxx parts: they do not do VLD (bitstream) decoding for it.
                    Some do support VC-1: all 8200/8300 IGPs and some 8400 GS cards (they have to be based on the G98) have the capability.

                    Comment


                    • #60
                      Originally posted by _txf_ View Post
                      However, there is a disadvantage to dedicated hardware, in that it is nowhere near as programmable or flexible as the cpu. This essentially means that files have to be encoded with the right codec, in the right way. There are many x264 encoded video files that will not work on the gpu (depending on the options used), but will work on the cpu.
                      That's why we want OpenCL supported soon. You can run everything on the GPU that way, regardless of how it was encoded. CoreAVC (a Windows DirectShow H.264/AVC decoder) will soon support CUDA to allow decoding any video on the GPU, no matter how it was encoded.

                      Comment
