Work-In-Progress Porting Of GCN 1.0/1.1 UVD To AMDGPU DRM Driver


  • #41
    Originally posted by AndyChow View Post

    The last mini-PC I had that had trouble handling some HD content ran an Atom 330, a dual-core Atom at 1.6 GHz. So yes, for that I used the GPU accelerator (nVidia Ion). But that's a 10-year-old platform, with the worst type of processor. I just tried on my current PC with a 1080p 60 fps video: CPU use went from 19% to 10%. But my GPU went from 300 MHz to 1237 MHz, and from 11-18W to 30-60W.
    I'm not sure how you could check, but if you try playing back such a video that is CABAC-encoded, with native h.264 post-processing enabled (VLC, for example, lets you do this), you would probably get dropped frames on detail-heavy scenes. OTOH, I've yet to own a Jaguar-based PC; I just estimated that such a core, with a low max clock speed, would have trouble with such a video.
    Still, thank you for demonstrating my original point, that using the accelerator shaved 40W off. That's more than a modern laptop uses under moderate load.

    Comment


    • #42
      Originally posted by mitch074 View Post

      I'm not sure how you could check, but if you try playing back such a video that is CABAC-encoded, with native h.264 post-processing enabled (VLC, for example, lets you do this), you would probably get dropped frames on detail-heavy scenes. OTOH, I've yet to own a Jaguar-based PC; I just estimated that such a core, with a low max clock speed, would have trouble with such a video.
      Still, thank you for demonstrating my original point, that using the accelerator shaved 40W off. That's more than a modern laptop uses under moderate load.
      You might have read the numbers in reverse. Using the accelerator increased my GPU power use by about 35W. CPU load, let's say, decreased 10% (doubtful), saving 8.5W, so overall power use went up by a net 26.5W.

      But I admit I didn't measure draw at the wall.
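A quick back-of-the-envelope sketch of that arithmetic (the 85 W CPU figure is an assumption inferred from the post: a 10% utilization drop saving 8.5 W implies roughly an 85 W part):

```python
# Net power change from enabling hardware decode, per the figures above.
# Assumption: the CPU is roughly an 85 W part, since a 10% utilization
# drop saving 8.5 W implies 85 W at full load.
cpu_full_load_w = 85.0
gpu_delta_w = 35.0                       # reported GPU power increase
cpu_saving_w = 0.10 * cpu_full_load_w    # 10% utilization drop -> 8.5 W

net_delta_w = gpu_delta_w - cpu_saving_w
print(f"CPU saving: {cpu_saving_w:.1f} W")   # 8.5 W
print(f"Net change: {net_delta_w:+.1f} W")   # +26.5 W
```

So with those numbers the accelerator costs power overall, which is why the at-the-wall measurement matters.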



      • #43
        Originally posted by AndyChow View Post

        You might have read the numbers in reverse. Using the accelerator increased my GPU power use by about 35W. CPU load, let's say, decreased 10% (doubtful), saving 8.5W, so overall power use went up by a net 26.5W.

        But I admit I didn't measure draw at the wall.
        Ah - either I read it backwards or it wasn't too clear. Then yes, I mean at the wall - my living room PC uses an oldie Athlon X4 (underclocked to 2.0 GHz, undervolted) and a GCN 1.0 video card (Radeon HD 7770 1GB); I've tried playing back a BD backup (FullHD, 30fps, h.264 High profile 4.2, CABAC-encoded) on it, both with and without video acceleration. The fans on both CPU and GPU are pretty much inaudible at rest (I need complete silence in the room to hear them after it's been sitting idle a couple hours in the middle of summer). I used VLC for both, as it enables a hardware decoder with the flick of an option and is otherwise identical in every other respect. Desktop environment is Xfce with compositing disabled.
        The result is that CPU-only decode loads all 4 cores at 30-70%, with an audible fan speed increase and a CPU temperature increase. GPU decode on the same movie segment keeps CPU load well under 10% (at minimum clock speed), just for sound and VLC's overlay (which can be quite heavy compared with other media players); GPU load goes up noticeably, but fan speed and GPU temperature stay very low (that same GPU fan screams something awful when I run, say, Unigine Heaven on it). I have yet to take a direct measurement, but the last time I measured this system fully loaded with the CPU overclocked, it pulled more than 300W at the plug. Judging by the temperatures it reaches now, I would guess that video playback stays under 100W with GPU decode while CPU decode is closer to 150W.
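Taking those guesses at face value, here is a rough energy comparison for one viewing (the two-hour movie length and the 100 W / 150 W wall-power figures are all assumptions carried over from the estimates above):

```python
# Rough energy cost of one two-hour movie under each decode path,
# using the estimated wall-power figures above (both are guesses).
hours = 2.0
gpu_decode_w = 100.0   # estimated wall power with GPU decode
cpu_decode_w = 150.0   # estimated wall power with CPU decode

gpu_kwh = gpu_decode_w * hours / 1000.0
cpu_kwh = cpu_decode_w * hours / 1000.0
print(f"GPU decode: {gpu_kwh:.2f} kWh, CPU decode: {cpu_kwh:.2f} kWh")
print(f"Saved per movie: {(cpu_kwh - gpu_kwh) * 1000:.0f} Wh")
```

About 100 Wh per movie either way - small in money terms, but as noted below, the heat and fan noise difference is what you actually notice.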



        • #44
          Originally posted by mitch074 View Post

          Ah - either I read it backwards or it wasn't too clear. Then yes, I mean at the wall - my living room PC uses an oldie Athlon X4 (underclocked to 2.0 GHz, undervolted) and a GCN 1.0 video card (Radeon HD 7770 1GB); I've tried playing back a BD backup (FullHD, 30fps, h.264 High profile 4.2, CABAC-encoded) on it, both with and without video acceleration. The fans on both CPU and GPU are pretty much inaudible at rest (I need complete silence in the room to hear them after it's been sitting idle a couple hours in the middle of summer). I used VLC for both, as it enables a hardware decoder with the flick of an option and is otherwise identical in every other respect. Desktop environment is Xfce with compositing disabled.
          The result is that CPU-only decode loads all 4 cores at 30-70%, with an audible fan speed increase and a CPU temperature increase. GPU decode on the same movie segment keeps CPU load well under 10% (at minimum clock speed), just for sound and VLC's overlay (which can be quite heavy compared with other media players); GPU load goes up noticeably, but fan speed and GPU temperature stay very low (that same GPU fan screams something awful when I run, say, Unigine Heaven on it). I have yet to take a direct measurement, but the last time I measured this system fully loaded with the CPU overclocked, it pulled more than 300W at the plug. Judging by the temperatures it reaches now, I would guess that video playback stays under 100W with GPU decode while CPU decode is closer to 150W.
          I really need to buy a wall watt meter; I see they're about $20. This has got me really curious.

          At least with a GPU encoder ASIC, I understand: for streaming or screen recording, you don't want to use the CPU. For decoding, you're mostly watching a movie, so it shouldn't matter.

          I've tested the decoder a lot these last few days with my Radeon RX 480, and it seems to work fine. When I had a Radeon HD 6850, I remember that UVD3 would often show a green bar at the bottom, fail to play the video at all, randomly crash, or have lag and/or artifacts, and switching to software decode always fixed the problem. I'm not sure whether it was the VLC implementation or the decoder version, but everything seems to "just work" now.

          If I actually measure real power draw, I'll report back my at-the-wall numbers.
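For what it's worth, a $20 meter pays for itself very slowly if the goal is only the electricity saved. A sketch, where the 50 W playback saving and $0.12/kWh electricity price are purely illustrative assumptions:

```python
# How long a hypothetical 50 W playback saving takes to recoup a $20
# wall meter at $0.12/kWh. Both figures are illustrative assumptions.
meter_cost_usd = 20.0
saving_kw = 0.050        # assumed 50 W saving while playing video
price_per_kwh = 0.12     # assumed electricity price

hours_to_break_even = meter_cost_usd / (saving_kw * price_per_kwh)
print(f"{hours_to_break_even:.0f} hours of playback")  # ~3333 hours
```

In other words, the meter is for satisfying curiosity, not for saving money.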



          • #45
            Originally posted by AndyChow View Post

            I really need to buy a wall watt meter; I see they're about $20. This has got me really curious.

            At least with a GPU encoder ASIC, I understand: for streaming or screen recording, you don't want to use the CPU. For decoding, you're mostly watching a movie, so it shouldn't matter.

            I've tested the decoder a lot these last few days with my Radeon RX 480, and it seems to work fine. When I had a Radeon HD 6850, I remember that UVD3 would often show a green bar at the bottom, fail to play the video at all, randomly crash, or have lag and/or artifacts, and switching to software decode always fixed the problem. I'm not sure whether it was the VLC implementation or the decoder version, but everything seems to "just work" now.

            If I actually measure real power draw, I'll report back my at-the-wall numbers.
            The green corruption was a VLC bug: a naive assumption about how the decoder pipeline worked. VLC fixed it.



            • #46
              Originally posted by AndyChow View Post
              At least with a GPU encoder ASIC, I understand: for streaming or screen recording, you don't want to use the CPU. For decoding, you're mostly watching a movie, so it shouldn't matter.
              Depends on whether you want to watch a movie with or without background noise: 40 fewer watts of extra heat to dissipate can easily make the difference between the fan spinning fast or not.

