
Work-In-Progress Porting Of GCN 1.0/1.1 UVD To AMDGPU DRM Driver


  • #31
    Originally posted by starshipeleven View Post
    And this has any bearing on what I was talking about... because?
    Because people complained about DAL, when it's just the tip of the iceberg. There are tons of magic numbers in AMD's open-source graphics code.

    • #32
      Originally posted by geearf View Post

      Yes it is; display code, though, is not.
      The move is unlikely to happen as long as it results in a severely reduced feature set. If DC/DAL is not going in, these features would need to be ported by someone, most likely a non-AMD dev.

      • #33
        Originally posted by timofonic View Post
        Because people complained about DAL, when it's just the tip of the iceberg. There are tons of magic numbers in AMD's open-source graphics code.
        FYI: I was talking about adding/removing/validating headers on binary blobs loaded by the driver to run the decoder/encoder subsystem, not DAL.
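The kind of header validation being described can be sketched as follows. Note that the header format, magic value, and field names here are invented purely for illustration; they are not the real AMDGPU firmware blob layout:

```python
import struct

# Hypothetical illustration (NOT the real AMDGPU format): a firmware blob
# prefixed with a small header -- a 4-byte magic, a 4-byte version, and a
# 4-byte payload length -- which the loader validates before handing the
# payload to the decoder/encoder block.
HEADER_FMT = "<4sII"           # magic, version, payload_len (little-endian)
HEADER_SIZE = struct.calcsize(HEADER_FMT)
MAGIC = b"UVD\x00"             # made-up magic value for this sketch

def validate_blob(blob: bytes) -> bytes:
    """Check the header and return the raw payload, or raise ValueError."""
    if len(blob) < HEADER_SIZE:
        raise ValueError("blob shorter than header")
    magic, version, payload_len = struct.unpack_from(HEADER_FMT, blob)
    if magic != MAGIC:
        raise ValueError("bad magic")
    if len(blob) - HEADER_SIZE != payload_len:
        raise ValueError("payload length mismatch")
    return blob[HEADER_SIZE:]

# Build a well-formed blob, then strip its header back off.
payload = b"\x90" * 16
blob = struct.pack(HEADER_FMT, MAGIC, 1, len(payload)) + payload
assert validate_blob(blob) == payload
```

The point of such checks is only sanity: the driver rejects a truncated or mislabeled blob before programming the hardware with it.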

        • #34
          Originally posted by nanonyme View Post

          The move is unlikely to happen as long as it results in a severely reduced feature set. If DC/DAL is not going in, these features would need to be ported by someone, most likely a non-AMD dev.
          Again, display code does not matter for these older generations of cards. Whether we have full functionality depends not on it getting merged but on devs (AMD or others) porting stuff over from radeon, as in this very article.

          • #35
            Originally posted by PuckPoltergeist View Post
            Even Jaguar-based systems will do this. Excavator really doesn't have a problem with this.
            Jaguar cores may handle it most of the time, but a High-profile, CABAC-encoded h.264 Full HD file may bog them down on frames that need heavy post-processing (I don't mean your run-of-the-mill h.264 files ripped off a BD).

            • #36
              Am I the only one having trouble following this thread?

              • #37
                AMDGPU should really support all GCN cards for all features, as there are several laptops with two graphics chips from different generations. Mixing the radeon and amdgpu drivers is not a good idea.

                • #38
                  Originally posted by Kano View Post
                  AMDGPU should really support all GCN cards for all features, as there are several laptops with two graphics chips from different generations. Mixing the radeon and amdgpu drivers is not a good idea.
                  Well, sometimes, maybe. But most laptops I've seen that come with two GPUs had one r600 and one SI chip, so in that case, and I'd suspect that's most of them, you'd want both cards on the radeon driver. No doubt in my mind that's the more common scenario.
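As a quick way to see which kernel driver each GPU on such a hybrid laptop is actually bound to, a small sysfs scan works. This is a sketch assuming a Linux system with the standard PCI sysfs layout; on anything else it just returns an empty mapping:

```python
import os

def gpu_drivers(pci_root="/sys/bus/pci/devices"):
    """Map each display-class PCI device to its bound kernel driver.

    Reads sysfs, so this only works on Linux; returns {} elsewhere.
    """
    result = {}
    if not os.path.isdir(pci_root):
        return result
    for dev in os.listdir(pci_root):
        class_path = os.path.join(pci_root, dev, "class")
        driver_link = os.path.join(pci_root, dev, "driver")
        try:
            with open(class_path) as f:
                pci_class = f.read().strip()
        except OSError:
            continue
        # 0x03xxxx is the PCI display-controller base class.
        if pci_class.startswith("0x03") and os.path.islink(driver_link):
            result[dev] = os.path.basename(os.readlink(driver_link))
    return result

# On a hybrid laptop this might print something like
# {'0000:00:01.0': 'radeon', '0000:01:00.0': 'amdgpu'}
print(gpu_drivers())
```

`lspci -k` gives the same answer interactively; the point is that both chips showing up under different drivers is exactly the mixed-driver situation being discussed.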

                  • #39
                    Originally posted by mitch074 View Post

                    Try dropping CPU use from 50% to 3-5%; that's a 20 to 40 W saving. Another use case is a media PC, mini PC, or cheap laptop where the CPU is a low-frequency Excavator that just can't handle it on the CPU alone.
                    The last mini-PC I had that had trouble handling some HD content had an Atom 330, a dual-core Atom running at 1.6 GHz. So yes, for that I used the GPU accelerator (nVidia Ion). But that's a 10-year-old platform with the worst type of processor. I just tried on my current PC with a 1080p 60 fps video: CPU use went from 19% to 10%, but my GPU went from 300 MHz to 1237 MHz, and from 11-18 W to 30-60 W.

                    • #40
                      Originally posted by AndyChow View Post

                      The last mini-PC I had that had trouble handling some HD content had an Atom 330, a dual-core Atom running at 1.6 GHz. So yes, for that I used the GPU accelerator (nVidia Ion). But that's a 10-year-old platform with the worst type of processor. I just tried on my current PC with a 1080p 60 fps video: CPU use went from 19% to 10%, but my GPU went from 300 MHz to 1237 MHz, and from 11-18 W to 30-60 W.
                      I'm not sure how you could check, but if you play back such a video, CABAC-encoded and with native h.264 post-processing not disabled (VLC, for example, lets you control this), you would probably get dropped frames on detail-heavy scenes. OTOH, I've yet to own a Jaguar-based PC; I just estimated that such a core, with its low maximum clock speed, would have trouble with such a video.
                      Still, thank you for demonstrating my original point: using the accelerator shaved 40 W off. That's more than a modern laptop uses under moderate load.
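Putting the two sets of numbers from this exchange side by side (figures quoted from the posts, not measured here) shows why the same feature can be a net win on a power-limited laptop and a net loss on a desktop whose GPU clocks up to decode:

```python
# Desktop case reported in this thread: CPU use drops 19% -> 10%, but GPU
# power rises from roughly 11-18 W at idle clocks to 30-60 W while decoding.
gpu_extra_min = 30 - 18   # best case: the GPU draws 12 W more
gpu_extra_max = 60 - 11   # worst case: the GPU draws 49 W more

# Laptop/mini-PC case argued earlier: dropping CPU use from ~50% to 3-5%
# is claimed to save on the order of 20 to 40 W.
cpu_saving_min, cpu_saving_max = 20, 40

print(f"desktop GPU power cost: +{gpu_extra_min} to +{gpu_extra_max} W")
print(f"laptop CPU power saving: -{cpu_saving_min} to -{cpu_saving_max} W")
```

So both posters can be right at once: the hardware decoder saves tens of watts where the CPU is the bottleneck, and costs tens of watts where a desktop GPU has to leave its low-power state to run it.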
