New developer guy with some questions

  • #11
    I see, it makes sense in a way, but it still looks like a big mess (as in, a lot of complication) just to support a new chip. Then again, graphics card driver programming is not for the faint of heart either, so it's probably not so bad.

    I can't wait for everything to just go through Gallium3D; I believe it will be a major revolution in Linux graphics.

    As for video decoding, as GPUs become more and more general-purpose, I don't think it will be that hard to implement video acceleration using complex shaders. I don't think Gallium3D would have to be modified much, if at all... But that discussion belongs in another topic.
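
    Just to give a sense of why shaders can cover at least part of the job: the colour-space conversion stage of playback is already expressible as an ordinary fragment shader. A toy sketch with the usual BT.601-style coefficients (the names are mine, not from any real driver, and real decode needs IDCT and motion compensation on top of this):

    Code:
    /* Colour conversion only: sample the Y, U and V planes and
     * combine them into an RGB pixel. Old-style GLSL, embedded
     * as a C string the way GL code typically carries shaders. */
    static const char *yuv_to_rgb_fs =
        "uniform sampler2D y_tex, u_tex, v_tex;\n"
        "void main() {\n"
        "    float y = texture2D(y_tex, gl_TexCoord[0].st).r;\n"
        "    float u = texture2D(u_tex, gl_TexCoord[0].st).r - 0.5;\n"
        "    float v = texture2D(v_tex, gl_TexCoord[0].st).r - 0.5;\n"
        "    gl_FragColor = vec4(y + 1.402 * v,\n"
        "                        y - 0.344 * u - 0.714 * v,\n"
        "                        y + 1.772 * u, 1.0);\n"
        "}\n";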

    Thanks very much for all the info and the fast responses!

    Comment


    • #12
      Actually, video decoding is already being implemented on top of Gallium3D for the nouveau driver, which works on NVIDIA cards.

      Comment


      • #13
        Hello everybody!

        I studied the xf86-video-ati driver a little in order to learn how hardware acceleration is done,
        but I don't understand where these acceleration hooks are called in the stack you described.

        Originally posted by mdias View Post
        Code:
        Application --> OGL API --> Mesa --> Xorg --|
                                     |              |--> DRM (kernel driver?)
                                     |-----> DRI----|
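
        From poking around the radeon code, the acceleration entry points look like EXA hooks that the DDX driver registers with the X server. A stripped-down sketch of the pattern I found (the hook signatures and the exaDriverAlloc/exaDriverInit calls are the real EXA API; the My* names and empty bodies are placeholders, not the actual radeon code):

        Code:
        #include "exa.h"   /* from the xorg-server SDK */
        
        /* EXA calls these when a core X solid fill should be accelerated. */
        static Bool MyPrepareSolid(PixmapPtr pPix, int alu, Pixel planemask, Pixel fg)
        {
            /* set up GPU state for the fill; return FALSE to fall back to software */
            return FALSE;
        }
        
        static void MySolid(PixmapPtr pPix, int x1, int y1, int x2, int y2)
        {
            /* emit the fill command for one rectangle */
        }
        
        static void MyDoneSolid(PixmapPtr pPix)
        {
            /* flush the command buffer */
        }
        
        static Bool MyExaInit(ScreenPtr pScreen)
        {
            ExaDriverPtr exa = exaDriverAlloc();
        
            exa->exa_major    = EXA_VERSION_MAJOR;
            exa->exa_minor    = EXA_VERSION_MINOR;
            exa->PrepareSolid = MyPrepareSolid;
            exa->Solid        = MySolid;
            exa->DoneSolid    = MyDoneSolid;
            /* ...plus PrepareCopy/Copy, Composite hooks, memory layout fields... */
        
            return exaDriverInit(pScreen, exa);
        }
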
        Does someone know where in the stack above these acceleration hooks get called?
        Thanks for your answers.
        Kind regards

        Comment
