AMD's UVD2-based XvBA Finally Does Something On Linux

  • Kano, you're taking the thread off topic again. This is an XvBA thread, which means it's supposed to degenerate into ATI vs NVidia or open source vs binary driver slugging like every other thread here. Discussion of XvBA belongs in the NVidia threads.

    Originally posted by TeoLinuX View Post
    I really dislike nVidia's lack of support to OSS developers. I think intel is giving a great support (let's see if they continue like that with Larrabee!). About AMD: I'm puzzled. They support OSS but I'd like to see them more convinced, maybe putting now and then their hands on the OSS drivers code.
    You're kidding, right? We have three developers working full time on the open source drivers, although one of them (Cooper) works on supporting server customers with OSS drivers as top priority and you generally don't see that work.

    Alex, Cooper and Richard got the 6xx/7xx 3D engine working in the first place, Richard wrote most of the initial 3D driver code, then all of them worked on that code to the point where community developers were comfortable jumping in and fixing bugs themselves. Look through the commit history for mesa/src/mesa/drivers/dri/r600 to get a better idea.

    Alex has been a major contributor to the kernel modesetting code in drm. Check the recent commit messages there, including Linus's complaint about how many changes were being made.

    Cooper has been working on video acceleration over Gallium3D.

    Richard just finished getting flow control instructions working on 6xx/7xx shader programs, and is cleaning up that code for release.

    Alex's work is visible all over the driver stack - drm, radeon and radeonhd drivers, and mesa, and has been the primary developer adding support for new GPUs to both radeon and radeonhd. You just need to look at the commit logs. He also wrote most of the EXA and Textured Video code used in both radeon and radeonhd.

    Sorry if this sounds like a rant, I was just a bit shocked that anyone could make a comment like that after all the work Alex, Richard and Cooper have done.
    Last edited by bridgman; 17 November 2009, 12:58 PM.



    • Originally posted by energyman View Post
      why is it inferior? I don't miss anything from nvidia. Games work
      There's a world of difference between "working," "working correctly" and "working well." See this other dude's thread about Oblivion still not working correctly.

      X is fast (with patched X to reverse the damage)
      To reverse the damage? The damage comes *with* the patch. Garbage upon opening windows.

      video works well
      Yeah sure. Especially V-Sync and VDPAU in mplayer.

      What is 'missing'?
      Video, V-Sync and games. Oh wait, you said those work. Bummer.

      Btw, reaaaallly smart move supporting a company that does not support open source solutions. Really. *applaud*.
      I'm not interested in open source. I'm interested in performance and features.



      • Originally posted by RealNC View Post
        To reverse the damage? The damage comes *with* the patch. Garbage upon opening windows.
        Ancient history, dude

        The original (~4 year old) patch that was removed earlier in the year eliminated the readback delays but left the new memory uninitialized, which introduced corruption on Intel and other GPUs.

        One of our fglrx devs (Felix) created a new patch that properly initialized the new window area, eliminating the corruption and the need for a redundant readback of window contents from video memory to system memory. The patch is at post #228 on the original Ubuntu bug report.
        Last edited by bridgman; 17 November 2009, 12:11 PM.



        • and Oblivion is a linux game? yes?
          if not - maybe, just maybe it is a wine problem?



          • Originally posted by bridgman View Post
            Sorry if this sounds like a rant, I was just a bit shocked that anyone could make a comment like that after all the work Alex, Richard and Cooper have done
            John, it wasn't my intention to offend anyone; in previous posts I expressed my appreciation for the guys involved, even if I don't appreciate the result *so far*.
            I admit I didn't know about such a direct involvement of AMD in OSS drivers!

            But this makes me wonder: isn't it a duplicate effort? I'm just asking!
            I mean, referring only to the driver part (example: the shaders, not the kernel side), are they coding something completely different from the blob? Aren't they aware/allowed to borrow any part from the blob?

            Will OSS drivers always be "a son of a lesser god"? Or, in case the AMD guys + the community create a piece of code that works much better than the blob... will it ever enter the blob (provided it fits)? I don't think so, because of the GPL licence.

            This post is just about my doubts. Not implying anything. I'm just wondering, because it sounds so weird TO ME that AMD funds 2 different driver projects in parallel. That's all.

            EDIT: sorry for being off topic. This is my last post on driver quality here. It was only that I associated the lack of XvBA on my 3650HD with the state of the drivers.
            Last edited by TeoLinuX; 17 November 2009, 12:28 PM.



            • Where is it? I only know about this one:

              https://bugs.launchpad.net/ubuntu/in...er/+bug/254468

              Edit:
              Never mind, found it. Will try ASAP.
              Last edited by RealNC; 17 November 2009, 12:36 PM.



              • Originally posted by energyman View Post
                and Oblivion is a linux game? yes?
                if not - maybe, just maybe it is a wine problem?
                NVidia - the way it's meant to be played



                • Originally posted by RealNC View Post
                  NVidia - the way it's meant to be played
                  Which may easily mean that it is, in fact, a WINE bug.
                  I don't know if you realize this, but wine is written around the nvidia blob, the blob features, AND the blob bugs...

                  Have you ever considered the possibility that a DEFECT in the nvidia driver resulted in wine WORKAROUNDS that cause the program to break on drivers that behave CORRECTLY?

                  In other words, the problem here is very likely to be the NVIDIA BLOB.


                  Anyhow, you seem to like nvidia an awful lot -- your choice.
                  This thread isn't about nvidia, so maybe you might want to concentrate more on an nvidia-related thread and less about this one which is actually quite IRRELEVANT to YOU.



                  • Originally posted by TeoLinuX View Post
                    But this makes me wonder: isn't it a duplicate effort? I'm just asking! I mean, referring only to the driver part (example: the shaders, not the kernel side), are they coding something completely different from the blob? Aren't they aware/allowed to borrow any part from the blob?
                    They are totally different code bases, although the open source devs often talk to the proprietary driver devs for info and examples of how to program the hardware. The proprietary drivers are there for one reason - to allow code sharing across multiple OSes so we don't have to duplicate development efforts across Windows, MacOS, Linux and others. Using common code gives Linux users access to 3D engine features and performance at the same time as other OSes, at the cost of making a driver which is big and is not a "natural" fit with the evolving Linux framework. All of the binary drivers share these pros and cons to a large extent.

                    The open source drivers are perhaps 100 times smaller than the binary drivers, but everything is written natively for the X/DRI environment used in Linux. That gives them a *different* set of advantages, primarily being able to evolve quickly with the common framework, and (increasingly) to be used in the development of new framework features. They also tend to "automatically" work with new kernel and X versions, since the developers implementing the X and kernel changes are probably *using* those drivers and will (a) immediately see problems, and (b) be able to immediately fix them.

                    Originally posted by TeoLinuX View Post
                    Will OSS driver always be "a son of a lesser god"? Or, in case AMD guys + the community create a piece of code that works much better than the blob... will it ever enter the blob (provided it fits)? I don't think so, because of the GPL licence.
                    The open source code is largely X11 licensed, like most of the Xorg drivers, and the X11 license is a BSD derivative, not GPL. We can push code both ways (fglrx to OSS, OSS to fglrx) but in general we don't because the underlying designs are so different.

                    Originally posted by TeoLinuX View Post
                    This post is just about my doubts. Not implying anything. I'm just wondering, because it sounds so weird TO ME that AMD funds 2 different driver projects in parallel. That's all.
                    Some markets require the performance and features which are currently only available in proprietary drivers. Other markets place a high value on drivers being open source, particularly since availability of open source drivers gives Xorg developers the ability to push ahead with dramatic improvements to the common graphics stack.

                    I sometimes think the biggest problem with open source is that users can see what is being done, and are forced to realize how big and time-consuming graphics work really is. With proprietary drivers most of the work can be done "in secret" so you only see the finished result and think "that must have been easy".
                    Last edited by bridgman; 17 November 2009, 12:56 PM.



                    • Originally posted by droidhacker View Post
                      Which may easily mean that it is, in fact, a WINE bug.
                      I don't know if you realize this, but wine is written around the nvidia blob, the blob features, AND the blob bugs...

                      Have you ever considered the possibility that a DEFECT in the nvidia driver resulted in wine WORKAROUNDS that cause the program to break on drivers that behave CORRECTLY?

                      In other words, the problem here is very likely to be the NVIDIA BLOB.


                      Anyhow, you seem to like nvidia an awful lot -- your choice.
                      This thread isn't about nvidia, so maybe you might want to concentrate more on an nvidia-related thread and less about this one which is actually quite IRRELEVANT to YOU.
                      Then don't ask me questions. Obviously if you do, I have to answer. Your choice.

