
Nouveau Developer Working On OpenGL Extension To Help With Reverse-Engineering


  • #1
    Phoronix: Nouveau Developer Working On OpenGL Extension To Help With Reverse-Engineering

    Longtime open-source NVIDIA "Nouveau" driver developer Ilia Mirkin is drafting a new OpenGL extension proposal for helping out in driver reverse-engineering efforts...

    http://www.phoronix.com/scan.php?pag...bug_operations
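A debug extension like the one Mirkin is drafting would presumably be advertised through the driver's extension string, so applications (and the WebGL question in #2) hinge on whether it is exposed at all. A minimal sketch of the standard gating check, outside any GL context; the matching logic is the classic whole-token scan, and whole-token matching matters because one extension name can be a prefix of another (the names used in the test are just examples, not the proposed extension's actual name):

```c
#include <assert.h>
#include <string.h>

/* Return 1 if `name` appears in the space-separated extension list
 * `ext_list` as a whole token. A plain strstr() is not enough:
 * "GL_KHR_debug" must not match inside "GL_KHR_debug_group". */
static int has_gl_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        int starts_ok = (p == ext_list) || (p[-1] == ' ');   /* token start */
        int ends_ok   = (p[len] == '\0') || (p[len] == ' '); /* token end   */
        if (starts_ok && ends_ok)
            return 1;
        p += len; /* prefix hit only; keep scanning */
    }
    return 0;
}
```

In a real application the list would come from `glGetString(GL_EXTENSIONS)` (legacy) or be enumerated with `glGetStringi(GL_EXTENSIONS, i)` on core profiles, and any register-poking debug path would be compiled out or skipped when the check fails.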

  • #2
    Will this be an optional extension for now, with hopes that it is later introduced into the full OpenGL spec?

    Will this be included in WebGL? If you're able to poke random registers on the GPU and see the results, it sounds like you're increasing your attack surface quite a lot.



  • #3
    Since Red Hat has so much power in the project, now with IBM behind it, why don't they try to push for better support from NVIDIA? At least get some POWER support, or simply say "we will stop supporting NVIDIA in our servers/workstations and will only support AMD and Intel, since we only work with open source."


  • #4
    Originally posted by andre30correia View Post
    Since Red Hat has so much power in the project, now with IBM behind it, why don't they try to push for better support from NVIDIA? At least get some POWER support, or simply say "we will stop supporting NVIDIA in our servers/workstations and will only support AMD and Intel, since we only work with open source."
    Knowing NVIDIA, they can wait it out far longer than Red Hat or IBM or anyone else can. This type of strategy won't succeed.


  • #5
    Both AMD's and NVIDIA's proprietary drivers are considerably more buggy than any open-source graphics driver, and both have severe open bugs that are -decades- old... I mean, -anybody- who's used NVIDIA's driver with multiple monitors on Linux knows exactly what I'm talking about, and that's not even counting all the non-standard behavior in OpenGL...


  • #6
    Originally posted by starshipeleven View Post
    Knowing NVIDIA, they can wait it out far longer than Red Hat or IBM or anyone else can. This type of strategy won't succeed.
    If Intel is actually going to go through with their GPU plans, wouldn't that put even more pressure on NVIDIA, though?

    Especially since Intel will probably prioritize server needs over gamer needs.


  • #7
    Originally posted by Aeder View Post
    If Intel is actually going to go through with their GPU plans, wouldn't that put even more pressure on NVIDIA, though?
    Yes, but that's a different thing. It's actual competition, from a company with A LOT of money to burn and A LOT to lose. Intel literally went shopping and snatched people from all over the place: AMD, NVIDIA, Qualcomm.

    It's not just some downstream vendor saying "I'm not buying your hardware, let's see who lasts longer".


  • #8
    Originally posted by andre30correia View Post
    Since Red Hat has so much power in the project, now with IBM behind it, why don't they try to push for better support from NVIDIA? At least get some POWER support, or simply say "we will stop supporting NVIDIA in our servers/workstations and will only support AMD and Intel, since we only work with open source."
    Red Hat (now IBM) supports LTS distros that rarely get updated kernels, so using the binary driver doesn't cause any issues with updates there.


  • #9
    Originally posted by duby229 View Post
    I mean, -anybody- who's used NVIDIA's driver with multiple monitors on Linux knows exactly what I'm talking about
    Not really.

    I have three monitors, and I've had at least two since before I switched to Linux. I've never not used multi-monitor NVIDIA on Linux, so my expectations for Linux GUIs as a whole were set by it.

    The only thing I can guess at is the tearing problems, which I'm told are an NVIDIA-specific issue.


  • #10
    Originally posted by ssokolow View Post

    Not really.

    I have three monitors, and I've had at least two since before I switched to Linux. I've never not used multi-monitor NVIDIA on Linux, so my expectations for Linux GUIs as a whole were set by it.

    The only thing I can guess at is the tearing problems, which I'm told are an NVIDIA-specific issue.
    That too, but TwinView specifically is buggy. The most common bug: switching to a VT with Ctrl-Alt-F<n> leaves just a blank screen with only the amber light; it only happens with TwinView and is decades old. Another common bug is that most games see both monitors as one screen and stretch across both of them, also a decades-old bug. And another is that it can only sync to vblank on the first monitor; a workaround exists, but it's also a decades-old bug.

    EDIT: It's exactly why I've always wondered how a Linux user could be an NVIDIA fanboy... It never made sense to me...

    EDIT: Oh yeah, and the XRender acceleration is so bad, how could I forget that. It's terrible, so, sooo bad. NVIDIA needed to spend time learning how to accelerate XRender better a long time ago.
    Last edited by duby229; 04-15-2019, 03:56 PM.
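The vblank workaround alluded to above is, as far as I know, the NVIDIA proprietary driver's sync environment variables documented in its Linux README; a sketch, assuming an example display device name of DFP-1 (the actual names on a given system come from the driver's X log or nvidia-settings):

```shell
# Enable sync-to-vblank and pick WHICH display device to sync to,
# instead of the driver's default (first) monitor.
# "DFP-1" is an example name; common forms are DFP-0, CRT-0, DVI-I-1, ...
export __GL_SYNC_TO_VBLANK=1
export __GL_SYNC_DISPLAY_DEVICE=DFP-1
glxgears   # any GL app launched from this shell inherits the setting
```

Being per-process environment configuration rather than a driver fix, it has to be repeated for every application, which is why it counts as a workaround rather than a solution.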
