Nouveau Developer Working On OpenGL Extension To Help With Reverse-Engineering


  • #11
    Originally posted by The_Analyst View Post
    Will this be an optional extension for now, with hopes that it is later introduced into the full OpenGL spec?

    Will this be included in WebGL? If you're able to poke random registers on the GPU and see the results, it sounds like you're increasing your attack surface quite a lot.
    It's not going to be in the GL spec. No one except Mesa will be interested in it, because the other drivers are all provided by their own manufacturers - there aren't many reasons for them to add reverse-engineering support into their drivers.

    It sounds like they might not even register it at Khronos, and just keep it completely internal within Mesa.



    • #12
      Originally posted by smitty3268 View Post
      No one except Mesa will be interested in it, because the other drivers are all provided by their own manufacturers - there aren't many reasons for them to add reverse-engineering support into their drivers.
      On the other hand, among the big desktop/workstation/server/HPC players, Intel and AMD have nothing to lose, given that they are currently pushing for open-source drivers and even contributing dev time to that.
      So they don't have any incentive to veto the extension if the open-source crowd tries to push it aggressively into an official OpenGL version.

      Only Nvidia (and the countless embedded GPU manufacturers) have something to lose from it.

      But then, even if it miraculously gets accepted into 4.7, it won't necessarily find its way into the next Nvidia driver blobs:
      who will pay attention to whether OpenGL 4.7 is supported, now that all the attention has shifted to Vulkan?



      • #13
        This extension would really only be useful for letting a driver developer pass tokens from GLSL through to the driver's shader compiler, so that the shader compiler can then emit arbitrary instructions in place of that token. This is not something that is usable in a released version of any software -- it relies on a hacked-up shader compiler to operate. It just facilitates providing arguments to the operation being debugged and saving off its return value.

        Hopefully the driver developer then can more easily construct GLSL shaders that help figure out or verify what a particular operation does.
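
        As a rough sketch of what that workflow could look like from the application side (entirely hypothetical -- the __probe_op() name and the substitution behaviour of the hacked-up compiler are assumptions, not part of any published spec):

        /* Compile a compute shader containing a placeholder token, feed it an
         * argument through an SSBO, and read the result back.  A hacked-up
         * shader compiler would replace __probe_op() with the raw hardware
         * instruction being reverse-engineered.  Assumes a current GL 4.3+
         * context and loaded entry points (e.g. via GLEW). */
        #include <GL/glew.h>
        #include <stdio.h>

        static const char *src =
            "#version 430\n"
            "layout(local_size_x = 1) in;\n"
            "layout(std430, binding = 0) buffer Data { uint arg; uint result; };\n"
            "void main() {\n"
            "    result = __probe_op(arg);\n"   /* placeholder for the probed op */
            "}\n";

        void run_probe(GLuint input)
        {
            GLuint prog = glCreateProgram();
            GLuint cs = glCreateShader(GL_COMPUTE_SHADER);
            glShaderSource(cs, 1, &src, NULL);
            glCompileShader(cs);
            glAttachShader(prog, cs);
            glLinkProgram(prog);

            /* Argument in, return value out, as described above. */
            GLuint buf;
            GLuint data[2] = { input, 0 };
            glGenBuffers(1, &buf);
            glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, buf);
            glBufferData(GL_SHADER_STORAGE_BUFFER, sizeof(data), data, GL_DYNAMIC_READ);

            glUseProgram(prog);
            glDispatchCompute(1, 1, 1);
            glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);

            glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0, sizeof(data), data);
            printf("probe(0x%x) = 0x%x\n", input, data[1]);
        }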



        • #14
          Originally posted by duby229 View Post
          That too, but specifically twinview is buggy. The most common bug is that Ctrl-Alt-FKey to switch to a VT results in a blank screen with just the amber light; it only happens with twinview and is decades old.
          I don't remember ever encountering that (I've been using nVidia since around 2002) but then I only ever Ctrl+Alt+FKey when my KWin borks from being left open for too many weeks on end and needs a forced restart, so I'm probably using it the way their testers are.

          Originally posted by duby229 View Post
          Another common bug is that most games see both monitors as one screen and stretch across both of them, also a decades-old bug.
          Now that I do know but I never realized it was nVidia-specific. Given that it only happens in certain games, I just assumed that some games weren't tested on multi-monitor desktops and would break on any of them... I then worked around it by combining windowed mode at the desired monitor's native resolution with a KWin preset to invent that "borderless windowed" fullscreen mode before it became a standard option.

          Originally posted by duby229 View Post
          And another common bug is that it can only sync to vblank on the first monitor; although a work-around exists, it's also a decades-old bug.
          Not sure if I'm familiar with that. How is "first monitor" defined and what's the workaround?

          Originally posted by duby229 View Post
          EDIT: Oh yeah, and the xrender acceleration is so bad, how could I forget that. It's terrible, so, sooo bad. nVidia needed to spend time learning how to accelerate xrender better a long time ago.
          Fair enough.



          • #15
            Originally posted by ssokolow View Post
            Not sure if I'm familiar with that. How is "first monitor" defined and what's the workaround?
            The work-around is: don't use twinview, use zaphod instead. Not sure if that's still an option though.... Last time I tried zaphod it didn't work with the OSS drivers....

            EDIT: The first monitor is usually defined as "Screen0", but really that's just a variable name which you can change to whatever you want.
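
            Roughly, zaphod mode just means giving each monitor its own X screen in xorg.conf, something like the sketch below (the identifiers and BusID are only placeholders for a single dual-head card, so adjust to your setup):

            # Two independent X screens (zaphod); games then only see the screen they start on.
            Section "Device"
                Identifier "Card0"
                Driver     "nvidia"
                BusID      "PCI:1:0:0"
                Screen     0
            EndSection

            Section "Device"
                Identifier "Card1"
                Driver     "nvidia"
                BusID      "PCI:1:0:0"
                Screen     1
            EndSection

            Section "Screen"
                Identifier "Screen0"
                Device     "Card0"
            EndSection

            Section "Screen"
                Identifier "Screen1"
                Device     "Card1"
            EndSection

            Section "ServerLayout"
                Identifier "Layout0"
                Screen 0 "Screen0" 0 0
                Screen 1 "Screen1" RightOf "Screen0"
            EndSection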
            Last edited by duby229; 18 April 2019, 10:12 AM.



            • #16
              Originally posted by duby229 View Post
              Another common bug is that most games see both monitors as one screen and stretch across both of them, also a decades-old bug.
              I've been trying to figure out how to do this as a feature... but none of the games I play seem to do this. I have to use split screen on one monitor while the other just sits there with the desktop on it.

              Me <- Linux/Nvidia fan boy. (not really, but I go with whichever card gives me more performance, currently RTX2080 oc'd like crazy)



              • #17
                Originally posted by skeetre View Post
                I've been trying to figure out how to do this as a feature... but none of the games I play seem to do this. I have to use split screen on one monitor while the other just sits there with the desktop on it.
                Me <- Linux/Nvidia fan boy. (not really, but I go with whichever card gives me more performance, currently RTX2080 oc'd like crazy)
                So what you're saying is you don't care about compatibility or standards compliance or code compliance or desktop performance, all you want is FPS.... Ok that's fine I guess, stupid, but fine....

                EDIT: What you -REALLY- should set up is a side by side comparison, one with an AMD setup and one with an nVidia setup. I dare you, please do it... It'll blow your mind how shitty nVidia graphics look when you have something that's actually correct sitting right next to it....

                But FPS is king right?
                Last edited by duby229; 21 May 2019, 08:23 AM.



                • #18
                  I really don't care much about compatibility or standards compliance or code compliance. As long as it works for me... those would be nice to have, but mainly I just care that it works for me. I do want desktop performance, and FPS. I like to benchmark as much, if not more, than I game, so yeah, FPS is king.

                  I have set up an AMD RX 570 I got used from a guy at work on Linux, and it was a pain in the butt compared to installing the Nvidia binary driver. I got fed up with it and switched back to my GTX 770 for the Linux system and put the AMD card in an older system to use in my guest room. I've heard/read people say that AMD looks better, but for the life of me I can't tell the difference most of the time, and when I can, the Nvidia card looks and performs better - to me. So I didn't need to do a side by side comparison: I used the same system and same monitor with a 770 and an RX 570, and the 770 looked better, was easier to set up, and didn't crash like the 570 did. Now on Windows, the RX 570 way outperformed the Nvidia card.
                  Last edited by skeetre; 22 May 2019, 08:28 AM.



                  • #19
                    Originally posted by skeetre View Post
                    I have set up an AMD RX 570 I got used from a guy at work on Linux, and it was a pain in the butt compared to installing the Nvidia binary driver. [...] So I didn't need to do a side by side comparison: I used the same system and same monitor with a 770 and an RX 570, and the 770 looked better, was easier to set up, and didn't crash like the 570 did.
                    Get two monitors and set them side by side and do it again. You -MOST- definitely won't say that bullshit again....

                    EDIT: I don't understand how you could have any trouble with an AMD card: you install the card, you boot -any- Linux distro and -BOOM-, it just works.... How could that possibly be harder?
                    Last edited by duby229; 22 May 2019, 09:42 AM.



                    • #20
                      I don't have the time or care to appease you. I don't have two matching monitors to test with. I had problems with the AMD card, not the Nvidia card. I like how my Nvidia card looks. I'm happy with it. As I said, it looks better -to me-. Why do you care if I prefer Nvidia? I'm sure the vast majority of the tech community, not Linux, but gamers, developers, IT, etc., would prefer an RTX 2080 over an RX 570 also, not that it matters, because -I- prefer it.

                      But, btw, I tried a simple Google search, and I'm far from the only person having issues with the AMD RX 570 and Linux.
