GNOME 3.37.2 Released As Another Step Towards GNOME 3.38


  • #11
    Originally posted by You- View Post
    Unfortunately it is difficult to figure out which one is default unless they add in some speed tests or heuristics.
    Or a user-selectable list that tells the system which one is the default... but it's GNOME, so the UI must not be cluttered with potentially useful options unless even the least technical user understands and routinely uses them.


    Originally posted by You- View Post
    On my PC the integrated GPU only has DisplayPort and my Monitor is HDMI, so the discrete GPU is used as the default. I can't reasonably expect the feature to work better - if they used the name "more powerful", it wouldn't be correct. If they use discrete/not discrete, it would fail at the situation where both GPUs are discrete.
    They aren't supposed to do any of that. They could simply ask the user or, at the very least, let the user choose through some "Advanced" configuration window.

    Originally posted by You- View Post
    Besides, the naming is from the Freedesktop specification which they implemented. If it can be fixed it needs to be fixed there.
    The Freedesktop specification actually explains why they ended up with such a brain-damaged name:

    "
    If true, the application prefers to be run on a more powerful discrete GPU if available, which we describe as “a GPU other than the default one” in this spec to avoid the need to define what a discrete GPU is and in which cases it might be considered more powerful than the default GPU.
    "
    So whoever wrote the spec realized it could not bring in the concepts of "discrete" and "powerful", and worked around the problem by choosing a stupid name and binding it to the most meaningless data type one can dream of: a boolean.

    Interestingly enough, whoever implemented that didn't stop for a second to think about the best way to implement it; they simply went ahead, assuming it was a Good Thing (TM) as defined by the spec. I can't find the git repo where they keep the spec webpage, so I can't check, but I'd bet whoever wrote the spec is the same person who implemented it in GNOME...

    The real problem here is that the specification is broken by design for using a boolean. It should have been an opaque string, so that implementors could store anything meaningful to bind the app to a specific GPU. That way the property name could simply have been "PreferredGPU".
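    For illustration, this is roughly how the two approaches compare in a .desktop file. The first key is the one actually defined in the Desktop Entry specification; the second is purely hypothetical (the X- prefix only marks it as a non-standard extension), with a made-up PCI address standing in for whatever opaque identifier an implementation might use:

        # Current spec: a boolean meaning "run me on a GPU other than the default one"
        [Desktop Entry]
        Name=Some Game
        Exec=somegame
        PrefersNonDefaultGPU=true

        # Hypothetical alternative: an opaque string naming the preferred GPU
        [Desktop Entry]
        Name=Some Game
        Exec=somegame
        X-PreferredGPU=pci-0000:01:00.0

    With the string form, the launcher that reads the file decides what the identifier means, and an absent key would simply mean "use the default GPU".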
    Last edited by lucrus; 08 June 2020, 03:11 AM.

    • #12
      Originally posted by tildearrow View Post

      What if I am using a 4K monitor (and using the high performance GPU by default because the integrated one is too slow for that resolution)?

      NonDefault is the integrated one, right?!
      Yes... or rather I hope so, and I guess that's why they named the property like that: so we can swap the defaults and still have the property's name make sense and the property itself be useful, e.g. by setting the dGPU as the default and then having the non-default iGPU render the less demanding stuff and pass it on to the dGPU. If the property had been named PrefersDiscreteGPU, the name would not really make sense in such a case.

      Originally posted by lucrus View Post

      The real problem here is that the specification is broken by design for using a boolean. It should have been an opaque string, so that implementors could store anything meaningful to bind the app to a specific GPU. That way the property name could simply have been "PreferredGPU".
      So you're saying that the user should have to manually create or edit the .desktop launcher for each and every application on their system in order to hardcode their preferred GPU?

      I think a better option would be to keep the setting as a boolean that is set only as an exception, not as the rule (i.e. only for the apps that actually need to launch on a non-default GPU), and at the same time expose an interface in the settings UI for the user to choose which GPU should be the default, much like they can already do with the audio device.

      Other than that bit, I really don't find anything particularly brain-damaged in the current implementation...
      Last edited by Nocifer; 08 June 2020, 04:13 AM. Reason: Making bold statements without being 100% sure they're actually true is A Bad Thing™

      • #13
        Originally posted by Nocifer View Post

        Yes... or rather I hope so, and I guess that's why they named the property like that: so we can swap the defaults and still have the property's name make sense and the property itself be useful, e.g. by setting the dGPU as the default and then having the non-default iGPU render the less demanding stuff and pass it on to the dGPU. If the property had been named PrefersDiscreteGPU, the name would not really make sense in such a case.
        ...but who in the world would want to do that?

        Why not call it PrefersGPU as a compromise then?

        (so that it can be set to Discrete, Integrated or All ("don't care"))

        Or maybe allow setting PrefersNonDefaultGPU to Discrete? (so that it makes a bit more sense)
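
        To put it concretely, a purely hypothetical .desktop snippet (neither form exists in the current spec, which only defines PrefersNonDefaultGPU as a boolean):

            # Hypothetical: an enumerated key instead of the boolean
            # (possible values: Discrete, Integrated, All)
            PrefersGPU=Discrete

            # Hypothetical: keep the current key name but give it a value
            PrefersNonDefaultGPU=Discrete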

        • #14
          Originally posted by Nocifer View Post
          So you're saying that the user should have to manually create or edit the .desktop launcher for each and every application on their system in order to hardcode their preferred GPU?
          I'm not sure how my words could be understood that way, but no, I'm not saying that. Implementors are not users. Implementors are DE developers (GNOME developers, KDE developers and so on). They could provide users with a UI to choose the best GPU for each app, and users obviously do not need to choose one for every app: desktop files missing a "PreferredGPU" property would just use the default GPU.
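
          As a rough sketch of what I mean, assuming the hypothetical string key from above (X-PreferredGPU is not part of any spec, and the file path is just an example; the fallback behaviour is the point):

              from gi.repository import GLib

              # A launcher reads an opaque, string-valued GPU preference from a
              # .desktop file and falls back to the default GPU when the key is
              # absent. Both the key and the file path are illustrative.
              keyfile = GLib.KeyFile()
              keyfile.load_from_file("/usr/share/applications/somegame.desktop",
                                     GLib.KeyFileFlags.NONE)
              try:
                  preferred_gpu = keyfile.get_string("Desktop Entry", "X-PreferredGPU")
              except GLib.Error:
                  preferred_gpu = None  # no key: just use the default GPU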

          Originally posted by Nocifer View Post
          I think a better option would be to keep the setting as a boolean that is set only as an exception, not as the rule (i.e. only for the apps that actually need to launch on a non-default GPU), and at the same time expose an interface in the settings UI for the user to choose which GPU should be the default, much like they can already do with the audio device.
          That's another possible solution, but it only works for two GPUs, where one is the default and the other is the one preferred by some 3D app. Leaving aside the fact that it assumes the default one is the less powerful, it does not consider the case where there are more than two GPUs. Sure, it's not that common to have three GPUs, and assuming there are at most two covers 99% of use cases, but if the "specification" (not the implementation) can be made future-proof at a cost of around zero, why not go for it?

          Originally posted by Nocifer View Post
          Other than that bit, I really don't find anything particularly brain-damaged in the current implementation...
          I didn't say "brain-damaged implementation", but "brain-damaged name in the specification". It's a whole different thing.
          Last edited by lucrus; 08 June 2020, 05:17 AM.

          • #15
            Fictitious situation:
            - The computer has a CPU with an integrated GPU (i.e. an APU or some Intel variants)
            - It also has two discrete graphics cards, paired via SLI

            Question: which is the non-default card?
            - The discrete cards can be removed, the integrated one cannot, so shouldn't the integrated one be the primary, most basic card?
            - Is SLI considered one card or two?
            - Which of the two identical discrete cards is the first and which is the second? Is one of them more "default" than the other?
            - What about multi-CPU boards that have multiple iGPUs in their sockets?
            - What about different cards combined for use by one application, e.g. via Vulkan?

            If anything or anybody picks this up, it will haunt us for the next decade and leave us shaking our heads at what we were thinking back then.
            Last edited by reba; 08 June 2020, 05:05 AM.

            • #16
              Originally posted by tildearrow View Post

              What if I am using a 4K monitor (and using the high performance GPU by default because the integrated one is too slow for that resolution)?

              NonDefault is the integrated one, right?!
              I presume that, in that case, you just completely disable the integrated one, in which case the system will see a single GPU...
              Last edited by rastersoft; 08 June 2020, 05:03 AM.

              • #17
                Originally posted by rastersoft View Post

                I presume that, in that case, you just completely disable the integrated one, in which case the system will see a single GPU...
                No, because I have two monitors: one is 4K and it is the default one, the other is hooked to the cheap GPU and I use it for something else that does not need 4K but is still useful to see.

                • #18
                  Originally posted by reba View Post
                  Fictitious situation:
                  If anything or anybody picks this up, it will haunt us for the next decade and leave us shaking our heads at what we were thinking back then.
                  That's why we should let the user choose the correct GPU for the apps that need to run on a specific one. Not a boolean, but a string that identifies the GPU.

                  • #19
                    Originally posted by lucrus View Post

                    No, because I have two monitors: one is 4K and it is the default one, the other is hooked to the cheap GPU and I use it for something else that does not need 4K but is still useful to see.
                    That is possible? Wow...

                    • #20
                      Originally posted by lucrus View Post

                      No, because I have two monitors: one is 4K and it is the default one, the other is hooked to the cheap GPU and I use it for something else that does not need 4K but is still useful to see.
                      Anyway... AFAIK the framebuffers are one thing and the GPUs are another. If I'm right, you are using the framebuffer of each graphics card, but you can still use one GPU to do the hardware acceleration for both, which makes sense because less powerful GPUs consume less power.
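
                      That separation is what PRIME render offload builds on: a window can be rendered by one GPU and displayed through another. A minimal sketch, assuming a Mesa setup where DRI_PRIME selects the offload GPU (the NVIDIA proprietary driver uses __NV_PRIME_RENDER_OFFLOAD and __GLX_VENDOR_LIBRARY_NAME instead):

                          import os
                          import subprocess

                          # Ask Mesa to render this process on the secondary GPU; the window
                          # is still shown on whatever output/framebuffer it lands on.
                          env = dict(os.environ, DRI_PRIME="1")

                          # glxinfo -B prints the renderer string, so you can check which GPU
                          # actually did the work (requires the glxinfo/mesa-utils package).
                          subprocess.run(["glxinfo", "-B"], env=env)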
