LACT Linux GPU Control Panel Adds Support For Intel Graphics

  • phoronix
    Administrator
    • Jan 2007
    • 67385

    LACT Linux GPU Control Panel Adds Support For Intel Graphics

    Phoronix: LACT Linux GPU Control Panel Adds Support For Intel Graphics

    In development for several years, LACT is a Linux GPU control application for adjusting various GPU/driver settings via a convenient graphical interface. AMD and NVIDIA graphics have been supported to date, and now Intel graphics are also supported with the brand-new LACT 0.7...

    Phoronix, Linux Hardware Reviews, Linux hardware benchmarks, Linux server benchmarks, Linux benchmarking, Desktop Linux, Linux performance, Open Source graphics, Linux How To, Ubuntu benchmarks, Ubuntu hardware, Phoronix Test Suite
  • Ferrum Master
    Phoronix Member
    • Feb 2024
    • 114

    #2
    I actually noticed it a few days ago too... but... well, you don't have a working temp sensor or fan control for Arc either way. You can change the power level, though... it looks like it's working.

    • Anux
      Senior Member
      • Nov 2021
      • 1960

      #3
      Oh they let you reduce the power limit to 50 % and less? They should have asked AMD because they will tell you it destroys cards. ^_^

      • Ferrum Master
        Phoronix Member
        • Feb 2024
        • 114

        #4
        Originally posted by Anux View Post
        Oh they let you reduce the power limit to 50 % and less? They should have asked AMD because they will tell you it destroys cards. ^_^
        https://www.phoronix.com/news/AMDGPU-Lower-Power-Limit
        Are you one of those who does not understand how undervolting core voltage can kill the card?

        • Anux
          Senior Member
          • Nov 2021
          • 1960

          #5
          Originally posted by Ferrum Master View Post
          Are you one of those who does not understand how undervolting core voltage can kill the card?
          Not what this is about, but yes I am. Care to explain that magical thing that no one teaches you in university? (If you were trying to be sarcastic, a little smiley would help.)

          BTW, reducing TDP does not change voltages; it just prevents the use of the highest p-states in high-load scenarios.
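
          To illustrate the point: a power limit works like a toy governor that simply removes the top p-states from consideration, leaving each state's voltage table untouched. A minimal sketch with made-up p-state and capacitance values (nothing here comes from a real driver):

```python
# Toy model of TDP-limited p-state selection (illustrative values only).
# Dynamic power scales roughly with C * V^2 * f; lowering the power limit
# just removes the top p-states from consideration, voltages are untouched.

PSTATES = [  # (frequency MHz, voltage V) - made-up values
    (300, 0.65),
    (800, 0.75),
    (1400, 0.90),
    (2000, 1.05),
    (2400, 1.15),
]
CAPACITANCE = 2.0e-8  # fudge constant folding in capacitance and activity factor

def power_watts(freq_mhz, volts):
    """Approximate dynamic power: P ~ C * V^2 * f."""
    return CAPACITANCE * volts**2 * (freq_mhz * 1e6)

def highest_allowed_pstate(power_limit_w):
    """Return the fastest p-state whose power fits under the limit,
    falling back to the lowest p-state if none fits."""
    allowed = [p for p in PSTATES if power_watts(*p) <= power_limit_w]
    return max(allowed) if allowed else PSTATES[0]
```

          In this model, dropping the limit from 60 W to 10 W just lands you on the 800 MHz state; no domain voltage is pushed outside its table.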

          • Quackdoc
            Senior Member
            • Oct 2020
            • 5095

            #6
            WHAT!? I didn't even know intel exposed these controls. Well, imma be having some good times

            EDIT: Changing the power limit doesn't actually seem to be working for me on i915 6.12.9 on Arch: the power limit doesn't change anything, and the fans can't be controlled. I have an ASRock Challenger A380, which is a 75 W card, but LACT is still reporting a hard limit at around 50 W with some bursts to 55 W. It is nice for monitoring, though.

            Does anyone know if we can hook MangoHud or something up to display the monitoring info?

            EDIT2: Ah, limiting clock speed and limiting voltage does work, but I still can't get my 75 W out of it T.T
            Last edited by Quackdoc; 15 January 2025, 11:03 AM.
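
            For what it's worth, the clock-speed cap that does work on i915 is exposed through sysfs; a minimal sketch in Python, assuming the Arc card is card0 (writing requires root, and these attributes only exist on i915):

```python
from pathlib import Path

DRM = Path("/sys/class/drm/card0")  # assumption: your Arc is card0

def read_mhz(name):
    """Read one of i915's gt_*_freq_mhz sysfs attributes."""
    return int((DRM / name).read_text())

def clamp_cap(requested, lo, hi):
    """Clamp a requested max-frequency cap into the [lo, hi] range
    the driver advertises (gt_RPn/gt_RP0 in i915 terms)."""
    return max(lo, min(hi, requested))

def set_max_freq(mhz):
    """Cap the GPU's max frequency (needs root)."""
    lo = read_mhz("gt_RPn_freq_mhz")   # hardware minimum
    hi = read_mhz("gt_RP0_freq_mhz")   # hardware maximum
    (DRM / "gt_max_freq_mhz").write_text(str(clamp_cap(mhz, lo, hi)))
```

            For monitoring, gt_cur_freq_mhz and gt_act_freq_mhz can be read the same way.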

            • Anux
              Senior Member
              • Nov 2021
              • 1960

              #7
              Originally posted by Quackdoc View Post
              WHAT!? I didn't even know intel exposed these controls. Well, imma be having some good times
              Do you have an Intel dGPU? I'd be interested in how low you can really go with TDP. Sadly, the current models with their driver problems and cut-down PCIe interface are useless for me, but maybe in the distant future they'll become an alternative; I want to avoid AMD GPUs if possible in the future.

              • Ferrum Master
                Phoronix Member
                • Feb 2024
                • 114

                #8
                Originally posted by Anux View Post
                Not what this is about, but yes I am. Care to explain that magical thing that no one teaches you in university? (If you were trying to be sarcastic, a little smiley would help.)

                BTW, reducing TDP does not change voltages; it just prevents the use of the highest p-states in high-load scenarios.
                I wasn't being sarcastic, I am amused. A reduced power state is directly tied to the voltage table in use; at least you got to learn Ohm's law in school. The thing you call magic is called Kirchhoff's law and is the most basic of basics.

                A GPU core consists of various domains driven by various voltages. The domains for the VRAM controller, PLL, cache, etc. are not fully tied to vcore and often run at fixed voltages, especially I/O. Due to the dielectric properties of a specific design, you cannot let the voltage difference between two domains exceed a certain threshold, for example 0.5 V, a rule also well known from the popular LGA1366 CPUs. Every piece of silicon has a datasheet that declares absolute maximums and rules. People in the lab have tested stupid things for us and probably burned enough silicon to find out that you cannot create an arbitrary voltage differential between two blocks; otherwise they leak, malfunction, and fail catastrophically.

                I kinda understand software guys not grasping how the SoC is made, but don't jump to false conclusions because of it...
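
                The cross-domain rule described above can be written down as a simple constraint check. The 0.5 V threshold and the domain voltages below are illustrative, not taken from any datasheet:

```python
# Illustrative check of a maximum allowed voltage differential between
# power domains, of the kind silicon datasheets specify (made-up values).

MAX_DIFFERENTIAL_V = 0.5  # e.g. the well-known LGA1366-era rule

def violations(domains, limit=MAX_DIFFERENTIAL_V):
    """Return pairs of domains whose voltage difference exceeds the limit."""
    names = sorted(domains)
    return [
        (a, b)
        for i, a in enumerate(names)
        for b in names[i + 1:]
        if abs(domains[a] - domains[b]) > limit
    ]
```

                Undervolting only vcore from 0.90 V to 0.70 V against a fixed 1.35 V I/O rail would push the differential from 0.45 V to 0.65 V, past this illustrative limit.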

                • Quackdoc
                  Senior Member
                  • Oct 2020
                  • 5095

                  #9
                  Originally posted by Anux View Post
                  Do you have an Intel dGPU? I'd be interested in how low you can really go with TDP. Sadly, the current models with their driver problems and cut-down PCIe interface are useless for me, but maybe in the distant future they'll become an alternative; I want to avoid AMD GPUs if possible in the future.
                  I was able to get it down to just 18 W while playing a game, using just the power usage limit slider. I don't currently have ASPM enabled, so I won't be able to test any lower than that, but 18 W seems reasonable when running 3D tasks. I will try enabling ASPM later and get back here.

                  • Anux
                    Senior Member
                    • Nov 2021
                    • 1960

                    #10
                    Originally posted by Quackdoc View Post
                    I was able to get it down to just 18 W while playing a game, using just the power usage limit slider
                    THX that's all I really needed to know.

                    Originally posted by Ferrum Master View Post
                    The thing you call magic is called Kirchhoff's law and is the most basic of basics.
                    That doesn't explain how integrated transistors get destroyed when you lower their voltage; it just helps you calculate the voltages and currents in an electrical network. A little tip for you: if you reduce the voltage, the currents reduce as well, and you don't need Kirchhoff to come to this simple conclusion.
                    And for real integrated circuits at multi-GHz operating frequencies, Kirchhoff is totally useless; you need to simulate the propagation of electric fields to really know which current flows at which point in time and space.
                    A GPU core consists of various domains driven by various voltages. The domains for the VRAM controller, PLL, cache, etc. are not fully tied to vcore and often run at fixed voltages, especially I/O. Due to the dielectric properties of a specific design, you cannot let the voltage difference between two domains exceed a certain threshold, for example 0.5 V, a rule also well known from the popular LGA1366 CPUs.
                    Now you are talking about raising the voltage? You first came in with the undervolting straw man, and in your second post you are already at overvolting? BTW, over- and undervolting/clocking are not a problem for AMD; they are happy to allow both. They only claim that TDP reduction will destroy your card.

                    I kinda understand software guys not grasping the idea how the SoC
                    We are far below the SoC level with this discussion.

                    Just another hint that might get you thinking about what TDP reduction actually does: if my GPU runs idle, it uses the lowest p-states (voltages and frequencies), resulting in really low power consumption. Why doesn't it get destroyed at idle?
                    And if I hit a temperature limit, the card automatically reduces the TDP, but that isn't a problem either?
                    With older generations, AMD had no problem letting you dial in 50% TDP, and every other GPU manufacturer lets you do so too. But with RDNA3 it somehow destroys your card? Sure!

                    And BTW, AMD themselves sell CPUs with integrated GPUs whose same die ships at different TDP levels. My 5700G's die is available from 15 W to 65 W, and in the BIOS of the 65 W edition they let you decide if you want to use 45 W or 35 W instead. Why do their APUs not explode in laptops with a 15 W TDP?
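
                    The temperature-limit behaviour mentioned above amounts to a simple control loop: drop the power limit while hot, restore it while cool. A sketch with illustrative thresholds, not any vendor's actual algorithm:

```python
def throttle_step(temp_c, limit_w, *, temp_limit_c=90,
                  floor_w=15, cap_w=65, step_w=5):
    """One control-loop step for a thermally managed power limit:
    reduce the limit while over the temperature limit, restore it
    (up to the nominal TDP) once comfortably below."""
    if temp_c > temp_limit_c:
        return max(floor_w, limit_w - step_w)   # throttle down
    if temp_c < temp_limit_c - 10:
        return min(cap_w, limit_w + step_w)     # recover toward nominal TDP
    return limit_w                              # hold in the hysteresis band
```

                    The card lowering its own effective TDP like this every frame is routine operation, not something that damages silicon.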
