Linux 5.9 Adding New Knob To Control Default Boost Value For Real-Time Workloads


  • #11
    Originally posted by AmericanLocomotive View Post
    I mean a PS4 is also using a ~10-year-old CPU core on 28nm. It's using a not-insignificant amount of available CPU resources just to stream and decode 1080p video.
    Wait, a console, the "pinnacle of optimization" and also an embedded device, isn't using hardware acceleration for media playback? WHAT? I refuse to believe it.

    It has the same VCE/UVD media encode/decode blocks as the card it was based on, the hardware is perfectly capable of doing all it needs, and it's a locked-down system, so there should be no bullshit because of DRM reasons.
    Last edited by starshipeleven; 03 August 2020, 11:50 AM.

    Comment


    • #12
      Originally posted by starshipeleven View Post
      Why add a whole new CPU with all that complexity (or even a big.LITTLE x86 setup) when you can just shut down cores?
      I really doubt a couple of cores of a PS4 would consume more than a whole ARM SoC, and that's plenty of CPU to just run the GUI and feed the media stream to the hardware accelerators.

      Linux has been able to disable CPU cores for ages https://unix.stackexchange.com/quest...essor-on-linux and some smartphone CPU schedulers (like "hotplug") will also shut down cores dynamically depending on demand.

      Now can FreeBSD do this? Checkmate, BSD fans.
      Simplest answer is that this seems to be the way things are trending -- ever since they hit that brick wall slightly above 5GHz in raw frequency, the solution has basically been to add more cores. The problem is that adding more cores isn't the best answer for a lot of things, since some code can only be multi-threaded so much, and the stuff that parallelizes really well already gets run on GPUs and whatnot.
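      To put a rough number on the "can only be multi-threaded so much" part: Amdahl's law caps the speedup you get from n cores by the serial fraction of the code. A back-of-the-envelope sketch in C++ (the 80% parallel fraction is just a made-up example):

          #include <cstdio>

          int main() {
              // Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
              // where p is the fraction of the work that actually parallelizes.
              const double p = 0.80;  // assumed 80% parallel fraction, purely illustrative
              for (int n : {1, 2, 4, 8, 16, 64}) {
                  double speedup = 1.0 / ((1.0 - p) + p / n);
                  std::printf("%2d cores -> %.2fx speedup\n", n, speedup);
              }
              return 0;
          }

      With those numbers you never get past about 4.7x no matter how many cores you throw at it, which is why "just add more cores" runs out of steam so fast.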

      So what can they do?

      One possible option is that they mix architectures. What is ARM's specialty? Not using a lot of power. For example, my new phone can last 4 days on a single charge. That makes ARM a great choice to run an OS and that part of the software. Android is a great test case showing that 4-8 ARM cores and 4GB of RAM are plenty of resources to run a Linux-based desktop environment. For a console or a home PC, such a setup would mean being able to do a shitload of tasks without needing to access the power-sucking, money-costing, heat-generating, premium-performance x86_64 cores.

      It's not like the basic frameworks aren't in place with hybrid GPUs, multi-CPU NUMA systems, different big.LITTLE setups, and CPU schedulers already.
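      You can already see most of that from plain userspace, too -- on a big.LITTLE SoC the clusters show up in the standard Linux cpufreq sysfs entries with different max frequencies. A rough sketch in C++ (assumes a Linux box with cpufreq; which CPUs form which cluster is whatever your SoC reports):

          #include <fstream>
          #include <iostream>
          #include <string>

          int main() {
              // Walk /sys/devices/system/cpu/cpuN/cpufreq/cpuinfo_max_freq; on a
              // big.LITTLE chip the two clusters typically report different values.
              for (int cpu = 0; ; ++cpu) {
                  std::string path = "/sys/devices/system/cpu/cpu" + std::to_string(cpu) +
                                     "/cpufreq/cpuinfo_max_freq";
                  std::ifstream f(path);
                  if (!f)
                      break;  // no such CPU (or no cpufreq support here), stop
                  long khz = 0;
                  f >> khz;
                  std::cout << "cpu" << cpu << ": max " << khz / 1000 << " MHz\n";
              }
              return 0;
          }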

      But think about it: outside of games, 3D modeling software, certain emulators, and other specialized programs, ARM is plenty fast and capable for most of us, especially in optimized environments like iPhones and some Androids. So it isn't like we need all the extra extensions and features x86_64 brings for a lot of the OS.

      Comment


      • #13
        Originally posted by AmericanLocomotive View Post
        I mean a PS4 is also using a ~10-year-old CPU core on 28nm. It's using a not-insignificant amount of available CPU resources just to stream and decode 1080p video. My modern Ryzen 3900X isn't even slightly fazed by software 1080p decode.

        Most ARM devices also aren't even using the ARM core(s) to do any streaming/decode. Those SoCs almost always have a dedicated hardware video decode block that does most of the heavy lifting. The PS4 also has hardware video decode, but it may not be compatible with Netflix's codec (so they fall back to a software mode), or the decoder may just not be very power efficient.
        All I know is that I can hear the fans kick on when the PS4 is doing what should be pretty basic tasks -- things you'd normally assume use hardware decoding, like most audio and video playback on modern hardware. Since my 10-year-old, 28nm PC with a newer GPU doesn't even need its GPU fans to kick on during Netflix, I don't see why the PS4 should... unless I need to take my Dad's apart because it's full of dust and crud... that's always a possibility... but that PS4 has always been a little fan happy.

        Comment


        • #14
          Originally posted by skeevy420 View Post
          One possible option is that they mix architectures. What is ARM's specialty? Not using a lot of power. For example, my new phone can last 4 days on a single charge.
          And it does so by disabling cores, both during runtime and by going into deep sleep (i.e. all cores + other stuff shut down) when you turn off the screen, waking up on a timer to check stuff and then going back to sleep. Plus aggressively offloading everything to hardware acceleration, be it media or display rendering.
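          For reference, the runtime part is just the kernel's CPU hotplug knob in sysfs. A minimal sketch in C++ (needs root, and cpu0 usually can't be taken offline):

              #include <fstream>
              #include <iostream>
              #include <string>

              // Toggle a core through /sys/devices/system/cpu/cpuN/online -- the same
              // interface "hotplug"-style schedulers poke when demand drops.
              static bool set_cpu_online(int cpu, bool online) {
                  std::ofstream f("/sys/devices/system/cpu/cpu" + std::to_string(cpu) + "/online");
                  if (!f)
                      return false;
                  f << (online ? "1" : "0");
                  return static_cast<bool>(f.flush());
              }

              int main() {
                  if (set_cpu_online(3, false))
                      std::cout << "cpu3 is now offline (check lscpu)\n";
                  else
                      std::cerr << "couldn't offline cpu3 -- not root, or no hotplug support?\n";
                  return 0;
              }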

          Also, microcontrollers in IoT applications where battery life matters do the same: even an Arduino microcontroller can go into "sleep mode" and be woken up by IO or by its onboard timer, and if you do that it can last nearly a year on a coin cell (the same cell used for the RTC clock in a PC).
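          On the Arduino side that's just the AVR sleep modes plus the watchdog timer in interrupt mode. A minimal sketch for a classic ATmega328P board, assuming an ~8 s wake-up period and whatever "do a bit of work" you need in the loop:

              #include <avr/sleep.h>
              #include <avr/wdt.h>
              #include <avr/interrupt.h>

              ISR(WDT_vect) {
                  // Watchdog fired in interrupt mode: its only job is to wake the MCU.
              }

              void setup() {
                  cli();
                  MCUSR &= ~_BV(WDRF);                         // clear the watchdog reset flag
                  WDTCSR |= _BV(WDCE) | _BV(WDE);              // unlock the watchdog config bits
                  WDTCSR = _BV(WDIE) | _BV(WDP3) | _BV(WDP0);  // interrupt mode, ~8 s period
                  sei();
              }

              void loop() {
                  // ...take a sensor reading / send a radio burst here...
                  set_sleep_mode(SLEEP_MODE_PWR_DOWN);  // deepest AVR sleep mode
                  sleep_enable();
                  sleep_cpu();                          // sleeps here until the watchdog fires
                  sleep_disable();
              }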

          For a console or a home PC, such a setup would mean being able to do a shitload of tasks without needing to access the power-sucking, money-costing, heat-generating, premium-performance x86_64 cores.
          Premium x86 performance in a console? PS4 is using a crappy laptop APU with a bigger GPU, don't make me (laugh in Intel).

          A couple of its cores are perfectly fine and aren't going to use significantly more power than a couple of ARM cores; the PS4 isn't battery powered, so you don't need to care about power that much.

          As an example for you, disbeliever: I have a couple of HP T610 thin clients with a dual-core AMD APU of the same generation as the PS4 https://support.hp.com/us-en/document/c03235347 I can use them to watch 1080p media (on Windows anyway), and they are completely fanless. And they support 4K displays. And they can actually use up to 16 GB of RAM regardless of what HP's spec sheet says. And they have normal SATA connectors inside so you can just slap in a normal SATA SSD. And they have a normal UEFI firmware, so it's 100% a PC. Works fine with Windows 10. They should be cheap on eBay and such, since a lot of corporations are dumping them at the end of their lifecycle.

          Comment


          • #15
            Originally posted by skeevy420 View Post

            All I know is that I can hear the fans kick on when the PS4 is doing what should be pretty basic tasks -- things you'd normally assume use hardware decoding, like most audio and video playback on modern hardware. Since my 10-year-old, 28nm PC with a newer GPU doesn't even need its GPU fans to kick on during Netflix, I don't see why the PS4 should... unless I need to take my Dad's apart because it's full of dust and crud... that's always a possibility... but that PS4 has always been a little fan happy.
            I have limited experience with consoles, but if fans spin too much it's always a good idea to check if the heatsink is clogged. Laptops can manage to get full of fine wool-like stuff and I guess a console can too.

            That said, I wouldn't be surprised if a console is cutting costs by not optimizing power management.
            Last edited by starshipeleven; 04 August 2020, 10:14 AM.

            Comment


            • #16
              Originally posted by starshipeleven View Post
              I have limited experience with consoles, but if fans spin too much it's always a good idea to check if the heatsink is clogged. Laptops can manage to get full of fine wool-like stuff and I guess a console can too.

              That said, I wouldn't be surprised if a console is cutting costs by not optimizing power management.
              It's not mine, it's my Dad's. Technically it's mine... I bought it, had it for two months, and then realized I had only played around 45 minutes of Witcher 3 in that time, so I gave it to him. He uses the damn thing every day. That was like the first or second year after the PS4 was released. It would be a real dick move to take it back now.

              I wouldn't be too surprised about cost cutting either. It's only a console plugged into the wall; it's not like they need to think that hard about tuning it.

              Comment


              • #17
                Originally posted by starshipeleven View Post
                And it does so by disabling cores, both during runtime and by going into deep sleep (i.e. all cores + other stuff shut down) when you turn off the screen, waking up on a timer to check stuff and then going back to sleep. Plus aggressively offloading everything to hardware acceleration, be it media or display rendering.

                Also, microcontrollers in IoT applications where battery life matters do the same: even an Arduino microcontroller can go into "sleep mode" and be woken up by IO or by its onboard timer, and if you do that it can last nearly a year on a coin cell (the same cell used for the RTC clock in a PC).
                I used to be so hardcore into custom Android ROMs, schedulers, kernels, and all that stuff. Just saying, I get why it does it, because I've really been into tuning them.

                Premium x86 performance in a console? PS4 is using a crappy laptop APU with a bigger GPU, don't make me (laugh in Intel).

                A couple of its cores are perfectly fine and aren't going to use significantly more power than a couple of ARM cores; the PS4 isn't battery powered, so you don't need to care about power that much.
                I was thinking more about the thermal shortcomings of the current consoles and ways to fix those problems in the next gen. A low-power, low-heat ARM setup for the OS and apps would help with that.

                I was also thinking along eco-hippie lines in regards to power usage. I know it's a wall-plugged console that doesn't necessarily need to care about using less power, but a dual-architecture setup could potentially save a lot of energy and resources, and a fixed-platform console would be a decent place to treat as an alpha/beta platform.

                As an example for you, disbeliever: I have a couple of HP T610 thin clients with a dual-core AMD APU of the same generation as the PS4 https://support.hp.com/us-en/document/c03235347 I can use them to watch 1080p media (on Windows anyway), and they are completely fanless. And they support 4K displays. And they can actually use up to 16 GB of RAM regardless of what HP's spec sheet says. And they have normal SATA connectors inside so you can just slap in a normal SATA SSD. And they have a normal UEFI firmware, so it's 100% a PC. Works fine with Windows 10. They should be cheap on eBay and such, since a lot of corporations are dumping them at the end of their lifecycle.
                Saw this new dual-core 64-bit AMD something-or-other in the news earlier today that only pulled 6 watts. Now I want to see what happens as AMD and Intel start ripping on ARM's big.LITTLE setups. I also feel sorry for Michael, because it'll be hectic to accurately benchmark these things; maybe having to do three or four different benchmark runs per system... forcing to fast cores, forcing to slow cores, letting stuff do whatever....
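                The "forcing to fast cores / slow cores" part at least already exists today: it's plain CPU affinity, i.e. taskset on the command line or sched_setaffinity() from code. A rough sketch in C++ (Linux/glibc; which cores are the slow cluster depends entirely on the chip, so cores 0-3 here is just an assumption):

                    #include <sched.h>
                    #include <cstdio>

                    int main() {
                        // Pin this process (pid 0 = self) to cores 0-3 before launching the
                        // benchmark, same idea as `taskset -c 0-3`.
                        cpu_set_t set;
                        CPU_ZERO(&set);
                        for (int cpu = 0; cpu < 4; ++cpu)
                            CPU_SET(cpu, &set);
                        if (sched_setaffinity(0, sizeof(set), &set) != 0) {
                            std::perror("sched_setaffinity");
                            return 1;
                        }
                        std::printf("pinned to cpus 0-3; run the benchmark from this process\n");
                        return 0;
                    }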

                Comment
