A Kernel Maintainer's Prediction On The CPU Architecture Landscape For 2030


  • #11
    Originally posted by skeevy420 View Post
    One thing I don't think was considered is all the lost trust in x86
    For most customers, no trust was lost.
    Originally posted by skeevy420 View Post
    for desktop users the only thing x86 has going for it anymore is that it plays games better than the rest. But thanks to Intel, x86 isn't much of a trusted platform these days so POWER has the potential to pull in x86 users that don't want to go to ARM.
    Well, only in some alternative reality where you can play x86 binary games on POWER or ARM better than on x86.



    • #12
      Originally posted by Weasel View Post
      He's either a clown, or trolling.
      I would, honestly and objectively, like you to expand on what you perceive to be the technical reasons for your statement, and definitely not for the purpose of starting an argument.

      For a long time I have been involved in the word-length growth of processors at a more technical level than most, from Intel's 4004/4040 and the Japanese 4-bit CPUs to where we are now.

      Early on there was always an upward pressure so that, aside from the need to address more memory, more instructions could be accommodated in one word; an 8-bit field for the Instruction Register allows for 2**8, or 256, basic instructions (let's forget the pedantic, artificial definition of 'word'). That 'problem' was solved completely by the 32-bit word, which can contain not only the op-code for somewhat more than 256 instructions (how many basic instructions does one really need? How much memory do you HAVE to have?), but also any references to the machine's registers or memory locations which might be dictated or implied by the instruction (a quick sanity check: 2**32 bytes = 4 Gigabytes, with 1 KB = 1024 B).
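
      A quick sketch of that arithmetic, restating the figures above in code (nothing here beyond the paragraph's own numbers):

          #include <stdio.h>

          int main(void) {
              /* An n-bit opcode field encodes 2**n distinct instructions. */
              unsigned opcode_bits = 8;
              printf("%u-bit opcode field: %llu instructions\n",
                     opcode_bits, 1ULL << opcode_bits);   /* 256 */

              /* A 32-bit address reaches 2**32 bytes = 4 GiB of memory. */
              printf("32-bit addresses cover %llu bytes\n", 1ULL << 32);
              return 0;
          }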

      The higher resolution and speed of conversion promised by a 128-bit word length in analog-digital-analog applications is not a valid reason either: almost all A/D and D/A applications I have ever encountered are inherently limited to 32 bits, and usually much less, by the physical constraints of the processes themselves.

      I would tend to lean more towards the fact that the pressure to move to 128-bit processors is due more to the 'mountain-climber's syndrome' than anything else: "...because it's there...". Or the 'hacker's syndrome': "...because I can...". But I would really like to get another well-thought-out opinion on this subject (actually, a lot of them).



      • #13
        Originally posted by ALRBP View Post
        That said, silicon computers will definitely, and I believe quickly, hit their own physical limitations. With their components getting closer to the atomic scale, integrated circuits will stop becoming more compact with every generation. At that point there will still be room for optimization, but without a transition away from silicon to another base material (which will have its own limits), a transition that could lead to serious changes in the market, the growth of computing power will slow down and stop.
        You have plenty of room to grow in the third dimension.



        • #14
          Originally posted by Weasel View Post
          He's either a clown, or trolling.
          Sure, 640 kB ought to be enough for anybody.



          • #15
            Originally posted by pal666 View Post
            You have plenty of room to grow in the third dimension.
            Yes, but this implies increased energy and materials cost, unlike miniaturization. Also, and this is probably the main issue, more thermal dissipation and all the constraints that come with it. A high-end CPU already dissipates about a quarter of a small electric heater's power (>200 W); a few years ago I lived in a small (well-insulated) apartment where I shut down the main room's heater because my computer (AMD FX CPU) heated the room enough.
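
            A back-of-the-envelope sketch of the problem (all numbers here are hypothetical, purely illustrative): the heat grows with the number of stacked layers, while the footprint it must escape through stays the same.

                #include <stdio.h>

                int main(void) {
                    /* Hypothetical figures: stacking layers multiplies the
                     * power in a fixed footprint, but the area available to
                     * dissipate that heat does not grow with it. */
                    const double footprint_cm2   = 2.0;   /* assumed die footprint */
                    const double watts_per_layer = 50.0;  /* assumed per-layer power */

                    for (int layers = 1; layers <= 8; layers *= 2) {
                        double watts   = layers * watts_per_layer;
                        double density = watts / footprint_cm2;
                        printf("%d layer(s): %5.0f W total, %5.0f W/cm^2\n",
                               layers, watts, density);
                    }
                    return 0;
                }
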
            Last edited by ALRBP; 31 August 2020, 11:16 AM.



            • #16
              Originally posted by skeevy420 View Post
              Since modern x86 CPUs are said to be RISC-like at their core, with additional features and whatnot added on, I wonder why AMD or Intel don't move to RISC-V or (Open)POWER and then figure out how to add all the x86 stuff on top, preferably in a modular, dual-CPU-like way, so we can remove the hardware security hole if or when we don't need it. Imagine: instead of having to buy entire new systems every couple of years, we could get by with buying a newer instruction-set module.

              How much of your 2010-2014 hardware is still perfectly viable outside of not having AVX8675309? I feel like a plug-in interface for CPU instructions could reduce a lot of computing waste and, if they go that route, an open CPU platform is the way to go to prevent Intel SpecEx shenanigans that allow hackers to dump manure trucks behind wind-tunnel fans. Do we really want that much shit to hit the fan again?
              Essentially all of the security holes identified so far are related to the execution part of the CPU (i.e. the already-RISC part) and not to the x86 decode part, AFAIK, so stripping off the x86 decode front end would not do much for security.
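
              For illustration, here is the shape of the classic Spectre v1 'bounds check bypass' gadget (the names follow the example in the original Spectre paper, not any real codebase). The leak is produced by the speculative execution engine and the cache, well downstream of instruction decode, so it would look the same behind a RISC-V or POWER front end:

                  #include <stdint.h>
                  #include <stddef.h>

                  uint8_t array1[16];
                  size_t  array1_size = 16;
                  uint8_t array2[256 * 512];
                  uint8_t tmp;

                  void victim_function(size_t x) {
                      if (x < array1_size) {
                          /* If the branch is mispredicted as taken, this reads
                           * array1 out of bounds speculatively, then encodes the
                           * stolen byte into the cache via the array2 access.
                           * All of it happens in the execution engine, after
                           * instruction decode is long finished. */
                          tmp &= array2[array1[x] * 512];
                      }
                  }

                  int main(void) {
                      victim_function(0);  /* in-bounds call; gadget shape only */
                      return 0;
                  }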



              • #17
                Originally posted by ALRBP View Post

                The computer industry stands apart when it comes to the "bio-physical constraints we're now hitting". Quarantine does not prevent anyone from consuming virtual things (quite the opposite), and growth in computing power is driven by size reduction, not by increased energy consumption. That said, silicon computers will definitely, and I believe quickly, hit their own physical limitations. With their components getting closer to the atomic scale, integrated circuits will stop becoming more compact with every generation. At that point there will still be room for optimization, but without a transition away from silicon to another base material (which will have its own limits), a transition that could lead to serious changes in the market, the growth of computing power will slow down and stop.
                It's interesting that you interpreted that in terms of social and technological material constraints, i.e. pandemic lockdown and the end of Moore's Law, which are definitely real and potentially significant issues, and which do tie into what I meant. My point, though, was about Biophysical Economics, "Limits to Growth", and ecological collapse. The computer industry will have to adapt or die, as we all will; this is only drawn into tighter focus by the COVID-19 pandemic and particularly its effect on the energy industry. As you correctly state, quarantine doesn't directly prevent consumer spending, and has been a boon for online commerce, but that's not sustainable and isn't a model we can adopt to solve our many predicaments.



                • #18
                  Originally posted by pal666 View Post
                  You have a vivid imagination. x86 has existed since the late seventies; there has been plenty of time for patents to expire.
                  I think c117152 may have meant x86_64 specifically.



                  • #19
                    Originally posted by s_j_newbury View Post

                    It's interesting that you interpreted that in terms of social and technological material constraints, i.e. pandemic lockdown and the end of Moore's Law, which are definitely real and potentially significant issues, and which do tie into what I meant. My point, though, was about Biophysical Economics, "Limits to Growth", and ecological collapse. The computer industry will have to adapt or die, as we all will; this is only drawn into tighter focus by the COVID-19 pandemic and particularly its effect on the energy industry. As you correctly state, quarantine doesn't directly prevent consumer spending, and has been a boon for online commerce, but that's not sustainable and isn't a model we can adopt to solve our many predicaments.
                    I understand your point, but I don't think the computer industry will have that much difficulty with "Limits to Growth" and ecological collapse. Silicon is abundant and computing's energy consumption is comparatively modest. As for the pandemic, sectors like air transport will be much more affected by ecological issues than computing. The crisis of some sectors could even mean more profit for the computing industry, given its ability to provide worldwide communication with low energy consumption (compared to physical transport).

                    Now, I am absolutely not confident in any prediction on that subject, including mine. I think that the range of possible futures for humanity, even only a few decades ahead, runs from nuclear fusion removing the energy constraint and education eliminating authoritarian regimes, to a general resource drought and a deadly nuclear war started by some populist dictator.



                    • #20
                      Originally posted by programmerjake View Post

                      I think c117152 may have meant x86_64 specifically.
                      Yup. The SSE stuff especially is only a few months away from expiring.
