A Kernel Maintainer's Prediction On The CPU Architecture Landscape For 2030


  • sebastianlacuesta
    replied
    Originally posted by Weasel View Post
    He's either a clown, or trolling.
    Sure, 640 kB ought to be enough for anybody.



  • pal666
    replied
    Originally posted by ALRBP View Post
    That said, silicon computers will definitely, and I believe quickly, hit their own physical limits. As their feature sizes approach the atomic scale, integrated-circuit components will stop becoming more compact with each generation. At that point there will still be room for optimization, but without a transition to a base material other than silicon (which will have its own limits), a transition that could seriously reshape the market, the growth of computing power will slow and eventually stop.
    You still have plenty of room to grow in the third dimension.



  • danmcgrew
    replied
    Originally posted by Weasel View Post
    He's either a clown, or trolling.
    I would, honestly and objectively, like you to expand on what you perceive the technical reasons for your statement to be, and definitely not for the purpose of starting an argument.

    For a long time, I have been involved in the 'word-length growth' of processors at a more technical level than most, starting with Intel's 4004/4040 and the Japanese 4-bit CPUs, up to where we are now.

    There was always an upward pressure, early on, so that, aside from the need to address more memory, more instructions could be accommodated by one word; an 8-bit field for the Instruction Register allows for 2**8, or 256, basic instructions (let's set aside the pedantic, artificial definition of 'word'). That 'problem' was solved completely once a 32-bit word could contain not only the opcode for somewhat more than 256 instructions (how many basic instructions does one really need? How much memory do you HAVE to have?), but also any references to the machine's registers or memory locations (a quick sanity check: 2**32 bytes = 4 GB, with 1 KB = 1024 B) which might be dictated or implied by the instruction.

    The higher resolution and conversion speed promised by a 128-bit word length in analog-digital-analog applications is not a valid reason either: almost all A/D and D/A applications I have ever encountered are inherently limited to 32 bits, and usually much less, by the physical constraints of the processes themselves.

    I would tend to lean towards the view that the pressure to move to 128-bit processors owes more to the 'mountain-climber's syndrome' than anything else: "...because it's there...". Or the 'hacker's syndrome': "...because I can...". But I would really like to get another well-thought-out opinion (actually, a lot of them) on this subject.
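    The back-of-the-envelope arithmetic in the post above can be checked with a short Python sketch. This models nothing about any real ISA; the function names are invented for illustration, and the numbers are just the 2**n counts the post cites.

    ```python
    # Sketch of the arithmetic above: how many distinct opcodes fit in a
    # fixed-width instruction field, and how much memory a flat byte-addressed
    # address space can reach. Pure arithmetic; no real ISA is modeled.

    def opcode_count(field_bits: int) -> int:
        """Distinct opcodes encodable in an instruction field of `field_bits` bits."""
        return 2 ** field_bits

    def addressable_bytes(address_bits: int) -> int:
        """Bytes reachable with a flat, byte-addressed address of `address_bits` bits."""
        return 2 ** address_bits

    print(opcode_count(8))                  # 256 basic instructions in an 8-bit field
    print(addressable_bytes(32) // 2**30)   # 4 (GB reachable with 32-bit addresses)
    ```

    This is the whole of the post's 'sanity check': an 8-bit instruction field gives 256 encodings, and a 32-bit address reaches 4 GB.
    
    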



  • pal666
    replied
    Originally posted by skeevy420 View Post
    One thing I don't think was considered is all the lost trust in x86
    For most customers, no trust was lost.
    Originally posted by skeevy420 View Post
    for desktop users the only thing x86 has going for it anymore is that it plays games better than the rest. But thanks to Intel, x86 isn't much of a trusted platform these days so POWER has the potential to pull in x86 users that don't want to go to ARM.
    Well, only in some alternative reality where you can play x86 binary games on POWER or ARM better than on x86.



  • pal666
    replied
    Originally posted by c117152 View Post
    as the x86 patents dry out there will be new x86 players
    You have a vivid imagination. x86 has existed since the late seventies; there has been plenty of time for patents to expire.



  • carewolf
    replied
    Originally posted by Weasel View Post
    He's either a clown, or trolling.
    I assume he has no idea what he is talking about. If you talk about register width and bits processed per operation, we already have 512-bit processors; if you talk about addressing, there is no point. He could mean 128-bit floating point, which is starting to appear in hardware, but by that logic x86 (with its FPU) was an 80-bit processor.
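    The "no point in wider addressing" claim is easy to quantify with a small Python sketch. The figures below are standard (4 GiB at 32 bits, 256 TiB at the 48 bits x86-64 commonly exposes for virtual addresses, 16 EiB at 64 bits); the function name is invented for illustration.

    ```python
    # Rough sizes of flat byte-addressed address spaces, to show why 128-bit
    # addressing buys nothing practical: 64 bits already covers 16 EiB.

    def address_space(bits: int) -> int:
        """Bytes reachable with a flat `bits`-bit byte address."""
        return 2 ** bits

    GiB = 2 ** 30
    TiB = 2 ** 40
    EiB = 2 ** 60

    print(address_space(32) // GiB)   # 4: the classic 32-bit GiB ceiling
    print(address_space(48) // TiB)   # 256: TiB of typical x86-64 virtual reach
    print(address_space(64) // EiB)   # 16: EiB in the full 64-bit space
    # 2**128 bytes is about 3.4e38 -- astronomically beyond any buildable memory.
    ```

    In other words, even the 48-bit subset of today's 64-bit address space dwarfs installed RAM, which is why register width (SIMD, FP precision) rather than addressing is where the "bigger numbers" actually show up.
    
    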



  • ALRBP
    replied
    Originally posted by s_j_newbury View Post
    I expect we'll be in an economic depression for the next decade at least, until a new economic system emerges which will be adapted to operate within the context of the converging bio-physical constraints we're now hitting.

    To my mind, this will mean a lot of repurposing of existing tech with much less focus of new hardware on consumer products due to unaffordability, while high end will look to the best value performance options which may provide an opportunity for RISC-V and POWER with the "consumer subsidy" removed.
    The computer industry stands apart when it comes to the "bio-physical constraints we're now hitting". Quarantine does not prevent people from consuming virtual goods (quite the opposite), and growth in computing power is driven by size reduction, not by increased energy consumption. That said, silicon computers will definitely, and I believe quickly, hit their own physical limits. As their feature sizes approach the atomic scale, integrated-circuit components will stop becoming more compact with each generation. At that point there will still be room for optimization, but without a transition to a base material other than silicon (which will have its own limits), a transition that could seriously reshape the market, the growth of computing power will slow and eventually stop.



  • Hi-Angel
    replied
    Hmm, no quantum computing mentioned?



  • milkylainen
    replied
    I don't think that was much of a sharp-minded prediction at all.
    He did not touch on the more interesting subjects.
    Oh well...



  • skeevy420
    replied
    One thing I don't think was considered is all the lost trust in x86, because, let's face it, for desktop users the only thing x86 has going for it anymore is that it plays games better than the rest. But thanks to Intel, x86 isn't much of a trusted platform these days, so POWER has the potential to pull in x86 users who don't want to go to ARM. It'll be interesting to see what happens once Wine & Hangover become "gamer ready" on ARM and POWER, because those are the two architectures most primed to take x86's desktop/workstation spot.

    For servers, the architecture question is largely moot, since we're talking about Linux, which is, for the most part, architecture-agnostic and runs the same everywhere. They'll pick an architecture based on being good at either low-power or high-speed computing, depending on their needs... which makes me wonder:

    Since modern x86 CPUs are said to be RISC-like at their core, with additional features added on top, I wonder why AMD or Intel don't move to RISC-V or (Open)POWER and then figure out how to add all the x86 stuff back on, preferably in a modular, dual-CPU-like way, so we can remove the hardware security hole if or when we don't need it. Imagine: instead of having to buy entire new systems every couple of years, we could get by with buying a newer instruction-set module.

    How much of your 2010-2014 hardware is still perfectly viable outside of not having AVX8675309? I feel like a plug-in interface for CPU instructions could reduce a lot of computing waste, and if they go that route, an open CPU platform is the way to go to prevent Intel SpecEx shenanigans that allow hackers to dump manure trucks behind wind-tunnel fans. Do we really want that much shit to hit the fan again?

