Arch Linux Based EndeavourOS Begins Providing ARM Builds


  • Jumbotron
    replied
    Originally posted by DanL View Post

    No, I said I had a hard time envisioning MS making ARM laptops and forcing current laptop makers to follow (I don't give a fsck about tablets/phones).



    Exactly.
    And yet the combined number of phones and tablets you don't give a fsck about outnumbers the desktops and laptops of the world by an order of magnitude. That's a lot of hardware to develop software for, all of it running ARM SoCs. It's only a matter of time before desktops and laptops begin to move over to ARM, if for nothing else than to consolidate development resources and code bases.

    Oh...look. Apple is ALREADY doing just that. As it always has been and will be forevermore, the rest of the computer industry headed up by Microsoft will follow suit by the end of the decade.

    If you are not already porting every single x86 app you have over to ARM, you're years behind the times. The industry has spoken. Power efficiency, silicon innovation, flexibility in hardware design paradigms, convergence and consolidation of code bases and frameworks: all these things and more are available to hardware and software manufacturers through the ARM ecosystem.
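    For what it's worth, porting usually starts with the build and install scripts learning to tell the two architectures apart. A minimal Python sketch (the machine strings below are common values reported by `platform.machine()` on Linux, macOS, and Windows, not an exhaustive list, and `describe_arch` is a made-up helper name):

```python
import platform

def describe_arch(machine: str) -> str:
    """Map a platform.machine() string to a coarse architecture family."""
    m = machine.lower()
    if m in ("arm64", "aarch64") or m.startswith("armv"):
        return "arm"
    if m in ("x86_64", "amd64", "i386", "i686", "x86"):
        return "x86"
    return "other"

# A build or install script can branch on the host architecture at runtime:
print(describe_arch(platform.machine()))
```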



  • skeevy420
    replied
    Originally posted by DanL View Post

    Where? (And don't link me to a tablet with a detachable, sold-separately keyboard. I'm talking about real laptops here..)
    They have that, plus a couple of weird-ass dual-screen things where one of the screens functions as a keyboard. One is ARM and the other, I think, is Intel-based. FWIW, most "I can has office and emailz" and dumber users will be happier on the tablet with the keyboard dock system.

    From what I can tell, their traditional laptops use a combination of AMD, Intel, and Nvidia products. Nothing ARM outside of the above.



  • DanL
    replied
    Originally posted by skeevy420 View Post
    Microsoft IS making ARM laptops
    Where? (And don't link me to a tablet with a detachable, sold-separately keyboard. I'm talking about real laptops here..)



  • skeevy420
    replied
    Originally posted by DanL View Post

    No, I said I had a hard time envisioning MS making ARM laptops and forcing current laptop makers to follow (I don't give a fsck about tablets/phones).
    Microsoft IS making ARM laptops and other makers will follow.

    Exactly.
    Treating anecdotal evidence as fact, like with my position on ARM Windows, is how fake news spreads and how people like Trump become President.



  • leipero
    replied
    Jumbotron I don't think there is a need to go there... You ofc. did mention RISC-V, directly as well as indirectly with the "European Processor Initiative", hence the reason I do not quote your post: I'm referring to everything you've said in this thread in context, aside from "saving space" and making it more readable, ofc.

    x86 might die, but that's really beyond the scope of this conversation; we are speaking here about CISC vs RISC, and no one really argues against RISC being better (more efficient) for specialized workloads. My point is that RISC is basically useless for general purpose computing, and will stay like that forever, because of the very definition of the concept. Again, unless the definition itself changes (but then, obviously, we are not talking about the same concepts).

    As far as I understand, the type of logic used in CPU design plays no role in transistor density. If you manage to cram transistors in at 1nm without leakage, you are done; the logic implemented by those transistors is completely irrelevant. The only issue you may face is heat transfer, but again, the logic used is irrelevant, transistor density is what's relevant, and whether RISC or CISC, with the same number of transistors and the same frequency you will have the exact same issue.

    I might be uninformed, and probably am, but one thing is for sure: I'm not backward looking. When I see a pattern that has already failed, and the same conditions for it to fail again (when speaking about general purpose computing), it's kinda simple to conclude what will happen in the future. You don't really need much more information beyond that; it's the core principle you are looking at, not the "surface level" semi-relevant stuff.
    Last edited by leipero; 25 September 2020, 08:01 AM.



  • DanL
    replied
    Originally posted by skeevy420 View Post
    You said you had a hard time seeing MS follow Apple and making ARM based devices so I replied with a link to a MS based ARM device with a purpose-built SOC & CPU like Apple is doing. That is literally the opposite of offering evidence that supports your assertion.
    No, I said I had a hard time envisioning MS making ARM laptops and forcing current laptop makers to follow (I don't give a fsck about tablets/phones).

    I suppose my anecdotal evidence of "they're not popular" supports your position
    Exactly.



  • Jumbotron
    replied
    Originally posted by leipero View Post
    Jumbotron
    Good luck with that.
    Also, it seems like you are suggesting that RISC-V at some point might become CISC and take over the x86 proprietary arch. Well sure, something, some day, will take over x86, but it will still be CISC, not RISC. Not anytime soon tho.

    I'm not sure why you are attempting to convince others (me) that you are right. You believe so; I do not, and never will, because I'm convinced that CISC is the way and RISC is way too limited. You might be right only under the condition that the actual definitions of both concepts change, IMO, and tomorrow's RISC becomes present-day CISC.

    If, after all that, what you took from me is that I thought RISC-V might become CISC, when I made no mention of RISC-V, it just shows your reading comprehension is quite poor. Not to mention your comprehension of the ARM / RISC revolution going on right before your very eyes.

    x86 is dying. It's dying because of its CISC nature, regardless of how many RISC extensions they try to bung onto its carcass. You can NEVER, EVER take out enough of the CISC in an x86 chip to have a power efficient AND powerful CPU. Period. Full stop. Intel tried with the Atom and it was an ABYSMAL failure. So much so that Intel got out of the mobile and tablet market with their tail dragging behind them.

    Couple this with the end of Moore's Law, as we butt right up against physics and quantum effects, and the inherent complexity of CISC-based x86 chips means they will never get to 3nm without gutting what makes them x86... namely CISC. The transistor / size budget of the die is too hard to meet. Hence chiplets, with their corresponding memory latency issues.

    RISC chips, on the other hand... no problem. Apple Silicon SoCs are shipping right this moment in iPads, and later this year in iPhones: 5nm SoCs with 2 big cores and 4 little cores all running simultaneously at over 2.5 GHz. Along with that there is 128 KB of L1 data cache AND 128 KB of L1 instruction cache AND 8 MB of L2 cache. A 4-core GPU AND a 16-core dedicated Neural Processor that can do 11 trillion ops/sec. All of this and more in a 5nm package. Shipping now.

    Poor little Intel can't even bung a Neural Processor onto their Xeons, and instead have to settle for handing out matrix math extensions to the AVX vector registers in their Xeons sometime in 2021, if they are lucky, and lately they can't even get out of the 10nm starting gate. Even AMD has gotten to 7nm... albeit TSMC's 7nm is more like Intel's 10nm+, or at this point should we be calling Intel's process 10nm++++++++++++++++++++++++++++++ LOL !!!

    Sorry my uninformed and backward looking friend. x86 is dying. It's legacy. Apple knows this. Microsoft knows this. Google knows this. By the end of this decade you may finally know this too.



  • leipero
    replied
    Jumbotron
    Good luck with that.
    Also, it seems like you are suggesting that RISC-V at some point might become CISC and take over the x86 proprietary arch. Well sure, something, some day, will take over x86, but it will still be CISC, not RISC. Not anytime soon tho.

    I'm not sure why you are attempting to convince others (me) that you are right. You believe so; I do not, and never will, because I'm convinced that CISC is the way and RISC is way too limited. You might be right only under the condition that the actual definitions of both concepts change, IMO, and tomorrow's RISC becomes present-day CISC.



  • Jumbotron
    replied
    Originally posted by leipero View Post

    You should work for ARM marketing team haha, nice story, not much truth in there tho.

    RISC isn't, and never will be, the future; it has existed as long as CISC, if not longer (the first computers were actually RISC). RISC does have its uses in limited scenarios where compute power isn't needed, so "efficiency goes up" when you have fewer transistors and less compute power... like specialized hardware (mining ASICs that are basically useless for anything else) but a bit more flexible. In the same way, you have supercomputers that are "fastest" in a given scenario (or a few) while being completely useless for anything else.
    That by itself isn't a bad thing, and can be very useful and productive for specialized workloads and the companies dealing with those things, but claiming that RISC is the present, the future, or whatever, is nonsense. Well, it isn't nonsense for "smartphones" and specialized cases, but that's where it ends.

    Your statement, "The more the merrier" is exactly the reason why it failed from the start to have any use in computing aside from specialized cases, and it is the reason why it will remain so for, well, forever.

    You're right. RISC isn't the future. It is the present. Here's a reprint of what I just posted on the article about GCC wiring up support for ARM's Neoverse N2.

    Here is something QUITE INTERESTING !!

    It seems that the ARM Neoverse N2 chip will be the basis for the European Processor Initiative. The EPI is a Europe-wide consortium comprised of 27 vendor partners and 10 European countries which are determined to devise the Continent's common computing platform, ranging from Exascale HPC, A.I., Autonomous Autos, Servers and Cloud, Big Data, Space and Robotics.

    In other words....a Pan-European Data Integrity and Sovereign Compute platform. And x86 is NOWHERE to be seen. Nor could it be. That would be impossible.

    Furthermore, it looks like RISC-V chips will play a BIG role in the ARM-based European Common Compute Platform, as they look to serve as an "Accelerator Tile" in the chip package for data storage acceleration.

    According to a talk given by Linaro, and one of their slide sheets, ARM's Neoverse N2 will have ARM's latest ARMv8.6-A instruction additions, which among other things add General Matrix Multiply, BFloat16 support, additional SIMD matrix math instructions, and enhancements to virtualization.
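    To make the BFloat16 mention concrete: a bfloat16 value is essentially a float32 with the low 16 mantissa bits dropped, which is why adding hardware support for it is comparatively cheap. A minimal Python sketch of the conversion (plain truncation, for illustration only; the function names are made up, and real hardware typically rounds to nearest even):

```python
import struct

def float32_to_bfloat16_bits(x: float) -> int:
    # Pack as big-endian IEEE-754 float32, then keep only the top 16 bits
    # (sign + 8 exponent bits + 7 mantissa bits).
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    return bits >> 16

def bfloat16_bits_to_float32(b: int) -> float:
    # Widen back to float32 by zero-filling the dropped mantissa bits.
    return struct.unpack(">f", struct.pack(">I", b << 16))[0]

# BFloat16 keeps float32's full 8-bit exponent (same range) but only
# 7 mantissa bits, so values round coarsely:
print(bfloat16_bits_to_float32(float32_to_bfloat16_bits(3.14159)))  # 3.140625
```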

    According to the timeline set out by the EPI, it looks like the EPI's ARM-based CPU, code-named RHEA, will arrive next year, and the first EPI Exascale computer with RHEA chips in it will arrive by the end of 2021 or 2022.

    Here are the docs pertaining to this European Processor Initiative. First is the actual PDF of the EPI announcement and outline

    https://sos23.ornl.gov/wp-content/up...-Denis-EPI.pdf


    And secondly here is the PDF of slides used in the Linaro talk in March of 2020 on the state of ARM in HPC where they mention that the ARM Neoverse N2 "Zeus" CPU will be the basis for the European Processor Initiative.

    https://static.linaro.org/connect/lt...TD20-106-0.pdf



    Also.....

    And the ARM hits just keep on coming.

    " Strong hints have emerged that VMware is close to making the Arm version of its ESXi hypervisor a proper product. "

    https://www.theregister.com/2020/09/...w_esxi_on_arm/



  • Jumbotron
    replied
    Originally posted by leipero View Post
    Jumbotron You seem convinced that RISC is the "future", and I see more and more people thinking that way.
    I'm not convinced at all. While it's extremely useful for limited devices (such as smartphones, some types of servers, etc.), there are multiple issues with it for "PC" usage. One is that with limited instructions comes limited functionality, and that by itself isn't really a bad thing; what is bad is that with limited functionality comes flexibility, and as an unavoidable consequence comes diversity and a lack of any "standard".

    If RISC is the future, I'm afraid it's "a bad one", where you can grade chips and have tighter control over device longevity. Take Android for example: you have a new device, then two years in your device is either abandoned, or there is a security update you can't avoid for X application that just so happens to use instructions the "old chip" did not have, and boom: you now have a basically useless chip with a 100X performance penalty that is otherwise perfectly functional for your needs. And guess what? That application is just the beginning; using development tools from X (in this case Google, but it could be anyone), those instructions end up enabled in every single application and in the OS itself.

    How will that work on desktop PC? Hence why I'm not so convinced, in fact I'm very positive it will flop.

    Ahem.....what I said over at the article concerning GCC support for ARM's Neoverse N2


    Here is something QUITE INTERESTING !!

    It seems that the ARM Neoverse N2 chip will be the basis for the European Processor Initiative. The EPI is a Europe-wide consortium comprised of 27 vendor partners and 10 European countries which are determined to devise the Continent's common computing platform, ranging from Exascale HPC, A.I., Autonomous Autos, Servers and Cloud, Big Data, Space and Robotics.

    In other words....a Pan-European Data Integrity and Sovereign Compute platform. And x86 is NOWHERE to be seen. Nor could it be. That would be impossible.

    Furthermore, it looks like RISC-V chips will play a BIG role in the ARM-based European Common Compute Platform, as they look to serve as an "Accelerator Tile" in the chip package for data storage acceleration.

    According to a talk given by Linaro, and one of their slide sheets, ARM's Neoverse N2 will have ARM's latest ARMv8.6-A instruction additions, which among other things add General Matrix Multiply, BFloat16 support, additional SIMD matrix math instructions, and enhancements to virtualization.

    According to the timeline set out by the EPI, it looks like the EPI's ARM-based CPU, code-named RHEA, will arrive next year, and the first EPI Exascale computer with RHEA chips in it will arrive by the end of 2021 or 2022.

    Here are the docs pertaining to this European Processor Initiative. First is the actual PDF of the EPI announcement and outline

    https://sos23.ornl.gov/wp-content/up...-Denis-EPI.pdf


    And secondly here is the PDF of slides used in the Linaro talk in March of 2020 on the state of ARM in HPC where they mention that the ARM Neoverse N2 "Zeus" CPU will be the basis for the European Processor Initiative.

    https://static.linaro.org/connect/lt...TD20-106-0.pdf

