
Arch Linux Based EndeavourOS Begins Providing ARM Builds


  • #31
    Originally posted by Jumbotron View Post
    ... There ARE NO ORIGINAL IDEAS LEFT IN THE WORLD. Only "original" and "inventive" ways at repackaging. It's been that way for centuries with the written word and with music just to name two examples. And Apple has been THE MOST "original" and "Inventive" of ANY computer and tech company, particularly in the consumer space, at taking someone else's inventive and original idea and marketing to the people. ...
    An overall historical summary might prove your ideas very wrong. From the view of a senior manager, every decision we make has both good and bad aspects. Your claims of absolute certainty seem childish. Apple has made, and continues to make, many treacherous dead ends and errors. Magic and marketing create so much "fascination", much like the current POTUS (2016 - 2020), the "Leader Of The Free World", with his definitions of corporate greed as good and his versions of science and technology as "fake news".

    Apple is famous for using East Asian technologies of every type and size. It uses magic and legal brutality against every worker, supplier, person and organization, and against most nations. It seems not to matter whether an outsider sits upstream or downstream of Apple; all are open to attack and exploitation, any time and all the time.

    Apple knows best how to exploit East Asians, who use USA manufacturing technologies, and how to exploit Hollywood gullibility. Congratulations to the Apple shareholders, who probably included many Microsoft shareholders. Microsoft rescued Apple from bankruptcy because it needed Apple to "succeed" in order to avoid a corporate breakup of the Microsoft "monopoly". Microsoft's applications and money spared Microsoft the monopoly "mistake", while ensuring that the shareholders of both companies were "safe".

    Originally posted by Jumbotron View Post
    ... The only difference between Apple and Microsoft is that Apple copies from straight up legit tech marvels toiling away at companies run by boneheads who don't know the value of what their propeller heads have in the basement, and then Apple perfects said technology and releases it to the public as opposed to just being an obscure document in a patent office for the rest of time. And the WHOLE TIME makes said copied technology ACTUALLY USEFUL to the larger public instead of just for a bunch of Command Line and Terminal geeks. ...
    Apple stole the WIMP ideas from Xerox PARC. That is the standard evolution of cognitive science and human ergonomics. The CLI world evolved software and hardware together, with many contributors and innovators: not just Apple, and not only Apple. Old-timers like myself created and tried many of these ideas in hardware and software, and Apple often imitated us, for example by allowing more than one button or more than one option, in hardware and software. Apple continues to imitate other innovators, such as KDE Plasma and Android, in both hardware and software.



    • #32
      Originally posted by DanL View Post

      I don't understand why you challenge my assertion and then go on to offer a bunch of evidence that supports it.
      You said you had a hard time seeing MS follow Apple in making ARM-based devices, so I replied with a link to an MS ARM device with a purpose-built SoC and CPU, just as Apple is doing. That is literally the opposite of offering evidence that supports your assertion.

      I suppose my anecdotal evidence of "they're not popular" supports your position.



      • #33
        Jumbotron You seem convinced that RISC is the "future", and I see more and more people thinking that way.
        I'm not convinced at all. While it's extremely useful for limited devices (smartphones, some types of servers, etc.), there are multiple issues with it for "PC" usage. One is that with limited instructions comes limited functionality, and that by itself isn't really a bad thing. What is bad is that limited functionality invites flexibility, and with it, as an unavoidable consequence, diversity and the lack of any "standard".

        If RISC is the future, I'm afraid it's "a bad one", where vendors can grade chips and keep tighter control over device longevity. Take Android for example: you buy a new device, and two years in it is either abandoned, or there is an unavoidable security update for some application that just so happens to use instructions the "old chip" does not have, and boom: you now have a basically useless chip with a 100x performance penalty, even though it is otherwise perfectly functional for your needs. And guess what? That application is just the beginning: with the development tools from X (in this case Google, but it could be anyone), those instructions end up enabled in every single application, and eventually in the OS itself.

        How will that work on a desktop PC? Hence why I'm not convinced; in fact, I'm quite positive it will flop.
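
        To be fair, applications can guard against this with runtime feature detection instead of assuming the new instructions exist. Here is a minimal sketch in C of how that is usually done on Linux/AArch64, using getauxval(); HWCAP_ASIMDDP, the dot-product extension bit, is just an example feature to test:

        #include <stdio.h>
        #include <sys/auxv.h>   /* getauxval(), AT_HWCAP */
        #include <asm/hwcap.h>  /* HWCAP_* feature bits on AArch64 Linux */

        /* Generic fallback: plain C that runs on any AArch64 core. */
        static int dot_generic(const signed char *a, const signed char *b, int n)
        {
            int acc = 0;
            for (int i = 0; i < n; i++)
                acc += a[i] * b[i];
            return acc;
        }

        int main(void)
        {
            unsigned long hwcaps = getauxval(AT_HWCAP);
            signed char a[4] = {1, 2, 3, 4}, b[4] = {5, 6, 7, 8};

            if (hwcaps & HWCAP_ASIMDDP) {
                /* Safe to dispatch to a routine compiled separately with
                   -march=armv8.2-a+dotprod (the SDOT/UDOT instructions). */
                puts("dot-product extension present");
            } else {
                /* Older core: stay on the generic path instead of dying
                   with SIGILL on an unimplemented instruction. */
                puts("no dot-product extension, using fallback");
            }

            printf("dot = %d\n", dot_generic(a, b, 4));
            return 0;
        }

        Done that way, an old chip simply keeps the slower generic path; the 100x penalty only bites when developers skip the check and compile the new instructions in unconditionally, which is exactly what I'm describing above.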



        • #34
          Originally posted by Paradigm Shifter View Post
          Ooh. I think I'll give this a go; as much as I like Manjaro, it's always nice to have options. My XU4 isn't doing much right now.

          Maybe I've been lucky, but I've never had AUR builds fail due to Manjaro being out of sync with Arch as a whole. That said, I don't use that many AUR apps.


          It's funny, because Apple have a history of letting others take a risk first, stealing... uh, "reimagining" it, marketing it as new and somehow everyone falls over themselves thinking it's the greatest idea since fire. See: Xerox, tablet PCs, "smart" watches, portable media players, all-in-one PCs...


          I think it was in an old book somewhere: "There is nothing new under the sun."

          Ecclesiastes Chapter 1, Verse 9

          "History merely repeats itself. It has all been done before. Nothing under the sun is truly new."

          New Living Translation - The Bible

          Written somewhere around 2,300 years ago. The musings of King Solomon. Supposedly one of the wisest men to have ever lived. I think maybe that is correct.
          Last edited by Jumbotron; 24 September 2020, 11:56 AM.



          • #35
            Originally posted by leipero View Post
            Jumbotron You seem convinced that RISC is the "future", and I see more and more people thinking that way. ... How will that work on a desktop PC? Hence why I'm not convinced; in fact, I'm quite positive it will flop.

            You're about 20 years too late in your assumptions that RISC is the future. RISC is the present. x86 is now the past. The future arrived the moment the original ARM engineers came back to a test system that had been powered down and found it still computing, running on nothing but the stored power in the board's capacitors.

            By several orders of magnitude, there are more RISC chips in use today than all x86 gadgets combined. That will only be solidified and extended once Apple, by the end of 2022 at the very latest, transitions its entire hardware stack to ARM and RISC. The ENTIRE rest of the hardware manufacturers on the Wintel side, and even Google on the ChromeOS side, will follow suit. They already are: 2020 alone will see a record number of ARM and RISC Chromebook offerings, and that will only accelerate in 2021. Microsoft is already making ARM a first-class citizen for Windows and Windows app development this year. That will also accelerate in 2021.

            Why is RISC, spearheaded by ARM, the present and NOT the future? By far the best compute-per-watt efficiency compared with x86; customization of cores and core usage; add-ons such as DPUs, NPUs, FPGAs, DSPs, etc. as an integral part of the ARM IP and engineering ecosystem; and flexibility in packaging and BOM costs. And that is a VERY SMALL sample of the reasons that ARM and RISC are the PRESENT, not the future, of computing.

            Only with RISC can you build an entire software and hardware stack that scales from an IoT sensor or a smartwatch all the way to the world's fastest and most power-efficient supercomputer. It is physically impossible to do that with x86. That's TODAY... the PRESENT... not the future.

            You need to wake up. The world has already moved on from x86; it is a niche player now, only really relevant in the realms of "Big Iron" HPC and supercomputers. Where computing is REALLY relevant for the consumer is the world of RISC, primarily ARM, though MIPS and PowerPC see continued use, and there is now a new player on the RISC field, RISC-V. The more the merrier, in my opinion. Let a thousand RISC flowers bloom. It's long past time to send x86 to the same silicon heap as the Zilog Z80, the MOS Technology 6510 from the Commodore 64, the DEC Alpha and the Intel Itanium.



            • #36
              Originally posted by gregzeng View Post

              An overall historical summary might prove your ideas very wrong. ... Apple stole the WIMP ideas from Xerox PARC. ... Apple continues to imitate other innovators, such as KDE Plasma and Android, in both hardware and software.

              Please... spare us your snowflake tears about Apple exploiting East Asians. EVERY Western corporation exploits East Asians, simply because East Asian leaders LET their workers be exploited by Western corporations.

              Also spare us your sanctimonious whining about Apple copying everyone. The idiots running Xerox had NO clue what their engineers had, and those early computing innovations would NEVER have seen the light of day without Steve Jobs, Steve Wozniak and Apple et al. bringing them out to the public.

              If you ARE going to whine sanctimoniously about copying tech, then please reserve a bit of that whining for dear old Linus Torvalds and the creators and maintainers of KDE and GNOME. The Linux kernel is a COMPLETE ripoff of UNIX. KDE is a COMPLETE ripoff of the Windows DE, and with KDE Plasma 5 it has become an UTTER ripoff of the Windows 10 DE. GNOME is a ripoff of the MacOS DE, albeit less slavishly than KDE rips off Windows. Even ChromeOS is a ripoff of GNOME 3: take a look at the ChromeOS screen displaying all the apps, then hit the app icon in GNOME and look at that screen. The layout of the icons, and even the search bar at the top of the window, is nearly identical. Move the GNOME dock from the left side to the bottom of the desktop and GNOME and ChromeOS are just about 100% the same. Go into GNOME's settings and look at that window, then do the same in the ChromeOS settings. Once again... nearly 100% identical.

              What's the takeaway ??

              Apple copied Xerox.
              Microsoft copied Apple, who copied Xerox.
              Steve Jobs then copied Apple and Motif, on top of UNIX, to form NeXT and NeXTSTEP.
              Microsoft tried to copy NeXT but failed.
              Apple bought NeXT, and the MacOS then copied (moved over to) NeXTSTEP.
              Apple copied Nokia for the iPhone "smartphone".
              Apple copied Alan Kay's 1972 "Dynabook" concept for the iPad tablet (and if you REALLY want to get picky, they copied it from "2001: A Space Odyssey").
              Microsoft AND Google copied Apple, who copied Alan Kay and Nokia, in their smartphones and tablets. Only Google was successful.
              Microsoft and Intel tried to copy the MacBook Air design, itself also an extension of Alan Kay's 1972 Dynabook, in the sorry spectacle that was the "netbook".
              Google then copied the Wintel netbook, which was a copy of the MacBook Air, which was by extension a copy of Alan Kay's 1972 Dynabook, in the Chromebook, which runs Linux, which is Linus Torvalds' complete and utter ripoff of UNIX. And the ChromeOS DE is an UTTER ripoff of GNOME 3, with the only difference being that the ChromeOS dock defaults to the bottom of the desktop a la MacOS. And GNOME 3 is a ripoff of MacOS. And MacOS is a ripoff of the Xerox Star workstation.

              And round and round we go on the copying merry-go-round. NOBODY, and I mean NOBODY, is clean when it comes to ripping off someone else. So stop being a hypocrite about it.
              Last edited by Jumbotron; 24 September 2020, 04:43 PM.



              • #37
                Originally posted by Jumbotron View Post
                You're about 20 years too late in your assumptions that RISC is the future. RISC is the present. x86 is now the past. ... It's long past time to send x86 to the same silicon heap as the Zilog Z80, the MOS Technology 6510 from the Commodore 64, the DEC Alpha and the Intel Itanium.
                You should work for the ARM marketing team haha. Nice story, not much truth in there though.

                RISC isn't, and never will be, the future; it has existed as long as CISC, if not longer (the first computers were actually RISC). RISC does have its uses in limited scenarios where compute power isn't needed, so "efficiency goes up" when you have fewer transistors and less compute power... like specialized hardware (mining ASICs that are basically useless for anything else), only a bit more flexible. In the same way, you have supercomputers that are the "fastest" in one given scenario (or a few) while being completely useless for anything else.
                That by itself isn't a bad thing, and it can be very useful and productive for specialized workloads and the companies dealing with them. But claiming that RISC is the present, the future or whatever is nonsense. Well, it isn't nonsense for smartphones and specialized cases, but that's where it ends.

                Your statement "the more the merrier" is exactly why RISC failed from the start to find any use in computing outside specialized cases, and it is why things will remain that way for, well, forever.



                • #38
                  Originally posted by leipero View Post
                  Jumbotron You seem convinced that RISC is the "future", and I see more and more people thinking that way. ... How will that work on a desktop PC? Hence why I'm not convinced; in fact, I'm quite positive it will flop.

                  Ahem... what I said over at the article concerning GCC support for ARM's Neoverse N2:

                  Here is something QUITE INTERESTING!!

                  It seems that the ARM Neoverse N2 chip will be the basis for the European Processor Initiative. The EPI is a Europe-wide consortium, comprising 27 vendor partners across 10 European countries, determined to devise the continent's common computing platform, ranging across exascale HPC, AI, autonomous cars, servers and cloud, big data, space and robotics.

                  In other words... a pan-European data-integrity and sovereign-compute platform. And x86 is NOWHERE to be seen. Nor could it be; that would be impossible.

                  Furthermore, it looks like RISC-V chips will play a BIG role in the ARM-based European common compute platform, with their chips set to serve as an "accelerator tile" in the chip package for data-storage acceleration.

                  According to a talk given by Linaro, and one of their slides, ARM's Neoverse N2 will carry ARM's latest Armv8.6-A instruction additions, which among other things add general matrix multiply (GEMM), bfloat16 support, additional SIMD matrix-math instructions and enhancements to virtualization.
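
                  For a taste of what those Armv8.6-A additions look like from C: the new BFMMLA matrix-multiply instruction is exposed through intrinsics in arm_neon.h. Here is a minimal sketch, assuming a toolchain with -march=armv8.6-a+bf16 support; the operand layout described in the comments follows my reading of the ACLE documentation, so treat it as illustrative rather than authoritative:

                  #include <stdio.h>
                  #include <arm_neon.h>  /* build with -march=armv8.6-a+bf16 */

                  int main(void)
                  {
                      /* BFMMLA: multiply a 2x4 bf16 matrix by a 4x2 bf16 matrix
                         and accumulate into a 2x2 f32 result, in one instruction. */
                      float a_rows[8] = {1, 2, 3, 4,    /* row 0 of A (2x4) */
                                         5, 6, 7, 8};   /* row 1 of A */
                      float b_cols[8] = {1, 0, 0, 0,    /* column 0 of B (4x2) */
                                         0, 1, 0, 0};   /* column 1 of B */

                      /* Narrow f32 to bf16 and pack 8 lanes per vector. */
                      bfloat16x8_t a = vcombine_bf16(vcvt_bf16_f32(vld1q_f32(a_rows)),
                                                     vcvt_bf16_f32(vld1q_f32(a_rows + 4)));
                      bfloat16x8_t b = vcombine_bf16(vcvt_bf16_f32(vld1q_f32(b_cols)),
                                                     vcvt_bf16_f32(vld1q_f32(b_cols + 4)));

                      float32x4_t acc = vdupq_n_f32(0.0f);
                      acc = vbfmmlaq_f32(acc, a, b);    /* C += A * B */

                      float c[4];
                      vst1q_f32(c, acc);                /* {c00, c01, c10, c11} */
                      printf("%g %g\n%g %g\n", c[0], c[1], c[2], c[3]);
                      return 0;
                  }

                  A single instruction computing a 2x2 block of a float32 matrix product from bfloat16 inputs is exactly the kind of ML-oriented addition the N2 brings.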

                  According to the timeline set out by the EPI, its ARM-based CPU, code-named RHEA, will arrive next year, and the first EPI exascale computer built on RHEA chips will arrive by the end of 2021 or in 2022.

                  Here are the docs pertaining to the European Processor Initiative. First, the actual PDF of the EPI announcement and outline:

                  https://sos23.ornl.gov/wp-content/up...-Denis-EPI.pdf


                  And second, here is the PDF of the slides used in Linaro's March 2020 talk on the state of ARM in HPC, where they mention that the ARM Neoverse N2 "Zeus" CPU will be the basis for the European Processor Initiative.

                  https://static.linaro.org/connect/lt...TD20-106-0.pdf



                  • #39
                    Originally posted by leipero View Post

                    You should work for the ARM marketing team haha. Nice story, not much truth in there though. RISC isn't, and never will be, the future... Your statement "the more the merrier" is exactly why RISC failed from the start to find any use in computing outside specialized cases, and it is why things will remain that way for, well, forever.

                    You're right. RISC isn't the future. It is the present. Here's the gist of what I just posted on the article about GCC wiring up support for ARM's Neoverse N2 (see my post #38 above for the full details and the PDF links): the Neoverse N2, with its Armv8.6-A additions, will be the basis for the European Processor Initiative's pan-European compute platform, with RISC-V accelerator tiles alongside it, and x86 is NOWHERE to be seen.



                    Also...

                    And the ARM hits just keep on coming:

                    "Strong hints have emerged that VMware is close to making the Arm version of its ESXi hypervisor a proper product."

                    https://www.theregister.com/2020/09/...w_esxi_on_arm/



                    • #40
                      Jumbotron
                      Good luck with that.
                      Also, it seems like you are suggesting that RISC-V might at some point become CISC and take over the proprietary x86 arch. Well sure, something, some day, will take over x86, but it will still be CISC, not RISC. Not anytime soon, though.

                      I'm not sure why you are attempting to convince others (me) that you are right. You believe so; I do not, and never will, because I'm convinced that CISC is the way and RISC is far too limited. IMO you might be right only if the actual definitions of both concepts change, and tomorrow's RISC becomes present-day CISC.
