Debian Dropping A Number Of Old Linux Drivers Is Angering Vintage Hardware Users

  • #31
Originally posted by fransdb View Post
Follow the suggestion of @skeevy420 and read about it. It has also been running for longer than just the last month. By the way, obsolete hardware does not exist, except in the eye of the beholder. I still run my old Z80 system (1980) every month or so. It has my inventory program running in 64KB and two floppy disks, written in a language called Pilot. It still runs, and there is no pressing need to rebuild it as something that requires hundreds of MB to run nowadays. My point is that you do not need the newest and fastest hardware for every task.
Obsolete isn't the same thing as non-functional. Despite Commodore being long dead, the Amiga still has a cult following with an active demo scene, but these people don't expect modern software to support their machines, even with more recent PPC or FPGA accelerator cards like the Vampire. I really don't see old PPC Macs or other machines from the 90s and very early 2000s as being any different in this regard.

    Also, yes. That really is an OCS Amiga demo from last year.

    Originally posted by zxy_thf View Post
    ...
The original research behind RISC, or Reduced Instruction Set Computing, showed that CPUs spend most of their time running a relatively small number of simple instructions, and that a big instruction set with lots of complex instructions will lose out to one that uses the same transistor budget for fewer but faster instructions along with things like superscalarity, cache and branch prediction. Superscalarity in particular has a huge performance benefit, and it's much more effective and far easier to implement with a small set of instructions that all take equally long to run (rather than, say, the Motorola 68k, where different immediate instructions could take anything from 4 to 20 cycles). First they proved this in theory, then they proved it in practice, and since then it's more or less been how CPUs are made. Even x86, which translates its instructions into internal micro-ops and thus allows for truly vast changes on the inside without breaking compatibility, has been pretty RISC-like on the inside for decades already.
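
To make the load/store split concrete, here's a trivial C snippet. The instruction sequences named in the comments describe how such a statement is classically encoded on each kind of ISA; they're illustrative, not dumped from a real build, since actual output depends on compiler and flags.

[CODE]
#include <stdio.h>

int counter;

int main(void) {
    /* On x86 this is classically encoded as a single read-modify-write
     * instruction, e.g. "add dword ptr [counter], 1", with a long and
     * variable latency. On a load/store RISC ISA it becomes three
     * simple, uniform-latency instructions, e.g. RISC-V "lw / addi / sw",
     * which is exactly what makes them easy for a superscalar core
     * to schedule several at a time. */
    counter += 1;
    printf("%d\n", counter);
    return 0;
}
[/CODE]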

As for the efficiency of ARM compared to x86, it's mostly down to it starting out as a clean-sheet RISC architecture and not having to do the on-the-fly translation to micro-ops. Having a clean and simple instruction set is still as big of an advantage as it ever was. Both x86 and ARM have for a long time been so-called "post-RISC" architectures: very RISC-like cores with better and better vector instruction units tacked on for math operations.
    Last edited by L_A_G; 28 April 2020, 01:18 PM.

    Comment


    • #32
Originally posted by fransdb View Post
The general feeling is that as long as it works, you want to have the latest OS because of new features and/or security updates. When that falls away, your perfectly working device just - overnight - turns into a heap of scrap.

      You're right, it happens overnight because Debian always decides to do this kind of stuff last-minute, no announcement whatsoever... oh wait.

      Comment


      • #33
        Originally posted by linuxdesktop View Post
        One of the things that makes Debian great is the backward compatibility.
        Are you sure about that? 'Cause every time the backwards compatibility argument comes up (no matter the site or forum), Windows gets all the praise.

        Comment


        • #34
          I remember the Rage 128 that I still have in a box somewhere. Running aMule on Windows 98 with the latest drivers would crash the kernel but it was rock-solid on Linux.

          Comment


          • #35
Originally posted by zexelon View Post
The fact that you can even get a modern distro running on 20-year-old hardware is a testament to how much legacy code is still in the Linux system... I highly doubt you would get Windows 10 running with any drivers on a 20-year-old system... haven't tried it myself though, so maybe it's possible.
Windows 10 is more widely supported than Linux-only users think. The basic requirement is a WDDM 1.0 driver (a Vista driver), so chances are you could get any DX9 GPU from AMD or Nvidia working. The latest build of Win10 also still supports 32-bit systems.
I tried a GeForce 6600 myself on Win10 prior to selling it, to test its functionality, and it seemed to "work". I only tested browsing and such, but it provided the higher resolutions and reported that hardware acceleration was working.


And for all the good that open source provides in theory, I really wish people would stop saying that upset users could simply contribute themselves. Relatively few have the engineering competence to write drivers. It's just such a bad argument.
            Last edited by Vulkan; 21 April 2020, 11:52 AM.

            Comment


            • #36
              Originally posted by schmidtbag View Post
* Drop packages that will likely never be used (like drivers only found on PCIe devices, anything that depends on OpenGL 3.x, anything that depends on modern instruction sets, etc.)
I agree with your general sentiment (cut off anything requiring PCIe, maintain older drivers, etc.), but I've got a bit of a counterexample here:

One of my hobby machines is a 1996-ish Compaq (RIP DEC) Alpha Personal Workstation 500a: 500 MHz 64-bit CPU, 640MB RAM (I've got enough sticks to get it to 1.5GB if I can ever port the memtest functions over to Alpha and diagnose some issues), and I bought a PCI Radeon 5400 to drive the display(s). It'll happily do GL 3.3, and I'm not sure if it supports enough to do GL 4.0 even with FP64 emulation in place. Gnome Shell runs, although swapping to a PIO mode 4 IDE drive sucks (16.6 MB/s theoretical max, no DMA due to a chipset bug, realistically 1-6MB/s at 100% CPU).
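
That 16.6 MB/s ceiling checks out, by the way: PIO mode 4 moves 16 bits per I/O cycle at a 120 ns minimum cycle time (per the ATA-2 spec, if I remember it right), and the CPU has to drive every single cycle since there's no DMA. A quick sanity check in C:

[CODE]
#include <stdio.h>

int main(void) {
    const double cycle_s = 120e-9;  /* PIO mode 4 minimum cycle time */
    const double bytes   = 2.0;     /* 16-bit transfers */
    /* 2 bytes / 120 ns = ~16.7 MB/s, matching the ~16.6 figure */
    printf("PIO-4 ceiling: %.1f MB/s\n", bytes / cycle_s / 1e6);
    return 0;
}
[/CODE]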

              Yes, this machine is probably unique, and all that really runs on this architecture these days is either some flavor of BSD or Gentoo. I went with Gentoo.

              Comment


              • #37
                Originally posted by fransdb View Post

I take offence to the above posting. Clearly posted by someone who shows a lack of empathy.
LOL. Yeah, I am an evil fascist who wants to force people still on early 90 MHz Pentium PCs to upgrade to something more recent if they want to use the latest year-2020 Debian version...

                Comment


                • #38
                  Originally posted by zexelon View Post
Un-maintained "legacy" code is a security risk. Even if it passes all the regression testing and whatnot, it's still sitting there causing "side effects". Shrinking a code base is always the best move, but obviously a balance between compatibility and maintainability must always be struck.

If someone is still using 20-year-old hardware, they should really be running 20-year-old software on it, in my opinion. Software and hardware are always intrinsically linked. The fact that you can even get a modern distro running on 20-year-old hardware is a testament to how much legacy code is still in the Linux system... I highly doubt you would get Windows 10 running with any drivers on a 20-year-old system... haven't tried it myself though, so maybe it's possible.
There is no chance in hell of running Windows 10 on anything 20 years old. Back then, high-end PCs had around 128MB of RAM. Windows 10 needs 1GB as a baseline, and that's the absolute minimum; anything lower and you will be hitting virtual memory just loading the interface at startup...

                  Comment


                  • #39
                    Originally posted by fransdb View Post

That is, you're assuming - so it seems - that these people don't use more modern hardware, at least if I look at your last words about "the world moves on". Some people have built a past life, with memories and feelings. Holding on to some of it is not wrong, and it shows others that there was life before this age. Young people will find themselves there too, after some years. So be aware of qualifying (explicit or implicit) expressions.
I might be older than you think. And I don't have anything against vintage hardware (I regret throwing away the first PC I owned), but my thinking was more that we shouldn't hold modern hardware back by keeping old drivers around or catering to outdated hardware. There is certainly a place for a (modern) vintage distribution (let's call it Potato Linux), as the user schmidtbag outlined in this thread.

                    Comment


                    • #40
                      Originally posted by Creak View Post

The CPU performance difference between x86_64 and ARM is actually more about the CPU architecture than the bus width (32-bit vs 64-bit). ARM has a simpler, more optimized set of instructions, while x86_64 has a huge set of instructions (mainly through extensions like SSE, AVX, and the like). This would actually be the main performance difference (and a huge one). If you take two 64-bit CPUs, one ARM64 and one x86_64, you'll see that, at equal frequencies, ARM will beat x86_64 easily. The flip side is that most programs are compiled for x86_64 only... so we need to wait until we see official Windows 10 machines sold with ARM64 CPUs; there we might see the real rise of ARM CPUs in the desktop market.
ARM cores are nowhere near x64 cores in IPC or max clocks, dude... Like, seriously, stop drinking the ARM Kool-Aid. Yes, ARM cores are impressive in phones and tablets, but expecting them to somehow compete with Intel and AMD on the desktop is delusional. I've seen people, even "tech journalists", make this claim for a decade, and it hasn't happened yet, and never will. ARM cores are designed for low area and low energy; they do not scale well on the desktop. If ARM wants to bring that ISA to the desktop, they have to change the architecture considerably to compete. At best, they can create netbook/Chromebook alternatives and/or small ITX boxes. And it will hardly be worth it for people who want to run Windoze.
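
The disagreement here really comes down to throughput being IPC times clock, so "at equal frequencies" comparisons hide the clock ceiling. A trivial sketch, with numbers invented purely for illustration (not benchmark figures for any real core):

[CODE]
#include <stdio.h>

int main(void) {
    /* hypothetical cores: ARM wins per clock, x64 wins on clocks */
    const double arm_ipc = 3.0, arm_ghz = 2.5;
    const double x64_ipc = 2.5, x64_ghz = 4.5;
    printf("ARM: %5.2f G inst/s\n", arm_ipc * arm_ghz);  /*  7.50 */
    printf("x64: %5.2f G inst/s\n", x64_ipc * x64_ghz);  /* 11.25 */
    return 0;
}
[/CODE]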

The opposite is more likely: Intel and AMD shrinking their x64 cores so much that they can put them into tablets and phones. x64 cores are already small as it is; a few process nodes later, we might be able to get 4 or 8 x64 cores in a phone if we clock them lower. Imagine having x64 software, even Windoze, on your phone... ARM would lose a lot of customers then.

Also, contrary to popular belief, the cost of the x64 instruction set is TINY these days. You know, 7nm and all...

                      Comment
