Ubuntu MATE / Studio / Budgie All End Their 32-bit ISOs For New Releases


  • #31
    Originally posted by starshipeleven View Post
    For the love of Zod, why use CRT monitors? There are bazillions of LCD screens with VGA ports.
    There are at least two big reasons for retro gaming that I can think of: you can't use a lightgun with an LCD (the old Nintendo/Sega/etc. consoles), and LCDs have much more lag than CRTs.



    • #32
      calc Also, if you want to use the native resolution and play the game as it was "meant to be played", LCDs are out of the question. Since my last CRT died last year, I find LCDs useless for NES/SNES games. All the tricks of using OpenGL instead of software drawing, with blurs and all sorts of "improvements", just make the games look far worse than they should. Plus, on 3 different low-end displays I have a problem with blur that makes "network-like" 8-bit textures look like c**p when the camera is moving (for example, the fence in the first level of the first Ninja Gaiden/Ryukenden title). It just hurts my eyes, and there's no way I can solve it (short of buying an extremely expensive display that MIGHT solve the problem, but I seriously doubt it).



      • #33
        Originally posted by starshipeleven View Post
        For the love of Zod, why use CRT monitors? There are bazillions of LCD screens with VGA ports.
        LCDs have a fixed resolution.
        Cheap LCDs with VGA ports tend to do very simple tricks when the input resolution doesn't match the output resolution, like only doing "nearest neighbour" scaling.

        Games tend to use a bazillion different resolutions, even more so games from small independent makers with "demoscene" roots, who tend to reprogram the resolution manually.
        None of these are exact divisors of the LCD's native resolution, so you'll get "weird uneven zoomed pixels".
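
        For a quick sketch of why (my numbers, purely illustrative: a hypothetical 320-pixel-wide game mode on a 1024-pixel-wide panel), the scale factor is 3.2, so nearest neighbour has to draw some source pixels 3 screen columns wide and others 4:

        src_w, dst_w = 320, 1024  # hypothetical game mode vs. panel width (ratio 3.2)
        # Nearest neighbour: output column x copies source column floor(x * src_w / dst_w).
        mapping = [x * src_w // dst_w for x in range(dst_w)]
        # How many screen columns does each source pixel end up covering?
        widths = [mapping.count(c) for c in range(src_w)]
        print(sorted(set(widths)))  # -> [3, 4]: uneven "zoomed" pixels

        Only an integer scale factor (say 320 -> 1280, exactly 4x) keeps every pixel the same width; 3.2x cannot.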

        Also, at some point you'll want to reboot the old "legacy/vintage" gaming PC into an OS with a network connection, if for nothing else than to download/upload stuff (newly discovered/obtained legacy games to try) from your NAS.

        Linux is one such OS. (The other options are daring to connect an old, no-longer-patched Windows to the network, or attempting some monstrosity under FreeDOS involving packet drivers, Arachne and a home web server.)


        Of course, yet another solution is to use a CompactFlash-to-IDE pin adapter whose CF card slot is externally accessible. In that case you shut down your legacy machine, yank the CF card out, and plug it into a USB card reader (or a CF-to-PC Card (16-bit)/PCMCIA pin adapter, if your laptop still supports those).
        That's what some of the devs of the 8088 MPH vintage PC demo used.




        • #34
          Originally posted by leipero View Post
          dungeon I understand marketing, but there is no real reason to move away from 64-bit AFAIK, and keeping compatibility with new extensions for both 64-bit and 32-bit makes it so complicated that I don't see anyone doing it for that purpose.
          There's also no reason for laptops aimed at casual users (i.e. web, office and mail and not much else) to have 8 GB of RAM, yet it's becoming more and more popular to do that. So they will always find a reason to do something, even to move to 128-bit. Also, I think that by the time 128-bit truly becomes a thing, 32-bit will not even be considered as an extension.

          Also, IBM already has 128-bit high-end computers. Yes, I know they're not aimed at people like us, but it does show that 128-bit already exists.



          • #35
            Originally posted by Vistaus View Post
            There's also no reason for laptops aimed at casual users (i.e. web, office and mail and not much else) to have 8 GB of RAM, yet it's becoming more and more popular to do that. So they will always find a reason to do something, even to move to 128-bit. Also, I think that by the time 128-bit truly becomes a thing, 32-bit will not even be considered as an extension.
            You might be surprised at how much memory a few open modern webpages take. Facebook, Google Docs, etc. take huge amounts of memory. I wouldn't recommend anyone get anything less than 8GB of RAM in a system. Even the oldest system I still have, which is 10 years old, has 8GB in it, which was the most it could take. My newer systems all have 32GB, and I'm frequently using over half of that. Pretty much any system other than the cheapest at somewhere like Wal-Mart has at least 8GB in it; a friend even gave me a system they didn't want anymore that had 12GB.



            • #36
              Originally posted by leipero View Post
              dungeon Yes, but that is a prediction "at historic rates", and in the last decade-plus it's not stagnant, but it did slow down substantially. If I understand it correctly, supercomputers use a modular approach, so it really should not be required at all (there's always the possibility that I don't understand it the right way). My point here is that there are real physical limitations approaching 5nm regardless of the materials used, so it's hard to say what will come next; if it's quantum computing, then it will require a completely different approach to memory addressing, and the whole talk about X bits will become irrelevant.

              At the current rate of growth, even supercomputers are far away from 18 million terabytes in 2050, let alone 2030, and our little desktops will probably never use that amount of memory in a billion years.
              Better believe in these historic rates... say, in 2028 the first 128-bit desktop CPU is introduced (with the vast majority of course still using 64-bit), then in 2038 it becomes the new norm (a fifty-fifty split), and finally in 2048 64-bit support starts getting dropped, just like 32-bit now.

              We just need to be alive to see it, of course, but it will happen. So around 2050 we won't see 64-bit used much anymore, just like 32-bit now, and in 2058 we (or if not us, the next generation) will see 64-bit only in museums.



              • #37
                Originally posted by starshipeleven View Post
                Someone does, and many post that on the internet, but it's a very niche thing.
                For the love of Zod, why use CRT monitors? There are bazillions of LCD screens with VGA ports.
                Not for profit though; any time I need something I can get it for peanuts off eBay. Also netbooks are shit hardware to begin with.
                LCDs have tearing; this is why FreeSync and G-Sync are a thing. The refresh rate on CRTs is higher. I had 21-inch CRTs with a refresh rate of 120Hz; that's 120 fps tear-free. The resolution went up to 2000 x whatever, I don't remember. Yeah, it's a niche thing, but I could have sold them; instead I put them at the side of the road.

                CRTs are worse for the environment compared to LCDs.



                • #38
                  Originally posted by DrYak View Post
                  Of course, yet another solution is to use a CompactFlash-to-IDE pin adapter whose CF card slot is externally accessible. In that case you shut down your legacy machine, yank the CF card out, and plug it into a USB card reader (or a CF-to-PC Card (16-bit)/PCMCIA pin adapter, if your laptop still supports those).
                  Speaking of which, here's a cool piece of trivia that your phrasing suggests you know but many others don't: CompactFlash-to-IDE and CompactFlash-to-PCMCIA adapters are both passive. CompactFlash cards natively speak both PATA and PCMCIA, and there's a pin on them that determines which mode they boot into.



                  • #39
                    Originally posted by leipero View Post
                    calc Also, if you want to use the native resolution and play the game as it was "meant to be played", LCDs are out of the question. Since my last CRT died last year, I find LCDs useless for NES/SNES games. All the tricks of using OpenGL instead of software drawing, with blurs and all sorts of "improvements", just make the games look far worse than they should. Plus, on 3 different low-end displays I have a problem with blur that makes "network-like" 8-bit textures look like c**p when the camera is moving (for example, the fence in the first level of the first Ninja Gaiden/Ryukenden title). It just hurts my eyes, and there's no way I can solve it (short of buying an extremely expensive display that MIGHT solve the problem, but I seriously doubt it).
                    There is no such thing as "native resolution" or "meant to be played"; that only reminds me of some company's marketing slogan. The NES/SNES actually output 8:7 internally; that is the exactly correct pixel aspect ratio per what the hardware does... so it was stretched to 4:3 anyway on CRTs (which is what most people remember it as, but what most remember is slightly incorrect, really).

                    It is really just 8:7 that the hardware does. Now, some games improperly compensated for 4:3, because, well, most users had 4:3 CRTs... so you have some games that look best at one aspect ratio and some at the other.
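
                    To put numbers on that (my illustration, using the SNES's 256x224 frame): stretching an 8:7 image onto a 4:3 screen widens every pixel by 7/6:

                    from fractions import Fraction

                    storage_ar = Fraction(256, 224)  # SNES frame: 256x224 pixels = 8:7
                    display_ar = Fraction(4, 3)      # a 4:3 CRT television
                    # Pixel aspect ratio implied by stretching 8:7 content to fill 4:3:
                    print(display_ar / storage_ar)   # 7/6: pixels ~1.17x wider than tall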
                    Last edited by dungeon; 07 May 2018, 03:02 PM.



                    • #40
                      Originally posted by kneekoo View Post
                      The people using 15+ year-old hardware surely have to understand that it's time to move on to something newer. Second-hand PCs capable of 64-bit software are quite cheap, and that's great news.
                      Why don't people understand that newer isn't necessarily better? My gorgeous 16:10 Precision laptop has amazing build quality and a screen that you can't buy anymore. With PAE I can use all 4GB of RAM even though my Core CPU is 32-bit only. Why push people to generate e-waste when many 32-bit machines are still more than powerful enough for their users' needs? For basic web browsing and SSH'ing, that CPU is still overkill.
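
                      For anyone curious whether their own 32-bit box can do this, a minimal sketch (assuming Linux) that looks for the PAE CPU flag:

                      # Minimal sketch, assuming Linux: look for the "pae" flag in
                      # /proc/cpuinfo. PAE widens physical addresses to 36 bits, so a
                      # 32-bit kernel can address all 4GB (and more), while each
                      # process still gets a 32-bit virtual address space.
                      with open("/proc/cpuinfo") as f:
                          for line in f:
                              if line.startswith("flags"):
                                  print("PAE" if "pae" in line.split() else "no PAE")
                                  break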

                      Originally posted by kneekoo View Post
                      As long as there's still new hardware being sold with low RAM, there's a need for 32-bit software. The software people mostly seem to ignore this, the users are obviously not technically apt enough to understand the problem, and here we are, looking at more distros taking the options away, letting a lot of people trash their storage devices with swap. Great!
                      It's not just low-RAM consumer hardware. My company saved terabytes of RAM by running its low-RAM servers 32-bit. Delaying the transition to 64-bit allowed them to skip at least one HW upgrade cycle.
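
                      As a back-of-the-envelope for where savings like that come from (made-up numbers, purely illustrative, not the company's real figures): pointers and longs shrink from 8 bytes to 4, so a pointer-heavy fleet adds up quickly:

                      # Made-up numbers, purely illustrative of halving pointer width
                      # (8 bytes on 64-bit vs 4 bytes on 32-bit).
                      POINTERS_PER_PROCESS = 200_000_000  # a pointer-heavy service
                      PROCESSES_IN_FLEET = 2_000
                      saved = POINTERS_PER_PROCESS * (8 - 4) * PROCESSES_IN_FLEET
                      print(f"{saved / 2**40:.2f} TiB saved fleet-wide")  # ~1.46 TiB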
                      Last edited by slacka; 07 May 2018, 04:39 PM.

