Ubuntu MATE / Studio / Budgie All End Their 32-bit ISOs For New Releases


  • ssokolow
    replied
    Originally posted by DrYak View Post
    Of course, yet another solution is to use a CompactFlash-to-IDE pin adapter whose CF card slot is externally accessible. In that case you shut down your legacy machine, yank the CF card out, and plug it into a USB card reader (or a CF-to-16-bit PC Card/PCMCIA pin adapter, if your laptop still supports those).
    Speaking of which, here's a cool piece of trivia that your phrasing suggests you know but many others don't: CompactFlash-to-IDE and CompactFlash-to-PCMCIA adapters are both passive. CompactFlash cards natively speak both PATA and PCMCIA and there's a pin on them which determines which mode they boot into.

  • PackRat
    replied
    Originally posted by starshipeleven View Post
    Someone does, and many post that on the internet, but it's a very niche thing.
    For the love of Zod, why use CRT monitors? There are bazillions of LCD screens with VGA ports.
    Not for profit though; any time I need something I can get it for peanuts off eBay. Also, netbooks are shit hardware to begin with.
    LCDs have tearing; that's why FreeSync and G-Sync are a thing. The refresh rate on CRTs is higher. I had 21-inch CRTs with a refresh rate of 120 Hz, which means 120 fps tear-free, and the resolution went up to 2000 x whatever, I don't remember. Yeah, it's a niche thing, but I could have sold them. Instead I put them out at the side of the road.

    CRTs are worse for the environment compared to LCDs.

  • dungeon
    replied
    Originally posted by leipero View Post
    dungeon Yes, but that is a prediction "at historic rates", and in the last decade-plus it hasn't been stagnant, but it did slow down substantially. If I understand it correctly, supercomputers use a modular approach, so it really should not be required at all (there's always a possibility that I don't understand it the right way). My point here is that there are real physical limitations approaching 5 nm regardless of the materials used, so what comes next is hard to say; if it's quantum computing, then it will require a completely different approach to memory addressing, so the whole talk about X bits will become irrelevant.

    At the current rate of growth, even supercomputers are far away from 18 million terabytes in 2050, let alone 2030, and our little desktops will probably never use that amount of memory in a billion years.
    Better believe in these historic rates... say in 2028 the first 128-bit desktop CPU is introduced (the vast majority of course still using 64-bit), then in 2038 it becomes the new norm (a fifty-fifty split), and finally in 2048 64-bit support starts getting dropped, just like 32-bit now.

    We just need to be alive to see that, of course, but it will happen. So by about that 2050 we won't see 64-bit used much anymore, just like 32-bit now, and by 2058 we (or if not us, the next generation) will see 64-bit only in museums.
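
    For anyone who wants to sanity-check those numbers, here is a rough back-of-the-envelope sketch in Python. The starting memory size and the doubling period are illustrative assumptions of mine, not figures from the RISC-V spec:

    ```python
    # 2^64 bytes is roughly 18.4 million (decimal) terabytes, which is where the
    # "18 million terabytes" figure in this thread comes from. The starting size
    # and doubling period below are illustrative assumptions, not measured data.

    TB = 10 ** 12
    limit_64bit = 2 ** 64                  # bytes addressable with 64 bits
    print(f"2^64 bytes = {limit_64bit / TB:,.0f} TB")   # -> 18,446,744 TB

    start_year = 2018
    memory = 2 ** 40                       # assume 1 TiB in a large machine today
    doubling_period = 2                    # assume memory doubles every 2 years

    year = start_year
    while memory < limit_64bit:
        year += doubling_period
        memory *= 2
    print(f"At that doubling rate, 2^64 bytes is outgrown around {year}")
    ```

    With these particular assumptions the crossover lands decades out; assume a bigger starting machine or a faster doubling rate and it lands much earlier, which is why the assumed "historic rate" matters so much here.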

  • calc
    replied
    Originally posted by Vistaus View Post
    There's also no reason for laptops aimed at casual users (i.e. web, office and mail and not much else) to have 8 GB of RAM, yet it's becoming more and more popular to do that. So they will always find a reason to do something, even to move to 128-bit. Also, I think by the time 128-bit truly becomes a thing, 32-bit will not even be considered as an extension.
    You might be surprised at how much memory a few open modern webpages take. Facebook, Google Docs, etc. take huge amounts of memory. I wouldn't recommend anyone get anything less than 8 GB of RAM in a system. Even the oldest system I still have, which is 10 years old, has 8 GB in it, which was the most it could take. My newer systems all have 32 GB and I'm frequently using over half of that. Pretty much any system other than the cheapest at somewhere like Wal-Mart has at least 8 GB in it; a friend even gave me a system they didn't want anymore that had 12 GB.

  • Vistaus
    replied
    Originally posted by leipero View Post
    dungeon I understand marketing, but there is no real reason to move away from 64-bit AFAIK, and maintaining compatibility with new extensions for both 64-bit and 32-bit makes it even more complicated, so I don't see anyone doing it for that purpose.
    There's also no reason for laptops aimed at casual users (i.e. web, office and mail and not much else) to have 8 GB of RAM, yet it's becoming more and more popular to do that. So they will always find a reason to do something, even to move to 128-bit. Also, I think by the time 128-bit truly becomes a thing, 32-bit will not even be considered as an extension.

    Also, IBM already has 128-bit high-end computers. Yes, I know that they're not aimed at people like us, but it does show that 128-bit already exists.

  • DrYak
    replied
    Originally posted by starshipeleven View Post
    For the love of Zod, why use CRT monitors? There are bazillions of LCD screens with VGA ports.
    LCDs have a fixed resolution.
    Cheap LCDs with VGA ports tend to do very simple tricks when the input resolution doesn't match the output resolution, like plain nearest-neighbour scaling.

    Games tend to use a bazillion different resolutions. Even more so games from small independent makers who have some "demoscene" roots and tend to reprogram the resolution manually.
    None of these resolutions are exact divisors of the LCD's native resolution, so you'll get "weird uneven zoomed pixels".
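
    As a rough illustration of the mismatch (a minimal sketch; the panel resolution and the list of game modes are just example values, not tied to any specific monitor):

    ```python
    # Minimal sketch: how some common legacy game modes map onto a fixed-resolution
    # LCD that simply stretches the VGA input to fill the panel. Non-integer scale
    # factors are what produce the "weird uneven zoomed pixels" with a naive
    # nearest-neighbour scaler. Panel and mode values here are only examples.

    panel_w, panel_h = 1920, 1080          # example native panel resolution

    game_modes = [(320, 200), (320, 240), (640, 480), (800, 600)]

    for w, h in game_modes:
        sx, sy = panel_w / w, panel_h / h  # per-axis stretch factors
        clean = sx.is_integer() and sy.is_integer()
        print(f"{w}x{h} -> x{sx:.2f} horizontally, x{sy:.2f} vertically: "
              f"{'even pixels' if clean else 'uneven pixel sizes'}")
    ```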

    Also, at some point you'll want to reboot the old "legacy/vintage" gaming PC into an OS with a network connection, if nothing else to download/upload stuff (newly discovered/obtained legacy games to try) from your NAS.

    Linux is one such possible OS. (The other solutions are daring to connect an old, no-longer-patched Windows to the network, or trying some monstrosity under FreeDOS involving packet drivers, Arachne and a home web server.)


    Of course, yet another solution is to use a CompactFlash-to-IDE pin adapter whose CF card slot is externally accessible. In that case you shut down your legacy machine, yank the CF card out, and plug it into a USB card reader (or a CF-to-16-bit PC Card/PCMCIA pin adapter, if your laptop still supports those).
    That's what some of the devs of the 8088mph vintage PC demo used.


  • leipero
    replied
    calc Also, if you want to use the native resolution and play the game as it was "meant to be played", LCDs are out of the question. Since my last CRT died last year, I find LCDs useless for NES/SNES games: all the tricks of using OpenGL instead of software drawing, the blurs and all sorts of "improvements" just make the game look far worse compared to how it should look. Plus, on 3 different low-end displays I have a problem with blur that makes "network-like" 8-bit textures look like c**p when the camera is moving (for example, the fence on the first level of the first Ninja Gaiden/Ryukenden title). It just hurts my eyes and there's no way I can solve it (short of buying an extremely expensive display that MIGHT solve the problem, but I seriously doubt it).
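
    For what it's worth, a common workaround on LCDs is plain integer prescaling with nearest-neighbour and black borders rather than any blur filter. A minimal sketch of the idea (the 256x240 source and the 1920x1080 panel are just example numbers):

    ```python
    # Minimal sketch of integer ("pixel-perfect") prescaling: pick the largest
    # whole-number scale that still fits the panel, centre the image, and accept
    # black borders instead of blurring. The resolutions below are example values.

    source_w, source_h = 256, 240          # NES/SNES-style output (example)
    panel_w, panel_h = 1920, 1080          # target LCD panel (example)

    scale = min(panel_w // source_w, panel_h // source_h)   # largest integer fit
    out_w, out_h = source_w * scale, source_h * scale
    border_x = (panel_w - out_w) // 2
    border_y = (panel_h - out_h) // 2

    print(f"Scale x{scale}: {out_w}x{out_h} image, "
          f"{border_x}px side borders, {border_y}px top/bottom borders")
    ```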

  • calc
    replied
    Originally posted by starshipeleven View Post
    For the love of Zod, why use CRT monitors? There are bazillions of LCD screens with VGA ports.
    There are at least two big reasons for retro gaming I can think of: you can't use a lightgun with an LCD (old Nintendo/Sega/etc. consoles), and LCDs have much more lag than CRTs.

  • leipero
    replied
    dungeon Yes, but that is a prediction "at historic rates", and in the last decade-plus it hasn't been stagnant, but it did slow down substantially. If I understand it correctly, supercomputers use a modular approach, so it really should not be required at all (there's always a possibility that I don't understand it the right way). My point here is that there are real physical limitations approaching 5 nm regardless of the materials used, so what comes next is hard to say; if it's quantum computing, then it will require a completely different approach to memory addressing, so the whole talk about X bits will become irrelevant.

    At the current rate of growth, even supercomputers are far away from 18 million terabytes in 2050, let alone 2030, and our little desktops will probably never use that amount of memory in a billion years.

  • dungeon
    replied
    Originally posted by leipero View Post
    dungeon I understand marketing, but there is no real reason to move away from 64-bit AFAIK, and maintaining compatibility with new extensions for both 64-bit and 32-bit makes it even more complicated, so I don't see anyone doing it for that purpose.
    RISC-V already has 128-bit, Linux is the most-used OS on supercomputers, and if supercomputers feel they are capped in their progression... what do you think will happen? From the RISC-V papers:

    "At historic rates of growth, it is possible that greater than 64 bits of address space might be required before 2030."
    Is that enough reasoning? These pushes don't happen just because you or I aren't capped on our desktops.
    Last edited by dungeon; 06 May 2018, 09:01 PM.
