
It's Past Time To Stop Using egrep & fgrep Commands, Per GNU grep 3.8


  • oiaohm
    replied
    Originally posted by ssokolow View Post
I haven't had a chance to get familiar with the Greaseweazle I bought yet and I can't build a FluxEngine until more of the requisite FPGA dev boards are available (and I need to set an eBay watch for a third 5.25" drive at an affordable price), though I am in the process of assembling three housings for them. (Yeah, I'm a bit of a collector when it comes to dumping tools. I also own two cartridge dumpers, neither of which is a Sanni because I haven't had time to get comfortable working with surface-mount components.)
    https://prog.world/recovering-lost-d...-oscilloscope/
    https://scarybeastsecurity.blogspot....ed-floppy.html

Greaseweazle is a good tool. There is one final option beyond it: have a Greaseweazle or FluxEngine control the drive while an oscilloscope taps the head signal inside the drive itself for the damaged sections.

Yes, the oscilloscope option lets you pull data off floppies from what would otherwise be classed as bad sectors.

There are basically three levels here:
1) A normal floppy controller. This gets you about 90% of the way there with most PC floppy discs.
2) Devices like Greaseweazle and FluxEngine. These are essentially controller upgrades: the best controller possible, having the drive read as many discs as it can, as well as it can, over the normal drive cable.
3) An oscilloscope connected to the diagnostic points of the floppy drive to access the raw head signal. You still need something like a Greaseweazle or FluxEngine to control the drive and stay aligned with what the oscilloscope is collecting.
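What the level-2 tools do in software can be sketched roughly: they sample the time between flux reversals and turn those intervals into bits. A minimal illustration of the MFM side of that decoding (the 1 µs channel-cell time for a high-density PC floppy and the 2-4 cell spacing are standard MFM facts, but the code itself is a sketch, not taken from Greaseweazle or FluxEngine):

```python
def flux_to_channel_bits(intervals_us, cell_us=1.0):
    """Turn flux-reversal spacings into an MFM channel-bit stream.

    Each flux reversal is a 1 channel bit; the gap before it is a run
    of 0 bits.  Legal MFM spacings are 2, 3, or 4 channel cells, so
    noisy timings are rounded and clamped to that range.  Illustrative
    sketch only; cell_us=1.0 corresponds to a 500 kbit/s HD floppy.
    """
    bits = []
    for dt in intervals_us:
        cells = max(2, min(4, round(dt / cell_us)))
        bits.extend([0] * (cells - 1) + [1])
    return bits

# A clean 2-, 3-, 4-cell sequence decodes to the expected pattern:
# flux_to_channel_bits([2.0, 3.1, 3.9]) -> [0, 1, 0, 0, 1, 0, 0, 0, 1]
```

The clamping is what buys tolerance on marginal discs: a spacing that has drifted to 3.4 cells still lands on a legal value instead of being rejected outright, which is exactly where these controllers beat a stock one.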

With the oscilloscope method, the model and make of drive matter. On some drives you have to reverse-engineer the PCB to work out where to connect to get the raw head signal.

This might seem like a new 2021 method, but the oscilloscope approach is in reality a very old one, used by data recovery firms with their custom floppy controllers back in the day (people having a single copy of something ultra-important is not a new problem). When a Greaseweazle fails, it is normally on a damaged disc. With due care, the data on such a floppy is still 100 percent recoverable as long as the surface has not been destroyed or magnetically erased. The most common cause of a damaged floppy surface is contamination on the disc, which destroys disc and head at the same time.

That's right: the most common reason a floppy is dead is that someone put it in a drive to see whether it was dead, without cleaning it first. Cared-for floppy discs should in theory be readable for another 40 to 50 years. Yes, that means floppies from 1967 should still be readable for at least another 40 to 50 years.

Cared-for floppy discs appear to have roughly 70-80 year lifespans with their originally designed controllers, so floppies turn out to be quite good archival media if handled correctly.

Yes, people have stories of floppy and floppy-drive failures, but what they did not understand is that, by a massive margin, the problem is contamination by dirt, grime, and so on more than anything else. That old floppy you find at the bottom of a drawer most likely still holds good data but is horribly dirty, and attempting to read it without cleaning can destroy the drive head and leave the data unrecoverable because of a destroyed surface.

Floppy discs with correct handling having a 99% successful read rate, without needing more than a Greaseweazle/FluxEngine, is nothing strange, and it is going to stay that way for quite a while yet. Floppies seeing roughly a 70 percent destruction rate, with 5 percent of drive heads destroyed, under incorrect handling is also nothing strange. The floppy is the great double-edged sword where "more haste, less speed" really applies, except here it is more haste, no data.



  • ssokolow
    replied
    Originally posted by oiaohm View Post
ssokolow, the 15-dollar USB floppy controllers are not designed for reading old discs. I will grant they give you very decent odds, but Greaseweazle, FluxEngine, and devices like them are in fact designed for the job. The reality is that a Greaseweazle or FluxEngine with a good drive will be able to read discs that 15-dollar USB drives will not read at all.
    Well, the first one I bought read 95% of my 3.5" 720K/1.44MB MFM floppies, the second one read all the remaining ones that didn't have true bad sectors or binder failure but had trouble with many of the ones the first one read fine. When I finally saved up for a KryoFlux, it archived my one copy-protected 3.5" game just fine, though I haven't managed to get a working dump of some of my 400/800K Zoned-CAV GCR Mac floppies yet.

I haven't had a chance to get familiar with the Greaseweazle I bought yet and I can't build a FluxEngine until more of the requisite FPGA dev boards are available (and I need to set an eBay watch for a third 5.25" drive at an affordable price), though I am in the process of assembling three housings for them. (Yeah, I'm a bit of a collector when it comes to dumping tools. I also own two cartridge dumpers, neither of which is a Sanni because I haven't had time to get comfortable working with surface-mount components.)
    Last edited by ssokolow; 11 September 2022, 10:35 AM.



  • oiaohm
    replied
    Originally posted by coder View Post
    Heh, that reminds me that I still have a few books with floppy disks that are surely now unreadable.
This is a matter of storage and effort. If you have a clean drive with the head in decent condition, the disc itself is still clean, and you have FluxEngine or Greaseweazle hardware and software, the odds are that it is fully readable.

Old floppy drives are very analog; FluxEngine and Greaseweazle are basically very advanced floppy controllers.

Yes, as ssokolow said, there are minor differences between drives. Devices like FluxEngine and Greaseweazle overcome those differences as long as the drive is still properly clean and functional. These more advanced controllers can read floppies that would be 100 percent unreadable in a normal drive with a normal controller.

The biggest problem with reading floppies is whether they are clean and have been stored clean. With the correct hardware, the biggest reason you don't get data back off a physical floppy is physical damage from dirt contamination or a botched cleaning. To clean a floppy completely you have to carefully disassemble the disc in a clean room and clean each part; the grime layers that collect inside a floppy are a double-sided issue.

ssokolow, the 15-dollar USB floppy controllers are not designed for reading old discs. I will grant they give you very decent odds, but Greaseweazle, FluxEngine, and devices like them are in fact designed for the job. The reality is that a Greaseweazle or FluxEngine with a good drive will be able to read discs that 15-dollar USB drives will not read at all.

coder, this is a question of how much effort you are willing to put in. A lot of discs people think are unreadable are still perfectly readable with the right tools.

Also, I have not yet got into damaged discs. People have managed to recover data from a lot of the damaged floppy discs that came with books, mostly by collecting multiple copies of the same disc from different people. Using devices like Greaseweazle and FluxEngine, you read all the discs into images, then build a merged image from whatever values match across copies. Old-school mass production of floppy images for books was a cloning process, so the data was put on every disc the same way, which is why this works. Again, this is effort. Mass-produced physical media turns out to be very good archival media even if the medium itself is not that great; the problem is that it is becoming less and less cost-effective to produce physical media in volume.
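The "whatever values match" step can be sketched as a byte-wise majority vote across dumps. This is a simplification (real recovery work votes at the sector or flux level and tracks which positions still disagree), but it shows why three or more copies of a cloned disc beat one:

```python
from collections import Counter

def majority_merge(images):
    """Byte-wise majority vote across several dumps of the same disc.

    Works because mass-duplicated discs were cloned from one master,
    so undamaged bytes agree across copies and read errors rarely hit
    the same byte in every dump.  Simplified sketch, not a real
    recovery tool.
    """
    assert len({len(img) for img in images}) == 1, "dumps must be same size"
    merged = bytearray()
    for column in zip(*images):
        # Most common byte at this offset wins; ties resolve arbitrarily.
        value, _count = Counter(column).most_common(1)[0]
        merged.append(value)
    return bytes(merged)
```

With three dumps that each carry independent single-byte damage, the vote reconstructs the master; with a single copy there is nothing to vote on, which is exactly the one-off-production problem.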

One-off productions are not ideal for long-term data preservation, because the data is laid down differently between copies, which makes the take-many-copies-and-solve-for-the-original process harder or impossible. "Harder" is when you have two sets of the same Debian version from two different production vendors or disc types and you are trying to recover the contents of a particular .deb file; that is possible. "Impossible" is when you only have one copy in total.



  • ssokolow
    replied
    Originally posted by coder View Post
    Wow. The only time I ever did anything like that was to copy & give a CD-ROM from a book to someone who maintains an online archive of them.

    Heh, that reminds me that I still have a few books with floppy disks that are surely now unreadable.
    You'd be surprised. Whether they have bad sectors or have failed is a matter of original manufacturing quality and how they were stored, but floppies are, in general, surprisingly resilient.

The vast majority of my 3.5" floppies still read perfectly in a simple $15 USB floppy drive (though, if you want to try that, I suggest getting two, since they're more sensitive to minor calibration differences) and, once I got myself a USB floppy controller capable of using original drives and a 5.25" drive, I found that, while they were more variable (likely because of how they had been stored), my childhood 5.25" disks were also often perfectly readable.

(I say "how they'd been stored" because I had to rescue a bunch of them from being art supplies that had literally been sitting stacked under a magnetic screw tray. As the Mythbusters found, if you just place the magnet rather than swiping it, it's much less effective.)



  • ssokolow
    replied
    Originally posted by oiaohm View Post
I am not saying you should attempt to replace Walnut Creek; Walnut Creek is only an example of a fundamental problem. There was a time frame when you bought software and it came on a pressed CD with quite a long lifespan, and more press items shipped with pressed CDs. Yes, people like you made their own CD-Rs, and these have a shorter lifespan than a pressed CD/DVD but generally a longer lifespan than a hard drive or USB drive.

    [...]
    Ahh. I can fully agree with you there.

    Originally posted by oiaohm View Post
Asking Flathub to keep software for 10 to 20 years provides no protection if something happens to Flathub. The reality is that personal archives are quite important; they reduce the damage from any single-instance failure.
My perspective on Flathub is purely a matter of "For maximum preservation, everyone should be equally responsible for keeping functioning archives of software they touch". That way, there are multiple points that have to fail for all copies to be lost.

    Originally posted by oiaohm View Post
The nightmare problem here is that parties selling software in a lot of ways have an interest in keeping software lifespans limited.
    Thankfully, we're starting to see glimmers of a solution, with an EU ruling already having come out that said "Right to Repair trumps the license and DRM if you've been promised a feature that's missing or broken."



  • coder
    replied
    Originally posted by ssokolow View Post
    I go through my old CD-Rs and upload otherwise potentially lost freeware/shareware to the Internet Archive as my time permits.
    Wow. The only time I ever did anything like that was to copy & give a CD-ROM from a book to someone who maintains an online archive of them.

    Heh, that reminds me that I still have a few books with floppy disks that are surely now unreadable.



  • oiaohm
    replied
    Originally posted by ssokolow View Post
    "The best of a bunch of bad options" is still "the best".
Yes, it might be the best, but SDL 2 shows there could be better.

    Originally posted by ssokolow View Post
I'm honestly not sure what you're even arguing anymore. I wasn't even ten years old when the Walnut Creek CDs in question were being pressed and I don't have the time or the resources to become a new Walnut Creek, so clearly you aren't arguing that it's my responsibility when my point was that we have a responsibility to future "8 years old when that was being done" people.

    I go through my old CD-Rs and upload otherwise potentially lost freeware/shareware to the Internet Archive as my time permits.
I am not saying you should attempt to replace Walnut Creek; Walnut Creek is only an example of a fundamental problem. There was a time frame when you bought software and it came on a pressed CD with quite a long lifespan, and more press items shipped with pressed CDs. Yes, people like you made their own CD-Rs, and these have a shorter lifespan than a pressed CD/DVD but generally a longer lifespan than a hard drive or USB drive.

Think about it, ssokolow: a lot of new computers are now made without CD-ROM/DVD/Blu-ray drives, and we don't have ROM forms of USB drives on the market.

Eight years into the future, for all we know, something could happen in that time to remove the Internet Archive from access.

A repository made with flatpak create-usb can be placed on read-only media after it has been created.

A lot of people would not have thought, when they were buying software on CD/DVD/Blu-ray or burning their own, that they were part of an archiving system. That distributed archiving system is why you and others are able to go back through your own personal collections and put stuff on the Internet Archive that is long gone otherwise.

https://www.debian.org/CD/vendors/ Even for something as big as Debian, we are seeing the vendors of the complete archive on long-term media disappear. A lot of people don't want to pay 300+ dollars for a full copy of Debian on disc.

ssokolow, my problem here is that people are not paying for archiving, and I don't know how to fix that. It is simple to think we will push the problem off onto Flathub or the like, but they have to be able to pay for the archiving process, and it also puts all the eggs in one basket: if something goes wrong there, there are going to be problems.

Pure online delivery of software has a fundamental long-term problem. Gamers have been seeing games they paid for become useless because the game servers provided by the maker were shut down. When I was growing up, if you could not run the game server locally, you did not buy the game.

Flatpak is designed to be archivable, particularly in that you can make install media from local or remote repo instances; Flatpak is different from RPM, DEB, or NixOS in this regard. In the Flatpak design, wherever the application is installed is basically a copy of the install media, you can make new install media from it, and that install media is valid read-only. What is missing is routine snapshotting and end-user archiving, so that the process is not as exposed to single-point failure.
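The "copy of install media" property works because a Flatpak repo is an OSTree content-addressed object store: every object's file name is derived from a checksum of what it holds, so an archived copy can be verified offline. A much-simplified model of that idea (real OSTree checksums also cover serialized metadata, and the ab/cdef... sharding below is just the common layout convention, so this is an illustration, not an OSTree verifier):

```python
import hashlib
import os

def verify_object_store(root):
    """Return paths whose contents no longer match their hash-derived name.

    Models a content-addressed store laid out as ab/cdef....ext, where
    the directory name plus the file's stem is the full SHA-256 of the
    file's bytes.  Real OSTree objects checksum serialized metadata as
    well, so treat this as a sketch of the self-verifying property.
    """
    bad = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            # Rebuild the expected hash from the sharded path.
            expected = os.path.basename(dirpath) + name.split(".")[0]
            with open(os.path.join(dirpath, name), "rb") as fh:
                actual = hashlib.sha256(fh.read()).hexdigest()
            if actual != expected:
                bad.append(os.path.join(dirpath, name))
    return bad
```

This is why a repo burnt to read-only media stays trustworthy: any bit rot shows up as a hash mismatch rather than silently corrupting an install.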

The reality is we could be heading into the equivalent of a digital dark age, where people 10 to 20 years in the future cannot use a lot of software from the past because no functional archive copies exist.

ssokolow, I don't disagree that we have a responsibility to the future. The problem is that to carry out that responsibility there needs to be money so people can invest their time to actually do it, and there need to be legal systems that allow it. With all forms of archiving, you need to avoid the hit-by-a-bus problem if it is going to last long-term. Items like the Walnut Creek CDs make looking back in history possible. The problem is that we are heading into a future without equal replacements for Walnut Creek, Simtel, and the rest, and there is no longer a market to fund their existence. Large Linux distribution releases on DVD and Blu-ray are disappearing as well, for the same reason: lack of a market.

ssokolow, it is good that you are doing what you are doing with the Internet Archive, but you need to remember what made it possible. As you said, you are going through your CD-Rs and uploading them; others are doing the same. What you are drawing on is a personal archive of software. People having personal archives of software is becoming rarer: a good share of those archives were created naturally in the past, when you bought software, or hardware with installer discs, and that is not happening any more. People now ask about the terms of places like Flathub while completely ignoring that they no longer have a personal software archive, and some upload to the Internet Archive and then get rid of their own personal archive.

The loss of personal software archives means that if the major online software archives go away for some reason, the means to rebuild them could be gone as well.

Asking Flathub to keep software for 10 to 20 years provides no protection if something happens to Flathub. The reality is that personal archives are quite important; they reduce the damage from any single-instance failure.

This is a long post, but I don't have any good solution to the problem. I can see it coming, but I have no solution. And it gets worse: think how much software was online-only, where companies have turned off the servers and the software is no more.

The nightmare problem here is that parties selling software in a lot of ways have an interest in keeping software lifespans limited.



  • ssokolow
    replied
    Originally posted by oiaohm View Post
Static linking does not 100 percent fix that problem. Even with musl libc there are still cases where you can get caught out by kernel version differences.
    "The best of a bunch of bad options" is still "the best".

    Originally posted by oiaohm View Post
"flatpak create-usb" will create that from the already-downloaded repositories. One of the reasons an installed application cannot modify its own installed files is so that the local flatpak repos are basically backups.

Do note those shovelware CD-ROMs were not being made by the source locations. The Flatpak design fully supports someone deciding to make disc archives; flatpak create-usb could target USB, DVD, or anything else that is not internet-dependent.

The problem now is who is going to pay for the archiving. You used to pay for Walnut Creek CDs, and that paid for the archiving. The problem here is not Flathub as such but the reality that we have lost the market that let groups like Walnut Creek make a profit while producing items that would be long-term archives.

The Flatpak design is archiving-compatible, far more so than an RPM or DEB. The problem is not whether Flathub will keep stuff for 10+ years; no internet service can be 100 percent sure it will exist in 10 years' time. But if the contents of the service are correctly archived and distributed, you can be sure they will still be there in 10 years.

Flatpak is closer to ideal than items like Snap, but I am not going to say it is perfect when it is not. How to make archiving applications profitable and actually performed is still an open problem with Flatpak. Of course, this is better than the Snap situation.
I'm honestly not sure what you're even arguing anymore. I wasn't even ten years old when the Walnut Creek CDs in question were being pressed and I don't have the time or the resources to become a new Walnut Creek, so clearly you aren't arguing that it's my responsibility when my point was that we have a responsibility to future "8 years old when that was being done" people.

    I go through my old CD-Rs and upload otherwise potentially lost freeware/shareware to the Internet Archive as my time permits.



  • oiaohm
    replied
    Originally posted by ssokolow View Post
    The difference is that I'm neither distributing these things as closed-source software nor linking against high-level APIs like audio or X11. I have the source handy and, if I distribute them, it's as source.

This is just me giving the middle finger to "ERROR: Go wait for that to recompile and then SSH it over again. You didn't think you'd ever be running a Debian stable machine, so you compiled that on your daily driver Kubuntu machine with a newer glibc."

Static linking does not 100 percent fix that problem. Even with musl libc there are still cases where you can get caught out by kernel version differences.
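To make the kernel-difference point concrete: even a fully static musl binary still crosses the syscall boundary, so robust programs gate newer kernel interfaces at runtime and keep a fallback path. A toy sketch of such a gate (illustrative only; real code would probe the syscall itself, e.g. by checking for ENOSYS, rather than parse version strings):

```python
import os

def kernel_at_least(major, minor):
    """True if the running kernel reports at least version major.minor.

    A program can carry both an old and a new code path and pick one at
    runtime; parsing uname's release string is the simplest form of
    that gate, shown here purely for illustration.
    """
    release = os.uname().release            # e.g. "6.1.0-13-amd64"
    fields = release.split(".")
    got = (int(fields[0]), int(fields[1].split("-")[0]))
    return got >= (major, minor)

# e.g. pick the modern interface only when available (names hypothetical):
# result = fast_path() if kernel_at_least(5, 1) else fallback_path()
```

The same idea is what SDL 2's dynapi buys at the library level: the interface actually used can be swapped at run time instead of being frozen into the binary.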

    Originally posted by ssokolow View Post
    That assumes someone else has backed it up. I'm coming at this from the POV of someone who's having to dig through the Wayback Machine and uploads of Walnut Creek "stuff scraped off BBSes" shovelware CD-ROMs for freeware/public domain libraries and reference materials for my DOS programming hobby.
"flatpak create-usb" will create that from the already-downloaded repositories. One of the reasons an installed application cannot modify its own installed files is so that the local flatpak repos are basically backups.

Do note those shovelware CD-ROMs were not being made by the source locations. The Flatpak design fully supports someone deciding to make disc archives; flatpak create-usb could target USB, DVD, or anything else that is not internet-dependent.

The problem now is who is going to pay for the archiving. You used to pay for Walnut Creek CDs, and that paid for the archiving. The problem here is not Flathub as such but the reality that we have lost the market that let groups like Walnut Creek make a profit while producing items that would be long-term archives.

The Flatpak design is archiving-compatible, far more so than an RPM or DEB. The problem is not whether Flathub will keep stuff for 10+ years; no internet service can be 100 percent sure it will exist in 10 years' time. But if the contents of the service are correctly archived and distributed, you can be sure they will still be there in 10 years.

Flatpak is closer to ideal than items like Snap, but I am not going to say it is perfect when it is not. How to make archiving applications profitable and actually performed is still an open problem with Flatpak. Of course, this is better than the Snap situation.



  • ssokolow
    replied
    Originally posted by oiaohm View Post
What you are doing with -musl static linking is wrong if you know the history. https://sdl-mirror.readthedocs.io/en...ME-dynapi.html Something like SDL 2.0's dynapi is what you should want when you statically link things in. As I noted before, the Linux kernel does at times remove very old syscalls, which means you need the ability to replace the libraries that directly interface with the kernel if that interface changes at some point in the future. Programs that use dynamic linking have ways to update those libraries if required, as do programs with a solution like SDL's dynapi. In a lot of ways, old-school static linking needs to go away for core interfacing libraries, replaced with something like SDL's dynapi. The SDL 1.2 compatibility layer on SDL 2.0 is only really useful because SDL 1.2 applications are dynamically linked 99% of the time to avoid LGPL issues, and that compatibility layer is now needed because a lot of old closed-source SDL 1.2 binaries no longer work due to system changes.
    The difference is that I'm neither distributing these things as closed-source software nor linking against high-level APIs like audio or X11. I have the source handy and, if I distribute them, it's as source.

This is just me giving the middle finger to "ERROR: Go wait for that to recompile and then SSH it over again. You didn't think you'd ever be running a Debian stable machine, so you compiled that on your daily driver Kubuntu machine with a newer glibc."

    Originally posted by oiaohm View Post
    From the flatpak run command.

The reality of Flatpak is that by default it will run the application with whatever runtime is defined in the application's metadata, including out-of-date runtimes. Yes, Flatpak will warn you that you have installed out-of-date runtimes, and of course it includes a method to let the user force runtimes.

Flathub themselves don't need to keep unmaintained stuff around forever. When hosting applications in your own Flathub-style repository, you should also host the matching runtimes. If you have the same runtime in two different repositories, Flatpak will ask the user which one to install.

https://blogs.gnome.org/mclasen/2018...installations/ Note the flatpak --verbose create-usb here: all the parts for the application are bundled up.

The big difference between Snap and Flatpak is the means to self-host in Flatpak, so you are not locked into whatever Flathub's policies happen to be in the future. This is also the thing to remember: Flathub could promise 10-20 years of support, and then in a few years' time something goes wrong and they cease to operate; then what? At least with Flathub, other parties can spin up replacements, or you can reinstall from install media/servers you made yourself. If the Snap Store goes away, what are you going to do?
    That assumes someone else has backed it up. I'm coming at this from the POV of someone who's having to dig through the Wayback Machine and uploads of Walnut Creek "stuff scraped off BBSes" shovelware CD-ROMs for freeware/public domain libraries and reference materials for my DOS programming hobby.

