Ubuntu 19.10 To Drop 32-bit x86 Packages

  • Originally posted by L_A_G View Post
    Most software serves a practical purpose and outside of legacy software it, and the other software that it relies on, changes over time. A format like MP3 on the other hand is standardized and set in stone before it even starts seeing widespread adoption meaning that as long as you haven't screwed up somewhere there's no additional cost to continued support.
    What's the difference between a library that decodes mp3 and a bunch of libraries for 32-bit support that you have anyway for 64-bit and just need to rebuild?

    The mp3 library has extra code.

    Originally posted by L_A_G View Post
    Yes, they run on today's hardware because of a very large duplication of effort where loads of libraries and supporting software is made available as both i386 and x86_64 binaries. My point is that outside of niche applications like legacy software that can be containerized in some form or run in VMs this simply a waste of limited developer resources.
    I don't give a fuck what you consider niche. I don't use mp3 because it's inferior so is that niche as well?

    Originally posted by L_A_G View Post
    Like CD+cassette players and VHS+DVD players this is all ultimately unnecessary bulk and maintenance work for very limited benefit when VMs and containerization solves the issue of legacy software and the developers getting off their backsides solves the issue for software that's still being developed. Once legacy software has been containerized/VM'd it's no longer going to require any additional maintenance work and neither is in-development software after it's fully x86_64.
    And if you need mp3 support then just use a VM with an old OS that has mp3 decoding libraries. Same logic.

    Originally posted by L_A_G View Post
    Not only is the amount of additional hardware and software required to play CDs on a blu-ray player negligible
    Again your opinion.

    Originally posted by L_A_G View Post
    the standards for the discs are set in stone.
    32-bit x86 ISA is set in stone and has been for a very long time. It does evolve but it's backwards compatible so it won't break existing apps (you don't have to use the new features).

    Originally posted by L_A_G View Post
    Supporting CDs, which have followed the exact same set of standards since the standards were written in the 1980s, only requires you to put in the effort once and then support's just going to be there in perpetuity.
    You're joking right? You literally need to change the laser to read a CD on a Blu-Ray player.

    Originally posted by L_A_G View Post
    With i386 software on the other hand the libraries that non-legacy software are going to be using need to be maintained meaning that unlike supporting CDs in blu-ray players there is a continuing cost of doing so.
    Yeah like I said, retarded. You maintain those libraries anyway because they fucking exist for 64-bit already.

    Originally posted by L_A_G View Post
    The fact that it hasn't been made obsolete
    LMFAO.

    We have ogg vorbis, opus, AAC, and you think mp3 is not obsolete in terms of technical prowess.

    Yeah, talking to a wall here.

    Originally posted by L_A_G View Post
    and that pretty much everything under the sun supports it perfectly with barely any maintenance burden whatsoever?
    Same with 32-bit? Your imaginary maintenance burden is the exact same for mp3. In fact it's worse, you need extra code for it. For 32-bit libraries, it's usually the same library that has a 64-bit version so you just need to recompile.



    • Originally posted by Weasel View Post
      What's the difference between a library that decodes mp3 and a bunch of libraries for 32-bit support that you have anyway for 64-bit and just need to rebuild?
      If you think all you need is just to rebuild and that's it then you've obviously never worked with any code that's complex or subject to quality standards (i.e. you need to ensure that every build works properly). That's also before the additional space wasted on having multiple binaries of the same code.

      The mp3 library has extra code.
      Yes, but not something that's constantly changing as it's implementing a specification that was standardized back in the 1990s. How many times do I have to point out that maintaining additional builds of the same complex and ever changing codebase is not free?

      I don't give a fuck what you consider niche. I don't use mp3 because it's inferior so is that niche as well?
      So how much software do you use that's no longer maintained? Because I'm pretty sure that most software that's used, particularly in the open source world, is software that is being actively maintained.

      And if you need mp3 support then just use a VM with an old OS that has mp3 decoding libraries. Same logic.
      Unlike 32 bit builds of various libraries and utilities that are still in active development, any halfway decent implementation of an MP3 decoder has zero maintenance cost and won't take up more than maybe a dozen KB of disk space. So unlike unnecessary 32 bit builds of libraries, removing it makes zero sense.

      Again your opinion.
      Yet you still act like your opinion is an argument... Hypocrisy much?

      32-bit x86 ISA is set in stone and has been for a very long time. It does evolve but it's backwards compatible so it won't break existing apps (you don't have to use the new features).
      The ISA may not be changing, but software does and thus there is a maintenance cost for every build you're maintaining. As I said, the MP3 spec was set in stone back in the 90s so it's obvious that any halfway decent implementation will have been honed to the point where there is zero reason to touch it and hence any maintenance cost will be negligible.

      You're joking right? You literally need to change the laser to read a CD on a Blu-Ray player.
      Yes, a secondary red laser that is put in at manufacture time and then left there with no need to touch it for the lifetime of the player. Lasers in the early to mid 90s may not have been completely reliable, but manufacturing methods have advanced to the point that the laser is going to outlast almost every single moving part, none of which are going to be just for red laser use.

      Yeah like I said, retarded. You maintain those libraries anyway because they fucking exist for 64-bit already.
      Right... So you don't understand the concept of maintaining multiple builds of the same software yet you feel the need to throw around insults like a 12-year-old. Must be nice living in your mother's basement.

      LMFAO.

      We have ogg vorbis, opus, AAC, and you think mp3 is not obsolete in terms of technical prowess.
      The thing about rendering things obsolete is that you first need to create something substantially better that people then abandon the old thing for. LaserDisc was a significant step up from VHS in many regards, but you can't exactly say it ever made VHS obsolete. No, it just never expanded beyond a marginal enthusiast format and was made obsolete by DVD together with VHS.

      Yeah, talking to a wall here.
      Well look who's talking...

      Same with 32-bit? Your imaginary maintenance burden is the exact same for mp3. In fact it's worse, you need extra code for it. For 32-bit libraries, it's usually the same library that has a 64-bit version so you just need to recompile.
      As I keep saying, if a codebase doesn't change there's no reason to run any QA tests on it. If your code is as complex as most system libraries and utilities are in a modern OS you absolutely do need to run proper QA on every single build you're going to be distributing.

      We're not talking about some hobby projects here. We're talking about code that's used in production and upon which a lot of important software that people's jobs rely on is built. As such every build of the software needs to go through proper QA and you can't just trust it'll work perfectly on every ISA it's built for if it works on one of them.

      This is not amateur hour, this is serious code used in production by paying customers. The standards you apply to your little hobby projects drawing ASCII penises in the terminal simply do not apply.
      "Why should I want to make anything up? Life's bad enough as it is without wanting to invent any more of it."



      • Originally posted by L_A_G View Post
        Yes, but not something that's constantly changing as it's implementing a specification that was standardized back in the 1990s. How many times do I have to point out that maintaining additional builds of the same complex and ever changing codebase is not free?
        If the mp3 library is not changing then why are the other libraries changing? There's nothing to maintain if they don't change.

        Originally posted by L_A_G View Post
        So how much software do you use that's no longer maintained?
        A lot. Maybe half. Because it just works. Half of it isn't even open source to begin with (though it's likely free).

        Originally posted by L_A_G View Post
        The ISA may not be changing, but software does and thus there is a maintenance cost for every build you're maintaining. As I said, the MP3 spec was set in stone back in the 90s so it's obvious that any halfway decent implementation will have been honed to the point where there is zero reason to touch it and hence any maintenance cost will be negligible.
        You obviously have no idea what you're talking about and how many different mp3 implementations there are, so I won't bother. And I'm not even talking about superficial differences, especially for encoders. The spec has nothing to do with implementation.

        Originally posted by L_A_G View Post
        Yes, a secondary red laser that is put in at manufacture time and then left there with no need to touch it for the lifetime of the player. Lasers in the early to mid 90s may not have been completely reliable, but manufacturing methods have advanced to the point that the laser is going to outlast almost every single moving part, none of which are going to be just for red laser use.
        WTF is wrong with you? You think a physical piece doesn't need testing? Maintenance costs? But an automated build of a library has more costs? Are you trolling now?

        Originally posted by L_A_G View Post
        Right... So you don't understand the concept of maintaining multiple builds
        Yeah very much maintenance especially with automated builds compared to physical hardware like a laser.

        This is really pointless. No matter how sane analogies are presented, all you bring up is your superficial "maintenance cost". In fact, you can just use that phrase to counter anything at this point, it doesn't even matter how idiotic it is.

        Originally posted by L_A_G View Post
        As I keep saying, if a codebase doesn't change there's no reason to run any QA tests on it. If your code is as complex as most system libraries and utilities are in a modern OS you absolutely do need to run proper QA on every single build you're going to be distributing.

        We're not talking about some hobby projects here. We're talking about code that's used in production and upon which a lot of important software that people's jobs rely on is built. As such every build of the software needs to go through proper QA and you can't just trust it'll work perfectly on every ISA it's built for if it works on one of them.

        This is not amateur hour, this is serious code used in production by paying customers. The standards you apply to your little hobby projects drawing ASCII penises in the terminal simply do not apply.
        Because totally rendering stuff unable to run is better than having it crash due to lack of QA. Right. 10 IQ Logic.



        • Originally posted by Weasel View Post
          If the mp3 library is not changing then why are the other libraries changing? There's nothing to maintain if they don't change.
          We're not talking about abandonware libraries here, we're talking about libraries that are obviously in both active development and active use by software that's also in active development. Meaning that there's going to be bugs discovered, optimizations and general improvements to be made. MP3 decoding, on the other hand, is such old hat, performing a completely standardized task, that there's really no reason to re-implement it or even touch existing implementations, as those have reached a point where there's no point in trying to improve them anymore.

          A lot. Maybe half. Because it just works. Half of it isn't even open source to begin with (though it's likely free).
          Well then you're obviously an outlier and even at that you only need to containerize it or set up an i386 VM once as it's obviously not going to change.

          You obviously have no idea what you're talking about and how many different mp3 implementations there are, so I won't bother. And I'm not even talking about superficial differences, especially for encoders. The spec has nothing to do with implementation.
          Missing the point again? When you have a spec set in stone there is zero reason to change a working and well made implementation. The fact that there are many different implementations neither escapes me nor counters my argument in any way as any decently put together implementation will have gotten to the point of no longer needing to be touched.

          When you have a spec that's been unchanged for over a quarter of a century you're obviously going to have gotten any eagerness to experiment out of your system and arrived at a good working implementation that just doesn't need to be touched anymore. Hence any halfway decent MP3 decoder will have long since arrived at the point where there's just not going to be any maintenance cost to it.

          WTF is wrong with you? You think a physical piece doesn't need testing? Maintenance costs? But an automated build of a library has more costs? Are you trolling now?
          A physical piece with no moving parts needs to be tested once and if it's as bomb proof as a modern laser diode that one test is really all you're ever going to need. By the time that laser needs to be replaced it will be long past the end of the usable life of the player itself due to wear and tear on the moving parts, none of which will be solely used by functionality related to the red laser.

          Yeah very much maintenance especially with automated builds compared to physical hardware like a laser.
          If you think a build is never going to break or that the attitude of "If it builds, it's fine" is acceptable outside of personal hobby projects and academic settings then you've obviously never worked on complicated software or software that's meant to be used in production environments. In an environment like that you simply cannot trust that nothing is going to go wrong like you're doing.

          This is really pointless. No matter how sane analogies are presented, all you bring up is your superficial "maintenance cost". In fact, you can just use that phrase to counter anything at this point, it doesn't even matter how idiotic it is.
          The fact that you trust that a build is never going to break and that if it builds, it's fine, just proves you have no idea of what maintenance for production quality software actually entails.

          Because totally rendering stuff unable to run is better than having it crash due to lack of QA. Right. 10 IQ Logic.
          Rendering it unusable? Maybe if you're computer illiterate and don't know how to containerize it or run it in a VM. Seeing how you've demonstrated that you've got no idea of how production quality software development actually works I probably shouldn't put it past you to not know how to do either of these things.



          • Originally posted by L_A_G View Post
            We're not talking about abandonware libraries here, we're talking about libraries that are obviously in both active development and active use by software that's also in active development. Meaning that there's going to be bugs discovered, optimizations and general improvements to be made. MP3 decoding, on the other hand, is such old hat, performing a completely standardized task, that there's really no reason to re-implement it or even touch existing implementations, as those have reached a point where there's no point in trying to improve them anymore.
            You can always improve the performance by using new CPU features/instructions, or have better quality encodings (not talking just about decoding). Everything you listed "bugs discovered, optimizations and general improvements to be made" apply to the library as a whole, which applies to it anyway because you build it for 64-bit.

            Originally posted by L_A_G View Post
            Missing the point again? When you have a spec set in stone there is zero reason to change a working and well made implementation. The fact that there are many different implementations neither escapes me nor counters my argument in any way as any decently put together implementation will have gotten to the point of no longer needing to be touched.

            When you have a spec that's been unchanged for over a quarter of a century you're obviously going to have gotten any eagerness to experiment out of your system and arrived at a good working implementation that just doesn't need to be touched anymore. Hence any halfway decent MP3 decoder will have long since arrived at the point where there's just not going to be any maintenance cost to it.
            ENCODER broski. And even for decoder you can do performance/energy improvements using new CPU features or simply better algorithms.

            Originally posted by L_A_G View Post
            If you think a build is never going to break or that the attitude of "If it builds, it's fine" is acceptable outside of personal hobby projects and academic settings then you've obviously never worked on complicated software or software that's meant to be used in production environments. In an environment like that you simply cannot trust that nothing is going to go wrong like you're doing.
            So what's worse: unable to run the software at all due to idiotic decision, or having it "break" because untested?

            I go with the former. At least there's a (high) chance that it will work with the latter.

            Originally posted by L_A_G View Post
            Rendering it unusable? Maybe if you're computer illiterate and don't know how to containerize it or run it in a VM. Seeing how you've demonstrated that you've got no idea of how production quality software development actually works I probably shouldn't put it past you to not know how to do either of these things.
            Generic buzzwords, here have a cookie, superb argument, could say the same thing to you.

            I don't want a container, because these apps could also benefit from improvements to the libraries (like Wine does). I also want full desktop integration, not a single hurdle, meaning when I open a file, I don't want a single trace of the "container" or "portal" or whatever bullshit. Period.

            I mean, it does run that way today... why would I want a degraded user experience?



            • Originally posted by Weasel View Post
              You can always improve the performance by using new CPU features/instructions, or have better quality encodings (not talking just about decoding). Everything you listed "bugs discovered, optimizations and general improvements to be made" apply to the library as a whole, which applies to it anyway because you build it for 64-bit.
              As I said, this long in there's just not going to be anything important left to do with a quarter century old specification. Even if you start putting in improvements for new CPU features, building for i386 will necessitate a whole bunch of extra work with alternative code paths and ensuring that those code paths are always followed correctly.

              ENCODER broski. And even for decoder you can do performance/energy improvements using new CPU features or simply better algorithms.
              It doesn't really matter if we're talking about encoders or decoders. Both implement the same specification that's been unchanged for a quarter century and even if you just try to add support for new instructions additional builds for ISAs like i386 which lack those instructions will still cause additional headaches.

              So what's worse: unable to run the software at all due to idiotic decision, or having it "break" because untested?
              As I said, you're still going to be able to run your obsolete software using methods like containerization or VMs. If you can't containerize it or get a VM going then your issue is one of PEBKAC and not something Canonical is guilty of.

              I go with the former. At least there's a (high) chance that it will work with the latter.
              Software relying on i386 builds of libs and utilities still works because of a significant effort to keep it working. Once this effort ceases those builds need to be removed, as bugs will obviously go uncaught and end up lingering, which is not acceptable in software that's sold commercially.

              Generic buzzwords, here have a cookie, superb argument, could say the same thing to you.
              If basic QA and "production" software are just buzzwords for you it's fairly obvious you've never done software development professionally. As such you really shouldn't go around acting like you know anything about producing commercial software used by professionals.

              I don't want a container, because these apps could also benefit from improvements to the libraries (like Wine does). I also want full desktop integration, not a single hurdle, meaning when I open a file, I don't want a single trace of the "container" or "portal" or whatever bullshit. Period.
              The fact that the i386 binaries won't be provided by the OS doesn't mean that builds of newer versions aren't going to become available elsewhere. With Wine the developers have said that they'll be providing their own i386 binaries and those will obviously be updated as newer versions are released. It'll probably even help them make a better experience as they can then count on everyone having the exact same version of the libraries rather than the absolute cacophony of versions provided by different distros.

              I mean, it does run that way today... why would I want a degraded user experience?
              Maybe you're not the only person in the world? Maybe those resources are actually better spent on other things? Maybe you're just entitled?



              • Y'all are still going at it?

                At least me and starshipeleven shut up about it after a day.

                Arguing is fine, but, damn, y'all's gettin' a bit ridiculous at this point.



                • Originally posted by skeevy420 View Post
                  Y'all are still going at it?

                  At least me and starshipeleven shut up about it after a day.

                  Arguing is fine, but, damn, y'all's gettin' a bit ridiculous at this point.




                  • This is hilarious, and the smug responses are even better!

                    1. Seriously stop using Ubuntu like yesterday, they are a bunch of woke elitists who have contributed nearly nothing intentionally.

                    2. This is the real open source community, a bunch of toxic self-hating totalitarians. I regret spending so many working years in this cesspool.

                    I don't much care for the world open source built, where users are more disempowered than ever, and their software is written by people who hate them.


                    Last edited by techzilla; 02 April 2020, 01:13 AM.



                    • Originally posted by techzilla View Post
                      I don't much care for the world open source built, where users are more disempowered than ever, and their software is written by people who hate them.
                      This makes me wonder - were you born and old enough to be using a computer before we got open source?
                      Last edited by zyxxel; 11 April 2020, 06:18 AM. Reason: duplicated word

