Ubuntu 19.10 To Drop 32-bit x86 Packages


  • Originally posted by Weasel View Post
    It's simple. No i386 libraries = 32-bit apps won't load.
    And as long as you have a source of those libraries for the application, what Ubuntu has done does not matter. You snap or flatpak the application you need, and it works again.



    • Originally posted by xfcemint View Post
      So, running win16 on Wine is a major use case there, but we should drop win32 on Wine now? Aren't you going against your own position?

      What you said just contributes to the position that backward compatibility is important, therefore Ubuntu shouldn't drop i386 libs.
      Do not put words in my mouth. Win16 is a major use case for Wine. It is not a major use case for Ubuntu or generic Linux users.

      Originally posted by xfcemint View Post
      You are miscalculating badly. Two reasons:

      1. Monthly active users on Steam are 90 million. That means many more have tried Steam and given up, or they don't play games at the given moment. So, the actual number of Steam users is much higher (many times higher) than the mentioned 90 million monthly active users. Therefore, it is likely that Steam usage on Ubuntu exceeds 5%, but we don't know the exact percentage.
      As I wrote, Steam accounts are at 1 billion now. But most of them are inactive or one of several accounts that belong to the same person.

      If people have given up, then Steam is no longer a deciding use case for them.
      Originally posted by xfcemint View Post
      2. You mentioned, in the original post, "32-bit proprietary applications on a desktop". That suddenly got turned into "Steam". Steam does not distribute the majority of "32-bit proprietary applications on a desktop".
      There may be other users of proprietary 32-bit x86 software, sure. However, Steam and its games were mentioned as a major use case for 32-bit x86 proprietary software that is in common use today and will never see an update for 64-bit. While the "never see an update" part is probably true, what we know about the actual numbers does not support the idea that dropping Steam would affect more than 5% of Ubuntu users.
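
      That back-of-envelope claim can be checked with a quick worked calculation (a sketch using only the rough figures quoted in this thread; none of these numbers are authoritative):

      ```shell
      # Back-of-envelope: share of Ubuntu desktops that are Linux Steam users.
      # All inputs are the rough estimates quoted in this thread.
      steam_mau=90000000          # Steam monthly active users
      linux_share_pct=1           # ~1% of Steam survey respondents run Linux
      ubuntu_desktops=40000000    # 2015 estimate of Ubuntu desktop installed base

      steam_linux=$(( steam_mau * linux_share_pct / 100 ))   # ~900,000
      # Upper bound: assume every single Linux Steam user runs Ubuntu.
      awk -v a="$steam_linux" -v b="$ubuntu_desktops" \
          'BEGIN { printf "at most %.2f%% of Ubuntu desktops\n", 100 * a / b }'
      ```

      Even under the generous assumption that every Linux Steam user runs Ubuntu, the result is roughly 2.25%, comfortably below the 5% mark.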



      • Originally posted by chithanh View Post
        Whatever money they get from ESM, snap store, etc. is not paid directly by customers who want the i386 port, I assume. And I agree that being successful on the consumer desktop is what made Ubuntu popular for servers, cloud, and enterprise.
        And I assume that they just have no f*cking idea what they are doing. It has literally nothing to do with what their customers want.


        Originally posted by dimesio (Rosanne DiMesio from EarthLink)
        And when they're handing out advice like "Try 64-bit WINE first. Many applications will “just work”," I'd say the people making this decision seem to know less about Wine than a typical Ubuntu user.
        Originally posted by dimesio (Rosanne DiMesio from EarthLink)
        Unfortunately, based on what the Ubuntu article says, I don't have any confidence that they even understand what Wine needs, let alone plan to provide it.
        Originally posted by aeikum (Andrew Eikum from CodeWeavers)
        Yes, I agree. From the FAQ it doesn't seem like they understand Wine's needs.
        Originally posted by chithanh View Post
        And still Ubuntu will be the second most successful desktop distro after Chrome OS even if all Wine/Steam users go away.
        Just like Windows RT was so successful on desktops? ;-)
        I am not saying that Canonical will lose their position on the desktop, but it will certainly hurt them. And if they do not change their approach, they will diminish in the long run.

        Originally posted by chithanh View Post
        How popular Ubuntu is on zSeries/Power has nothing to do with the economic decision of supporting s390/ppc64le. Economic decision means, does the company see sufficient ROI from that.
        Of course it has, because they need customers who are able to pay for it. And they can't just raise prices because they are definitely not the leader here. They have to offer a solution that is cheaper than RHEL and SLE, because otherwise they will be out of business.

        Originally posted by chithanh View Post
        There is no way you can look at it that is supported by facts.

        Steam (all platforms) has 1 billion accounts, but most of them inactive or one of multiple accounts that belong to a single user. Monthly active users are at 90 million. Linux is around 1% of Steam survey respondents, so let's say 900k.
        Global installed base of PCs is 2.2 billion. Installed base of Ubuntu desktop was estimated at 40 million in 2015 (haven't found anything more recent).

        So no way that Steam users on PC or Ubuntu exceed 5%.
        What about this?

        Originally posted by PCGamesN
        35% of Americans are PC gamers
        And WINE is extremely popular on the Linux desktop.

        Originally posted by chithanh View Post
        I don't see the analogy. Depending on what is the reason why you run Ubuntu, and whether you have a Windows license, running it inside WSL or inside a virtual machine on Windows is an option or not.
        But how will the reason for you to run Wine determine whether you use e.g. a Snap or a native package?
        I am not a big fan of half-working solutions.
        Did you consider creating a standalone Wine runtime as a snap? This way there would be no need to create a separate snap for every app you want to run (we're talking about thousands of Windows apps which can be...


        Moreover, they clearly don't have plans to release Core 20 for i386, so they will stick with old libraries. And this will be really painful in 5-10 years.

        Originally posted by chithanh View Post
        Eh, the whole point of XWayland is to continue running X11 applications in Wayland.
        And the whole point of multilib/multiarch is to continue running 32-bit applications in a 64-bit environment.
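
        For reference, keeping that ability on a Debian/Ubuntu system is a short procedure (a sketch of the standard multiarch setup; package names such as libc6:i386 and wine32 are the usual Ubuntu ones, and this assumes an i386 archive still exists for your release):

        ```shell
        sudo dpkg --add-architecture i386       # register i386 as a foreign architecture
        sudo apt update                         # fetch the i386 package indexes
        sudo apt install libc6:i386 wine32      # pull in example 32-bit packages
        dpkg --print-foreign-architectures      # should now list "i386"
        ```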

        Originally posted by chithanh View Post
        There needs not be special XWayland support in GTK+/Qt/... that is the whole point.
        There needs not be special 32-bit support in glibc/libstdc++/... That is the whole point.

        Originally posted by chithanh View Post
        And it is not that XWayland is one heavily developed package. The only time it sees major activity is around adding support for new APIs as I already mentioned.
        And it is not that maintaining i386 packages takes a lot of time, because almost all you need to do is rebuild existing ones.

        Anyway, my point is that if they decide to retire support for X11 (including XWayland), then all packages will be compiled without X11 support. This means that it wouldn't be enough just to provide X11 packages. All of the toolkits and other graphics libraries would have to be rebuilt with X11 support. And it is not realistic for the community to do this in a PPA repo.

        Originally posted by chithanh View Post
        No. Neither Wine's nor its sister project ReactOS's (with which significant code sharing happens) development is "driven" by games.
        VK9, DXUP, DXVK, VKD3D, the whole WineD3D as well as Lutris, PlayOnLinux/Phoenicis, GameHub, etc. are driven by games.
        Back in the day, there was a program named XWine. It was some kind of WINE manager. Today we have PlayOnLinux, which allows us to manage Wine versions and prefixes. Although it was created with games in mind, it is commonly used for productivity software as well.
        WineD3D was created mainly because of games, but today it is very important for running multimedia software and even ordinary applications, because they heavily use DirectWrite these days.

        Originally posted by chithanh View Post
        What is true is that major parts of getting games to work have already been done, and the work of Valve/Proton is mostly pushing fixes for particular issues that prevent games from running, not implementing entirely new functionality (with few exceptions).
        Sorry, but this is not true. A lot of work has to be done to make VKD3D a mature solution. The same applies to the whole of WineD3D when it comes to DirectX 11. And even more work needs to be done for other DirectX components (D3DX8, DirectDraw, etc.).




        Originally posted by chithanh View Post
        When Linux was dropping modify_ldt() syscall which Wine depends on for running Win16 protected mode software on x86_64, it was Wine developers and users who objected. Precisely because running Win16 software on 64-bit Linux is still a major use case for Wine users.
        You can't be serious. So compatibility with 32-bit is not important, but with 16-bit it is? This must be a f*cking joke.

        Originally posted by chithanh View Post
        And yes, running Win16 protected mode software through emulation (as already happens for v86 mode software) or in a VM has been suggested in that very LKML thread, but such talk was quickly shut down by Linus.
        And now Canonical don't give a sh*t about compatibility.
        See the "DebConf 14: QA with Linus Torvalds" video, where Linus addresses some of the issues that AppImageKit tries to solve; the rest of the DebConf14 videos/meetings/lectures are at http://meetings-archive.debian.net/pub/debian-meetings/... At 05:40 Linus highlights a core issue for Linux on the desktop, applica...

        Originally posted by Linus Torvalds
        So you actually want to just compile one binary and have it work. Preferably forever. And preferably across all Linux distributions. And I actually think distributions have done a horribly, horribly bad job. One of the things that I do on the kernel - and I have to fight this every single release and I think it's sad - we have one rule in the kernel, one rule: we don't break userspace. (...) People break userspace, I get really, really angry. (...) And then all the distributions come in and they screw it all up. Because they break binary compatibility left and right. They update glibc and everything breaks. (...) So that's my rant. And that's what I really fundamentally think needs to change for Linux to work on the desktop because you can't have applications writers to do fifteen billion different versions.
        Originally posted by chithanh View Post
        Ok, I think this reveals a major misunderstanding what Ubuntu is and does on top of Debian.

        Ubuntu does maintain their own complete infrastructure and does their own QA on the packages, expending considerable resources. Yes, these packages originally come from Debian. But Ubuntu is not like LMDE/Siduction/Aptosid/etc. which are basically lightly repackaged+reskinned Debian.
        I know perfectly well how it works. It is more or less the same situation as between RHEL and Fedora.
        So if there are any problems with i386, they will be resolved by Debian maintainers.
        And don't bullsh*t me about QA. We have plenty of examples where Ubuntu inherited Debian bugs. That's what their quality control is worth.



        • Originally posted by the_scx View Post
          You can't be serious. So compatibility with 32-bit is not important, but with 16-bit it is? This must be a f*cking joke.
          That is one of those horrible, stupid realities. Companies will pay CodeWeavers more to make 16-bit stuff work than 32-bit stuff. So if you are looking at this from the point of view of CodeWeavers' bottom line, 16-bit is more important. We can hope that after the Ubuntu and OS X mess this changes. Money is kind of important for paying lead developers to do things.


          Originally posted by the_scx View Post
          And now Canonical don't give a sh*t about compatibility.

          https://github.com/AppImage/AppImage...ssue-109864970
          This is nothing new. Flatpak and snap appear to address these problems with a single method.

          Originally posted by the_scx View Post
          I know perfectly well how it works. It is more or less the same situation as between RHEL and Fedora.
          So if there are any problems with i386, they will be resolved by Debian maintainers.
          And don't bullsh*t me about QA. We have plenty of examples where Ubuntu inherited Debian bugs. That's what their quality control is worth.
          https://lwn.net/Articles/282038/
          The Debian developer raises a serious point the OpenSSL developer missed. On the Linux kernel, if you use uninitialised memory, the return value is not in fact random. It only appears random while you have kernel address randomisation on. Some items, like certain ARM-based routers using OpenSSL, did not have it on.

          In well-made encryption, undefined behaviour should never be presumed to be random, and the result of using uninitialised memory is undefined behaviour. So no matter how you look at it, the Debian developer removed one bug and turned another bug, which had affected only particular use cases, into one that was constant everywhere.

          RHEL in fact runs a stack of extra QA, after the packages are complete in Fedora, before they go into the enterprise product.

          For Ubuntu 18.04, Canonical has signed contracts for 10 years of support. That is longer than Debian supports the packages it contains, so problems with 18.04 will not be able to be passed up to Debian maintainers. Canonical therefore has to find the resources from somewhere to take care of these packages itself. Please note that the lwn.net article is from 10 years before Canonical signed themselves legally into this position. We all know what Ubuntu quality control has been worth, but in the past they had not signed contracts where they had to provide quality; they only signed contracts where they had to provide support. Somehow I think that with 19.10 Canonical has worked out the nightmare they have caused themselves.



          • Originally posted by Weasel View Post
            Duplication of effort? Sure, because the mp3 decoding isn't even duplicated, it's literally extra effort.
            It's extra effort you put in when it was somewhat new and when you put in the fundamentals. As long as it's been properly implemented it's not like you're ever going to have to touch it again.

            Wasted disc space: again, the mp3 decoding is code which requires disk space.
            Space spent on MP3 decoders is hardly wasted space when MP3 files are still sold and used (who's actually still using old i386 hardware, let alone selling it?), and it's just one thing, not the duplication of a whole lot of stuff.

            Performance loss: this doesn't apply in either case.
            I'm pretty sure that with the additional instructions, registers, etc., AMD64/x86_64 is going to be faster in non-trivial code if utilized properly. Having written my master's thesis on the subject and worked at my university's HPC lab, I know first hand the kind of additional performance you can get from properly utilized modern hardware.

            Actually no, this is the dumbest analogy I keep reading online. Software is data, just like movies. The medium it is stored in is irrelevant. I want to watch a movie from 1990s, say Terminator 2. I can do that, on a standard install. Easy. It doesn't have to be on a DVD, I can archive it on a USB stick, hard drive, or whatever is "modern". I don't give a shit about the original CD installers for old games. Those yes, are like VHS or whatever, since they're medium. I talk about the software itself, the data that you archive.
            I think you're missing the point here...

            To put it as simply as I can: Some people want to keep being able to use old i386 binaries, and with AMD64 being a thing this is just going to lead to loads of the same libraries having to be duplicated as i386 and AMD64 binaries so that applications that actually utilize hardware from the last decade can co-exist with software that flatly refuses to leave the 1990s. Hence the current situation is much like those DVD+VHS combo machines that were around in the early 2000s or the CD+cassette players that were around in the mid 90s.

            Essentially what Canonical is doing is the equivalent of when hardware manufacturers started dropping those awful combo machines in favor of pure DVD and CD players. Obviously there are people stuck in the past who want to continue to watch their VHS tapes, listen to their C-cassettes and run their i386 binaries. However, there really isn't any reason to continue tacking those legacy formats/architectures onto modern systems.

            Originally posted by the_scx View Post
            The solution is very simple: just keep support for multiarch/multilib. That's it. They have time and resources to support PPC64LE and s390x on desktop, so they must have time for this too.
            Seeing how you're a bit confused here: Ubuntu has support for IBM's hardware with their SERVER and CLOUD distros. Their desktop distro is only i386 and AMD64/x86_64 and they're now dropping the former. Not only that, the server part of the business is actually generating a decent amount of money so those things actually pay for themselves.

            The reality is that they never have to drop it.
            IBM System i for minicomputers can provide several decades of compatibility.
            Windows 10 provides excellent compatibility with Windows 7, very good with Windows XP, and good enough with Windows 95. Why can't Linux?
            As long as Windows will be the dominant system on the desktop and the software for Windows 95-10 will be in use, no one should even think about dropping support for 32-bit software in Linux.
            You're painting a picture of Windows backwards compatibility that's only really in line with what Microsoft's marketing department likes to claim. We're not talking about multi-million dollar mainframe systems, so the comparison to IBM's mainframe stuff is about as badly misplaced as it gets. Not only that, there's nothing stopping people from just running old versions of Linux in a VM for legacy software, as Microsoft's legacy support is generally built around very VM-esque implementations.
            Last edited by L_A_G; 24 June 2019, 09:27 AM.



            • Originally posted by L_A_G View Post
              It's extra effort you put in when it was somewhat new and when you put in the fundamentals. As long as it's been properly implemented it's not like you're ever going to have to touch it again.
              But you know that the same applies to multiarch support, right?

              Originally posted by L_A_G View Post
              Space spent on MP3 decoders is hardly wasted space when MP3 files are still sold and used
              And applications that use 32-bit components are still sold and used. Even Photoshop CC 2019 (20) requires WoW64 support.
              https://www.phoronix.com/forums/foru...36#post1108036
              https://www.playonlinux.com/en/app-3...p_CC_2019.html
              Here's what happens when you try to run it on a pure 64-bit WINE:
              Code:
              $ file Set-up.exe
              Set-up.exe: PE32 executable (GUI) Intel 80386, for MS Windows
              $ wine Set-up.exe
              wine: Bad EXE format for Z:\home\scx\software\wine\apps\adobe_creative_cloud\AdobePhotoshop20-mul_x64\Set-up.exe.
              Code:
              $ file Creative_Cloud_Set-Up.exe
              Creative_Cloud_Set-Up.exe: PE32 executable (GUI) Intel 80386, for MS Windows, UPX compressed
              $ wine Creative_Cloud_Set-Up.exe
              wine: Bad EXE format for Z:\home\scx\software\wine\apps\adobe_creative_cloud\Creative_Cloud_Set-Up.exe.
              Originally posted by L_A_G View Post
              (who's still actually still using old i386 hardware let alone selling it?)
              The multiarch support has literally nothing to do with the hardware. It is about software. And not only legacy applications, but even modern ones.

              Originally posted by L_A_G View Post
              Seeing how you're a bit confused here: Ubuntu has support for IBM's hardware with their SERVER and CLOUD distros. Their desktop distro is only i386 and AMD64/x86_64 and they're now dropping the former. Not only that, the server part of the business is actually generating a decent amount of money so those things actually pay for themselves.
              <sarcasm>
              And we know that GNOME Shell, GNOME Games, GNOME To Do, Cheese, Rhythmbox, Totem, Shotwell, Simple Scan, Transmission GTK+, BlueZ, etc., as well as DOSBox, Widelands, and Tux Paint are extremely popular on s390x! Desktop software, applications for kids, and games are super important on mainframes, because that's what people pay for!
              On the other hand, they don't get a penny from the Snap Store, nor from paid support for x86 desktops. Nobody pays them for ESM (Extended Security Maintenance)!
              </sarcasm>
              My point is that if they have time and resources to support irrelevant software on s390x (I am not talking about server software), then they must have time and resources to support at least minimal multilib. It costs them almost nothing, but the benefits are huge!

              Originally posted by L_A_G View Post
              You're painting a picture of Windows backwards compatibility that's only really in line with what Microsoft's marketing department likes to claim. We're not talking about multi-million dollar mainframe systems, so the comparison to IBM's mainframe stuff is about as badly misplaced as it gets. Not only that, there's nothing stopping people from just running old versions of Linux in a VM for legacy software, as Microsoft's legacy support is generally built around very VM-esque implementations.
              And we already know that playing games in a VM is a terrible experience.
              https://discourse.ubuntu.com/t/resul...an-19-10/11353
              Originally posted by popey
              Game is a black window - suspect this is poor OpenGL support in VirtualBox
              The same applies to running professional software that uses OpenCL and CUDA. Without GPU passthrough, it is an extremely stupid idea to run them in VMs.

              BTW: You should listen to what Linus Torvalds said about breaking binary compatibility.

              https://github.com/AppImage/AppImage...ssue-109864970
              Originally posted by Linus Torvalds
              So you actually want to just compile one binary and have it work. Preferably forever. And preferably across all Linux distributions. And I actually think distributions have done a horribly, horribly bad job. One of the things that I do on the kernel - and I have to fight this every single release and I think it's sad - we have one rule in the kernel, one rule: we don't break userspace. (...) People break userspace, I get really, really angry. (...) And then all the distributions come in and they screw it all up. Because they break binary compatibility left and right. They update glibc and everything breaks. (...) So that's my rant. And that's what I really fundamentally think needs to change for Linux to work on the desktop because you can't have applications writers to do fifteen billion different versions.
              Last edited by the_scx; 24 June 2019, 01:40 PM.



              • Originally posted by the_scx View Post
                But you know that the same applies to multiarch support, right?
                Considering the complexity of it all, and the fact that a big chunk of those libraries are not actually static, that's not true at all. We're not talking about some old abandonware here; we're talking about software that's still being developed and used by other actively developed software.

                And applications that use 32-bit components are still sold and used. Even Photoshop CC 2019 (20) requires WoW64 support.
                The fact that applications as recent as that still use i386 binaries is more than enough evidence that software developers need more than just a gentle nudge to stop pumping out and using i386 binaries. All you're doing is proving that it's now or never that you really need to become proactive about getting all of this i386 legacy gunk out of the works or you're going to be stuck with it in perpetuity.

                The multiarch support has literally nothing to do with the hardware. It is about software. And not only legacy applications, but even modern ones.
                It's kind of impressive how you completely missed the point there, seeing how you're basically trying to answer a rhetorical question. The point I was actually trying to make was that there really isn't any real reason to keep using, and particularly to keep producing, new i386 binaries. Obviously barely anybody even runs i386 hardware anymore, so moving over to only producing x86_64 binaries is at worst no additional effort.

                <sarcasm>
                <List of desktop software that obviously isn't supported on IBM mainframe hardware>
                </sarcasm>
                Seeing how you don't seem to understand or know this: Regular desktop Ubuntu is not supported on IBM Power/Z mainframe hardware! Not only that, desktop and server Ubuntu are not the same thing. Their intended uses and the stuff they ship with are clearly indicated by what they're called, and before you try a "B-b-b-but cloud Ubuntu!" response, that's really just Ubuntu Server set up to run in a VM. The whole "cloud" thing really is just about making distributed systems scalable by spinning up additional VMs based on demand.

                My point is that if they have time and resources to support irrelevant software on s390x (I am not talking about server software), then they must have time and resources to support at least minimal multilib. It costs them almost nothing, but the benefits are huge!
                The thing is that they don't support the software you listed on IBM mainframe hardware. Canonical only supports their Cloud and Server distros on that hardware, for the workloads it's obviously used for.

                And we already know that playing games in a VM is a terrible experience.
                You can't be this stupid, can you? Trying to act as if VirtualBox, VM software really not intended for hardware-accelerated workloads, is somehow representative of virtual machines in general? This is a particularly stupid assertion to make when KVM with GPU passthrough doesn't just exist, it's well known.
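
                For what it's worth, checking whether a host is even set up for that kind of passthrough is a quick look at sysfs (a sketch; the path is the standard layout, but the directory stays empty unless the IOMMU is enabled, e.g. with intel_iommu=on or amd_iommu=on on the kernel command line):

                ```shell
                # Quick host check for GPU passthrough readiness: list IOMMU groups.
                found=0
                for dev in /sys/kernel/iommu_groups/*/devices/*; do
                    [ -e "$dev" ] || continue             # glob matched nothing
                    group=${dev%/devices/*}; group=${group##*/}
                    echo "IOMMU group $group: ${dev##*/}"
                    found=1
                done
                [ "$found" -eq 1 ] || echo "No IOMMU groups visible (IOMMU off or unsupported)"
                ```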

                BTW: You should listen to what Linus Torvalds said about breaking binary compatibility.
                Not only is that an appeal-to-authority fallacy, i.e. one almost as well known as the straw man; try to remember that he pushed quite heavily for dropping i386 support back in the day, so the talk about having binary compatibility "preferably forever" rings kind of hollow.



                • Originally posted by L_A_G View Post
                  I think you're missing the point here...

                  To put it as simply as I can: Some people want to keep being able to use old i386 binaries, and with AMD64 being a thing this is just going to lead to loads of the same libraries having to be duplicated as i386 and AMD64 binaries so that applications that actually utilize hardware from the last decade can co-exist with software that flatly refuses to leave the 1990s. Hence the current situation is much like those DVD+VHS combo machines that were around in the early 2000s or the CD+cassette players that were around in the mid 90s.

                  Essentially what Canonical is doing is the equivalent of when hardware manufacturers started dropping those awful combo machines in favor of pure DVD and CD players. Obviously there are people stuck in the past who want to continue to watch their VHS tapes, listen to their C-cassettes and run their i386 binaries. However, there really isn't any reason to continue tacking those legacy formats/architectures onto modern systems.
                  Yeah, you're a lost cause; you repeat the same shit and ignore what I say. What part of software being data do you not understand here?

                  Even more so, using your own analogy: 32-bit applications run just fine on new hardware. They don't run only on hardware from the 1990s, they run on today's hardware, so your analogy is retarded.

                  Here's a better analogy: It's like saying my Blu-Ray player supports reading CDs/DVDs (i.e. the CPU can execute 32-bit code), but the driver refuses to do it because of retarded reasons like yours. In effect, the driver refuses to utilize my player's functionality of reading CDs/DVDs.

                  Yeah, that's retarded buddy.

                  Originally posted by L_A_G View Post
                  It's kind of impressive how you completely missed the point there, seeing how you're basically trying to answer a rhetorical question. The point I was actually trying to make was that there really isn't any real reason to keep using, and particularly to keep producing, new i386 binaries. Obviously barely anybody even runs i386 hardware anymore, so moving over to only producing x86_64 binaries is at worst no additional effort.
                  And what's the reason for people continuing to produce mp3 files?



                  • Originally posted by L_A_G View Post
                    Considering the complexity of it all, and the fact that a big chunk of those libraries are not actually static, that's not true at all. We're not talking about some old abandonware here; we're talking about software that's still being developed and used by other actively developed software.
                    What bullshit. Multimedia frameworks are not abandoned either. Both FFmpeg and GStreamer are constantly evolving. From time to time, developers take care of refactoring (i.e. GStreamer 0.10 → 1.0) and improving the code. It is definitely not static.
                    (FFmpeg repository mirror: https://git.ffmpeg.org/ffmpeg.git)

                    But for Ubuntu maintainers, it doesn't really matter.
                    We have the same situation in the case of base libraries. Debian maintainers have already done most of the work when it comes to preparing the packages. What is more, after the release of a new version of the distribution, these packages are frozen.

                    Anyway, the whole drama about dropping support for multiarch was pathetic. As you can see, they have time and resources to support it. They only needed a little motivation.

                    Originally posted by L_A_G View Post
                    The fact that applications as recent as that still use i386 binaries is more than enough evidence that software developers need more than just a gentle nudge to stop pumping out and using i386 binaries. All you're doing is proving that it's now or never that you really need to become proactive about getting all of this i386 legacy gunk out of the works or you're going to be stuck with it in perpetuity.
                    Ubuntu developers have no influence on software developers who target the Windows platform.
                    It is as if WINE developers were to give up supporting DirectX. That would not change anything for game developers, and it certainly would not cause old titles to be rewritten for Vulkan. It would only hurt Linux users. We have exactly the same situation here with 32-bit PE32 executables.

                    Originally posted by L_A_G View Post
                    It's kind of impressive how you completely missed the point there seeing how you're basically trying to answer a rhetorical question. The point I was actually trying to make there was that really isn't any real reason to keep using and particularly continuing to produce new i386 binaries. Obviously barely anybody even runs i386 hardware anymore so moving over to only producing x86_64 binaries is at worst no additional effort.
It makes sense in the Windows world. Developers use standard tools to create installers, and those tools produce 32-bit binaries, at least by default. It makes sense because such an installer will run on Windows x86 (x86-32), Windows x64 (x86-64), and Windows on ARM (AArch64). Moreover, there are literally zero benefits to changing it.

                    Originally posted by L_A_G View Post
                    Seeing how you don't seem to understand or know this: Regular desktop Ubuntu is not supported on IBM Power/Z mainframe hardware! Not only that, desktop and server Ubuntu are not the same thing. Their intended uses and the stuff that they ship with are very clearly indicated by what they're called and before you try to do a "B-b-b-but cloud ubuntu!"-response that's really just Ubuntu server set up to run in a VM. The whole "cloud" thing really is just about making distributed systems scalable by spinning up additional VMs based on demand/need.
                    No matter what you call it, all the packages I mentioned here are supported on ppc64el and s390x.

                    Originally posted by L_A_G View Post
                    The thing is that they don't support the software you listed on IBM mainframe hardware. Canonical only supports their Cloud and Server distros on that hardware they're obviously used for
Again, all of the mentioned packages are supported on ppc64el and s390x. What is more, GNOME Shell, GNOME Games, GNOME To Do, Cheese, Rhythmbox, Totem, Shotwell, Simple Scan, Transmission Gtk+, and BlueZ come from the main repo, so they are supported directly by Canonical.
Could you explain to me why GNOME Mines and Cheese are so important on a server system running on mainframes? Because I really don't get it.



                    Originally posted by L_A_G View Post
                    You can't be this stupid, can you? Trying to actually act as if VirtualBox, VM software really not intended for hardware accelerated software, somehow being representative of virtual machines in general? This is a particularly stupid assertion to make when KVM with GPU passtrough don't just exist, they're well known.
I see that you can. First of all, you need a spare GPU to pass through to the virtual machine. Secondly, it may be extremely hard or even impossible to install new drivers on an old guest system. And the old drivers certainly will not handle new hardware. Let's say you bought a game for Ubuntu 6.06 LTS. Try installing the newest NVIDIA drivers, the ones that support the GeForce GT 1030, on that distro. Good luck!
And if that is not enough for you, think about games from Loki and RuneSoft. Good luck with new GPU drivers on kernel 2.0 or 2.2!

                    Originally posted by L_A_G View Post
                    Not only is that an appeal to authority fallacy, i.e almost as well known as a straw man, try to remember that he pushed quite heavily for dropping i286 support back in the day so the talk about having binary compatibility "preferably forever" rings kind of hollow.
Again, backward compatibility is important mainly for software, not hardware. No one uses a 286 on the desktop anymore, but some people may still want to run applications that were written for Windows 3.11.

                    Comment


                    • Originally posted by the_scx View Post
                      Ubuntu developers have no impact on software developers who target the Windows platform.
                      It is like WINE developers could give up supporting DirectX. That would not change anything when it comes to game developers. And it certainly would not cause the rewriting of old titles to Vulkan. It would only hurt Linux users. We have exactly the same situation here when it comes to 32-bit PE32 executables.
This is not true. More and more developers targeting Windows also release a Wine-wrapped version for Linux, for one simple reason: Linux users talk more about games that support their platform, and that results in higher sales of the Windows and OS X versions of those games.

So if you want to do your best targeting Windows, supporting Linux is part of that.

                      Comment
