Valve Will Not Be Officially Supporting Ubuntu 19.10+


  • Originally posted by Aeder View Post
    Debian makes the most sense from a least effort for maximum results perspective.
    It doesn't really. A primary distro for Steam must be one that Just Works Out Of The Box(tm). Debian comes close to being a meta-distro targeted at those who want to tinker with it and control precisely how they want their OS to be set up, down to the last detail. That makes it unsuitable as a supported target for game developers. Arch won't do either, for the same reason. In fact, thinking about it, once we have also ruled out all the Ubuntu derivatives and also Fedora, which tends to be too bleeding edge and not reliable enough, the best choice really seems to come down to SUSE or Mandriva, surprising as it may seem.

    Comment


    • Originally posted by jacob View Post
      It doesn't really. A primary distro for Steam must be one that Just Works Out Of The Box(tm). Debian comes close to being a meta-distro targeted at those who want to tinker with it and control precisely how they want their OS to be set up, down to the last detail. That makes it unsuitable as a supported target for game developers. Arch won't do either, for the same reason. In fact, thinking about it, once we have also ruled out all the Ubuntu derivatives and also Fedora, which tends to be too bleeding edge and not reliable enough, the best choice really seems to come down to SUSE or Mandriva, surprising as it may seem.
      Why does it have to be a distribution? Steam from Flatpak runs on Ubuntu 19.10, and on Debian from stable onwards with flatpak 1.2.4 from backports...

      Neutral packaging works.

      Comment


      • Originally posted by oiaohm View Post

        Why does it have to be a distribution? Steam from Flatpak runs on Ubuntu 19.10, and on Debian from stable onwards with flatpak 1.2.4 from backports...

        Neutral packaging works.
        Neutral packaging is generally the way to go for 3rd party apps, but it doesn't work if the underlying distro doesn't support the architecture. If, say, Ubuntu's kernel doesn't even have the 32-bit ABI any more, if it doesn't include support for running processes in compatibility mode, if the distro doesn't ship 32-bit GPU driver interfaces, etc., then it's not an option.

        Comment


        • Originally posted by oiaohm View Post

          1) The 2038 problem. The Linux kernel syscalls and even the 32-bit version of the Windows API are affected by this. The best way to start addressing the problem is to stop making 32-bit applications.
          The kernel was patched for this on 32-bit at least a year ago. The biggest problem now is file formats, like ZIP, which is widespread and affected by the issue. It is also up to developers to ensure some kind of backwards compatibility with older apps (Windows excels at this, though much less so since they dropped support for 16-bit apps), and while dropping 32-bit application support is fine, dropping the 32-bit libs is problematic in that area. Besides, 32-bit is much less memory hungry than 64-bit because of pointer size. This is the main reason why the (rather unused) x32 arch exists.
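
          To make the rollover concrete, here is a minimal C sketch (nothing distro-specific is assumed; the _TIME_BITS=64 macro is glibc's opt-in to 64-bit time_t on 32-bit builds, where the glibc version is new enough to offer it):

          Code:
          #include <stdint.h>
          #include <stdio.h>
          #include <time.h>

          /* Built 64-bit (gcc demo.c), time_t is 8 bytes and nothing wraps.
           * Built 32-bit (gcc -m32 demo.c) without -D_TIME_BITS=64, time_t
           * is a signed 32-bit counter that rolls over in January 2038. */
          int main(void)
          {
              time_t last = (time_t)INT32_MAX;               /* 2038-01-19 03:14:07 UTC */
              time_t next = (time_t)((int64_t)INT32_MAX + 1);

              char a[64], b[64];
              strftime(a, sizeof a, "%Y-%m-%d %H:%M:%S", gmtime(&last));
              strftime(b, sizeof b, "%Y-%m-%d %H:%M:%S", gmtime(&next));

              printf("sizeof(time_t) = %zu\n", sizeof(time_t));
              printf("INT32_MAX     -> %s UTC\n", a);
              printf("one second on -> %s UTC\n", b);         /* 1901 if time_t wrapped */
              return 0;
          }

          On a glibc new enough to support it, building the same source with gcc -m32 -D_TIME_BITS=64 -D_FILE_OFFSET_BITS=64 switches a 32-bit binary over to 64-bit time_t.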

          Comment


          • Originally posted by jacob View Post

            It doesn't really. A primary distro for Steam must be one that Just Works Out Of The Box(tm). Debian comes close to being a meta-distro targeted at those who want to tinker with it and control precisely how they want their OS to be set up, down to the last detail. That makes it unsuitable as a supported target for game developers. Arch won't do either, for the same reason. In fact, thinking about it, once we have also ruled out all the Ubuntu derivatives and also Fedora, which tends to be too bleeding edge and not reliable enough, the best choice really seems to come down to SUSE or Mandriva, surprising as it may seem.
            Have you ever installed Debian?

            Comment


            • Originally posted by DoMiNeLa10 View Post

              I think Arch can fill that gap; it's a perfect desktop distro, as rolling release is the only model that makes sense in such setups, and it has the best documentation out there as well. I think it's the most user-friendly distro.
              Tumbleweed...?

              Comment


              • Originally posted by jacob View Post
                Neutral packaging is generally the way to go for 3rd party apps, but it doesn't work if the underlying distro doesn't support the architecture. If, say, Ubuntu's kernel doesn't even have the 32-bit ABI any more, if it doesn't include support for running processes in compatibility mode, if the distro doesn't ship 32-bit GPU driver interfaces, etc., then it's not an option.
                I have checked 19.10's currently planned kernel: it still has the 32-bit ABI in its syscalls. That is all you need to run a containerised solution. You don't need the distribution to ship any 32-bit libraries, because Flatpak and other fully containerised solutions provide all of their own. Yes, Flatpak queries the host for which closed-source Nvidia driver version is required and then downloads its own copy matching that version. So if the host only has the 64-bit Nvidia driver, the Flatpak system can install both the 32-bit and 64-bit versions.

                This is not the only approach either, if you don't care about overhead. Look at emugl and virgl: both can run over a socket, which lets you skip 32-bit GPU drivers entirely and instead use the 64-bit drivers for 32-bit applications.

                32-bit x86 support is in the 64-bit x86 Linux kernel as long as whoever builds the kernel does not turn it off (CONFIG_IA32_EMULATION).
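
                A quick illustrative sketch of that (not anything Valve ships): build a trivial probe as a 32-bit binary and run it on the 64-bit kernel. If CONFIG_IA32_EMULATION is off, the binary won't execute at all; if it is on, the compat layer services its syscalls and typically reports the machine as i686.

                Code:
                /* probe32.c - build with: gcc -m32 probe32.c -o probe32 (needs multilib) */
                #include <stdio.h>
                #include <sys/utsname.h>

                int main(void)
                {
                    struct utsname u;
                    if (uname(&u) != 0) {   /* answered via the 32-bit compat syscall path */
                        perror("uname");
                        return 1;
                    }
                    printf("kernel %s, machine as seen by this process: %s\n",
                           u.release, u.machine);
                    return 0;
                }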

                The final fallback, when you have a total arch mismatch and still need to run something, is QEMU user mode; of course, to use that you still need a full container of platform-matched libraries.

                Originally posted by gojul View Post
                The kernel was patched for this on 32-bit at least a year ago.
                This is kind of right but also wrong. The way it was fixed has some serious limitations.

                A year ago, syscalls exposing 64-bit time versions of the functions to 32-bit userspace were added. The problem is that the 32-bit time versions are still accessible and cannot be removed, because of legacy applications.

                So how is 32-bit time meant to stay usable, so that applications can keep performing 32-bit syscalls with 32-bit time past 2038?
                https://lwn.net/Articles/766089/

                Yep, a time namespace; in other words, a container that shifts the Unix epoch of 1 January 1970 to a different value. This has all kinds of horrible side effects if two applications exchange time values over IPC, be it network or local, while using different epoch values.

                So by 2038, all 32-bit Linux applications that have not been updated to use the 64-bit time versions of the functions will have to be in a container of some form. Might as well get used to this now. Also, even if an application has been updated, you have to be absolutely sure that it does not use a single library that has not been updated, or in 2038 it will break. The simplest way is to just build 64-bit.
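
                For reference, here is a hedged sketch of what that time namespace mechanism looks like on kernels that have it (CLONE_NEWTIME landed in Linux 5.6; the merged form shifts CLOCK_MONOTONIC and CLOCK_BOOTTIME for child processes and needs CAP_SYS_ADMIN, e.g. root or a user namespace):

                Code:
                #define _GNU_SOURCE
                #include <sched.h>
                #include <stdio.h>
                #include <sys/wait.h>
                #include <time.h>
                #include <unistd.h>

                #ifndef CLONE_NEWTIME
                #define CLONE_NEWTIME 0x00000080   /* older libcs may not define it */
                #endif

                int main(void)
                {
                    /* Create a new time namespace for future children. */
                    if (unshare(CLONE_NEWTIME) != 0) {
                        perror("unshare(CLONE_NEWTIME)");
                        return 1;
                    }

                    /* Offsets must be written before the first child enters the
                     * namespace; here CLOCK_MONOTONIC is shifted by ~10 years. */
                    FILE *f = fopen("/proc/self/timens_offsets", "w");
                    if (!f || fprintf(f, "monotonic %lld 0\n", 10LL * 365 * 86400) < 0) {
                        perror("/proc/self/timens_offsets");
                        return 1;
                    }
                    fclose(f);

                    pid_t pid = fork();        /* the child, not this process, is inside */
                    if (pid == 0) {
                        struct timespec ts;
                        clock_gettime(CLOCK_MONOTONIC, &ts);
                        printf("child sees CLOCK_MONOTONIC = %lld s\n", (long long)ts.tv_sec);
                        _exit(0);
                    }
                    waitpid(pid, NULL, 0);
                    return 0;
                }

                Only the forked child sees the shifted clock, which is exactly the "legacy app has to live in a container of some form" situation described above.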

                Originally posted by gojul View Post
                Besides, 32-bit is much less memory hungry than 64-bit because of pointer size.
                https://bryanquigley.com/posts/memor...-compared.html
                The answer: not always. And even where 64-bit is more memory hungry, you are talking about less than 500 MB of difference.

                Originally posted by gojul View Post
                This is the main reason why the (rather unused) x32 arch exists.
                The horrible trap is that Intel took a lot of the gains of the x32 arch and built smarts into the GCC and LLVM compilers to use 32-bit pointers in 64-bit mode when it makes sense, fairly much nuking most of the benefit. 32-bit gives you a memory limit and bugger all memory savings.

                There is a real question, now that Intel has improved the 64-bit compiler output so much, of whether we need x32 any more.

                There was a time when the compiler used 64-bit pointers everywhere and did not optimise down to 32-bit pointers where it suited; back then there were some big gains from 32-bit mode, but we don't live in those days any more.
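
                The width difference itself is easy to check with a throwaway sketch (assuming GCC or Clang on Linux; -mx32 additionally needs an x32-capable toolchain and kernel):

                Code:
                /* sizes.c
                 *   gcc -m64  sizes.c && ./a.out   -> void* = 8, long = 8
                 *   gcc -m32  sizes.c && ./a.out   -> void* = 4, long = 4
                 *   gcc -mx32 sizes.c && ./a.out   -> void* = 4, long = 4 (64-bit registers)
                 */
                #include <stdio.h>
                #include <time.h>

                int main(void)
                {
                    printf("sizeof(void *) = %zu\n", sizeof(void *));
                    printf("sizeof(long)   = %zu\n", sizeof(long));
                    printf("sizeof(time_t) = %zu\n", sizeof(time_t));  /* ties back to 2038 */
                    return 0;
                }

                Whether those smaller pointers actually turn into a meaningful memory saving in practice is exactly what the benchmark linked above measures.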

                Comment


                • Originally posted by Djhg2000 View Post

                  There's a difference between having rules at a workplace and at a community project. Usually a community project will do just fine without any formal rules. Those who end up constantly being at odds with the rest of a project are usually better off forking it.

                  Just look at MPlayer; there was almost constant forking because in many respects it was a very capable player with poor management of the codebase. Eventually one of the forks gained enough traction to survive, and that's why we have MPV today. Can you imagine if there had been a set of rules in place to prevent them from going off at each other? We'd probably be stuck with MPlayer in all of its single-threaded glory, with a few random forks for stylized subtitle formats, because nobody would've seen the frustration and subsequent anger from those trying to fix it.

                  Sometimes people need to be offended for things to happen and sometimes you just can't be friends with everyone. It's human nature and taking that away ultimately paves the way for totalitarianism as it propagates throughout society.

                  Also, there are plenty of stories floating around where written rules are applied arbitrarily in a corporate setting to get rid of people. This is nothing new; abusing the rules is about as old as the rules themselves, and having them written down just makes it easier to abuse the wording.



                  And how exactly do abusable rules make tech communities "more welcoming to anyone who isn't a white, straight man"? If you're on the internet and still give a crap about skin color, sexuality or gender, it doesn't make you progressive. It makes you a racist, perverted sexist.

                  You can be a snake hanging out with a llama for all I care; it doesn't affect what I think about your code.
                  Having the rules written down minimizes the risk of abuse, because then you have something to base a defence on, which is 100% impossible with unwritten rules.

                  If I ban you due to "section X", then you can mount a defence on why what you did, wrote or said does not break the rules of "section X". Now compare this with unwritten rules: there, I just ban you, and whatever counterargument you might throw my way, I can always move the goalposts, since there are no written rules to check.

                  The "plenty of stories floating around" are mostly from when the people accused have not bothered to refute the claims and sometimes from people trying to justify why they left without telling the real story so they blame it on HR.

                  So far in known history, no totalitarian state has ever come from "people trying to be nice to each other" propagating throughout society; in fact, it has so far always been the contrary. Having written codes of conduct is not implementing newspeak; people claiming that should try to actually read 1984 (it is a very good book).

                  Comment


                  • What people don't seem to understand is WHY Valve chose Ubuntu. It's ultra easy for new people: no password nags, super easy software installs, and you never have to touch a terminal. Valve isn't going to choose your "elite hacker" distro, okay.

                    For Valve it must be easy for NEW people. Understand that.

                    Comment


                    • Originally posted by Almindor View Post

                      I'm sure their balance sheet is laughing, with them being in the red so far on the Linux side...
                      For now, but in the long run there's nothing better.

                      Comment
