Steam For Linux Beta Adds Experimental Namespaces/Containers Support


  • #41
    Originally posted by starshipeleven View Post
    With a container you are bare metal, and yes you can use optimized versions that are faster.

    That's why Clear Linux has docker images https://hub.docker.com/u/clearlinux
    By "bare metal" I mean system-native - I assumed that stuff running in a virtual environment like Docker is effectively bare metal because of the way virtualisation is handled on the CPU side. How would you distinguish between the two implementations?

    P.S.: For that very reason, on my NAS running Debian Stretch I'm using Clear Linux containers (MariaDB and Redis).



    • #42
      Originally posted by CochainComplex View Post
      By "bare metal" I mean system-native - I assumed that stuff running in a virtual environment like Docker is effectively bare metal because of the way virtualisation is handled on the CPU side. How would you distinguish between the two implementations?
      Containers are using namespaces (a Linux kernel feature), not hardware virtualization (a hardware feature).

      It's not "bare metal virtualization": there is no VM, no additional kernel, and no hardware virtualization layer to fool such a kernel. The proper term is OS-level virtualization https://en.wikipedia.org/wiki/OS-level_virtualization

      It's literally just telling the current system's kernel "load libraries for this application from path X instead of from the default system path" and "for this application, block the system calls on this blacklist", but the application is talking directly to the same OS kernel, as if it were any other application running on the system.

      Docker (Flatpak/Snap/LXC/whatever) does not need VT-x/AMD-V on Linux.

      It needs VT-x on Windows (and Mac, I guess) because there Docker runs through a Linux VM, and to run the Linux VM you need hardware virtualization, because it's a VM.
      And why do you need a VM? Because a Docker image is not a VM but just a bundle of Linux applications and libraries; it needs a host Linux kernel to run.

      https://www.unixtutorial.org/does-do...virtualization
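      For illustration (a sketch, not from the original post): since a container is just a set of processes placed in different kernel namespaces, you can inspect the mechanism directly through the namespace IDs the kernel exposes under /proc. Inside a container the IDs differ from the host's, yet the kernel is the same one:

```python
import os

def namespace_ids(pid="self"):
    """Return the kernel namespace IDs for a process.

    Each entry in /proc/<pid>/ns is a symlink reading like 'mnt:[4026531840]'.
    Two processes share a namespace exactly when these IDs match; a
    containerized process simply gets different IDs while still talking
    to the very same kernel -- no VM, no VT-x involved.
    """
    ns_dir = f"/proc/{pid}/ns"
    if not os.path.isdir(ns_dir):  # non-Linux: no namespaces to show
        return {}
    return {
        name: os.readlink(os.path.join(ns_dir, name))
        for name in os.listdir(ns_dir)
    }

if __name__ == "__main__":
    for name, ident in sorted(namespace_ids().items()):
        print(f"{name:12s} {ident}")
```

      Comparing `namespace_ids()` against `namespace_ids(1)` (reading another process's namespaces may need root) is one practical way to answer the "how would you distinguish?" question: matching IDs mean you are in the host's namespaces; differing IDs mean a container.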
      Last edited by starshipeleven; 11-12-2019, 05:56 AM.



      • #43
        Originally posted by zxy_thf View Post
        I'm okay with a containerized Steam with nested containerized games.
        The native Steam causes lots of trouble on my system while the Flatpak one works flawlessly, and containerizing individual games also makes sense to me, because there are likely fewer compatibility issues (just think how much effort MS put in to maintain backward compatibility).

        My real concern is that it seems Valve created yet another containerization solution, and we already have (at least) three.
        No, Valve have not created another containerization solution; they are using existing technologies, as emphasized by "stormcrow".
        They are adapting those technologies to suit their needs, but they are not pushing for a new container format...



        • #44
          Originally posted by starshipeleven View Post
          Containers are using namespaces (a Linux kernel feature), not hardware virtualization (a hardware feature).

          It's not "bare metal virtualization": there is no VM, no additional kernel, and no hardware virtualization layer to fool such a kernel. The proper term is OS-level virtualization https://en.wikipedia.org/wiki/OS-level_virtualization

          It's literally just telling the current system's kernel "load libraries for this application from path X instead of from the default system path" and "for this application, block the system calls on this blacklist", but the application is talking directly to the same OS kernel, as if it were any other application running on the system.

          Docker (Flatpak/Snap/LXC/whatever) does not need VT-x/AMD-V on Linux.

          It needs VT-x on Windows (and Mac, I guess) because there Docker runs through a Linux VM, and to run the Linux VM you need hardware virtualization, because it's a VM.
          And why do you need a VM? Because a Docker image is not a VM but just a bundle of Linux applications and libraries; it needs a host Linux kernel to run.

          https://www.unixtutorial.org/does-do...virtualization
          Thanks for clearing up some points - but I was only speaking about Docker on Linux, where I thought hardware virtualization techniques were used by the kernel to enforce separation of the container, involving "hardwired" virtualization mechanics - a kind of protected space with a native-like ABI and API. My further conclusion was that this wrapping would cause a minor performance impact whenever the container has to communicate with "host" resources outside this virtual sandbox.



          • #45
            Originally posted by xeekei View Post
            How will containerised games affect streaming/recording with other software, such as OBS?
            They should appear fine, because ultimately the game has to communicate with the display system (whether X.Org or Wayland based) and the graphics driver. As long as the streaming software has the required permissions to interface with the display system or graphics driver, it will be able to capture the game footage the same as always.

            On Wayland, screen capture typically goes through dedicated protocols and the xdg-desktop-portal ScreenCast interface (backed by PipeWire), which let desktop-sharing and screen-recording software grab the rendered (i.e. composited) frames that are ultimately displayed on the monitor.



            • #46
              Originally posted by starshipeleven View Post
              Having 64bit steam client is pointless, you still need 32bit libs.
              Tell that to MacOS.



              • #47
                Originally posted by schmidtbag View Post
                Tell that to MacOS.
                Tell what? MacOS itself can't run 32bit applications anymore from Catalina onwards.

                That's exactly when the Steam client will become 64-bit on other platforms too: when running 32-bit games is no longer possible.



                • #48
                  Originally posted by starshipeleven View Post
                  Tell what? MacOS itself can't run 32bit applications anymore from Catalina onwards.
                  That's my point... MacOS is now 64-bit only. You said having a 64-bit client is pointless; it is not. The need for 32-bit libs doesn't change that, and there arguably isn't even a need for them at all. I'm seeing several games in my library getting ported to 64-bit because they no longer work in MacOS. I assume that means it doesn't have 32-bit libs.

                  Remember, when Steam for Linux was released, there were only a handful of available titles, most of which either already had 64-bit binaries or could have effortlessly made them before the debut. Had the client itself been 64-bit, there wouldn't have been any incentive for games to be ported with 32-bit binaries. Therefore, there wouldn't have been a need for 32-bit libs, and Besset could've been working on something more practical than the namespace containers.

                  So, had the Linux Steam client been 64-bit from the very beginning, there wouldn't have been any incentive to make 32-bit ports, and this whole mess could've been avoided.



                  • #49
                    Originally posted by schmidtbag View Post
                    The need for 32-bit libs doesn't change that, and there arguably isn't even a need for them at all.


                    Why do I need to spell everything out in large letters? I'm sure you still won't understand, but explaining it once or twice, so that others reading this will at least have a chance of understanding, is still my duty. So here we go.

                    The client doesn't give a shit about the system's bit size. You could make the client run on ARM in a week at best, if you had the source, because 99% of the architecture-specific code is the WebKit rendering engine, which comes from upstream and already has an ARM version.

                    For the Steam client, 32-bit, 64-bit, ARM, or POWER is irrelevant, as it is essentially a web application itself.

                    I'm seeing several games in my library getting ported to 64-bit because they no longer work in MacOS.
                    And many will never be updated and will stop working.

                    But I guess this is OK for Apple users. And you will probably tell me that you agree with arbitrarily dropping support, because that's what good Apple users do.

                    I assume that means it doesn't have 32-bit libs.
                    It means the OS won't run 32bit applications at all, even if you ship your own 32bit libs. The kernel interface for 32bit applications was disabled or removed.

                    Had the client itself been 64-bit, there wouldn't have been any incentive for games to be ported with 32-bit binaries.
                    Bullshit. What dictates a game's bitness is the runtime for the games, which they specifically made to support both 32-bit and 64-bit.
                    Let me repeat:
                    Steam shipped game runtime libraries for both 32-bit and 64-bit games. The "game runtime libraries" are for games.
                    Steam specifically wanted to let developers port their games to Linux at the lowest possible cost, regardless of your own uninformed opinion of what is "easy to port" or not.
                    It was not a choice dictated by the Steam client's own bitness.


                    Therefore, there wouldn't have been a need for 32-bit libs,
                    They are needed if you want to play older Windows games that are repacked through Proton/Wine.
                    And that is A LOT of titles - probably the only way to play these games in the future, when they stop working on Windows 12 because something has changed.

                    and Besset could've been working on something more practical than the namespace containers.


                    Namespace containers are still required, for the same reason 64-bit Flatpak applications exist. Closed-source stuff will never be updated past a certain point, so you need a static environment and libraries for it, or it will break when the world around it changes.

                    Currently Steam ships an ancient Ubuntu 16.04 "runtime" in its folders, but eventually they want to have more modern libraries than that for games.

                    32-bit or 64-bit is irrelevant. You need containers anyway, because you are shipping closed-source applications that will not be updated after a few years. With proper containers you only need kernel support for 32-bit applications, not a whole "multilib" set of 32-bit system libraries.
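                    To make this concrete (a sketch, not Valve's actual code: the paths are hypothetical, and it assumes bubblewrap, the tool Valve's pressure-vessel builds on, is installed), a container launch that pins the library stack while sharing the host kernel might be assembled like this:

```python
def runtime_sandbox_cmd(app, runtime_dir="/opt/steam-runtime"):
    """Build a bubblewrap command line that runs `app` against a pinned
    library stack while sharing the host kernel.

    `runtime_dir` is a hypothetical path standing in for a shipped
    runtime; every library the app sees comes from there, so host
    library updates can never break it.
    """
    return [
        "bwrap",
        "--ro-bind", runtime_dir, "/usr",  # frozen 32/64-bit libraries
        "--ro-bind", "/etc", "/etc",
        "--proc", "/proc",                 # same kernel, fresh proc view
        "--dev", "/dev",
        "--unshare-pid",                   # private PID namespace
        app,
    ]

print(" ".join(runtime_sandbox_cmd("/usr/bin/my-game")))
```

                    Nothing here depends on the application's bitness: a 32-bit binary in such a sandbox needs only the kernel's 32-bit syscall support plus the 32-bit libraries inside `runtime_dir`, not a host-wide multilib install.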

                    So, had the Linux Steam client been 64-bit from the very beginning, there wouldn't have been any incentive to make 32-bit ports, and this whole mess could've been avoided.
                    Yeah, Flatpak and Snaps were created only for 32-bit applications, because 64-bit ones are magically always forward-compatible with any system library and don't need any of that.
                    Last edited by starshipeleven; 11-12-2019, 10:01 PM.



                    • #50
                      Originally posted by carewolf View Post
                      For any well-written program it is a simple recompile.
                      Sorry, I guess I was unclear. I'm not arguing with that in the abstract, and I do agree that it's often the case for SOME code, especially if that code is fairly new to begin with.
                      But *in practice* it absolutely is not the case with any "legacy" code I've ever seen, and it wasn't when we moved from 16 to 32 either.

                      > If you are doing something overly clever with pointers and packing, and you are an amateur and did it wrong, it might need some rework

                      Yeah, no. If you are doing something overly clever with pointers and packing because that's what you needed to do to get something to work performantly, then you are a professional and did exactly what you should have. It's great to be able to sit on the sidelines with toy apps that are simple on a technical level (regardless of how complex they are in a business sense) and sneer about how all *your* code poops rainbows, but when significant chunks of your codebase are hooked into asm and were written over a decade ago, it's a LOT of work.
                      The goal isn't "get it to compile on 64-bit", it's "get it to run properly on 64-bit". That's not the same thing, and I know you understand that. But if you think "just recompile it" is all that you need, then trust me: you have led a charmed life and have never had to inherit a complex system.

                      > And for the love of God.. The steam client and all the steam games are entirely separate things. The games are already in 64-bit. We are talking about the launcher and nothing else.

                      Yes, I'm aware. My point was that the steam client is *an ancient pile of garbage code*, and while it's not going to be anything like as hard to port as a game, I used the games themselves as an example of how much work that porting can be in an extreme case, because your ridiculous "just recompile it" comment trivialised what the other end of that spectrum can be, especially when you have 32-bit libs as dependencies because your code is a trainwreck (to be clear, "you" here is Valve of course, not carewolf :P) that has half a web browser and a dozen other random messes duct-taped inside it.

                      But I think we're getting too much into the technical weeds here, and losing the "real" point: which is that a 64-bit steam client adds absolutely no value at all, while increasing the maintenance burden for Valve since they can't just kill off the 32-bit version. It's a clear net negative, and as a result it's not something they should be doing NOW. In the future, sure, and if I was working on it I'd have been preparing for that migration for years already (and they probably have been to at least SOME extent ever since the W7 install base passed 50% of users on x64).

                      But, I repeat, until the number of 32-bit Linux USERS drops to well under say 10% and the number of Linux users gets well over the current "barely worth bothering with AT ALL" 1%, why would you burn ANY manpower at all on a project that has literally no actual benefit to anyone?

