
Fedora Making Strides On Enabling Greater AI Use, Easier AMD ROCm PyTorch Acceleration

  • Fedora Making Strides On Enabling Greater AI Use, Easier AMD ROCm PyTorch Acceleration

    Phoronix: Fedora Making Strides On Enabling Greater AI Use, Easier AMD ROCm PyTorch Acceleration

    Christian Schaller of Red Hat shared an update on Friday around the ongoing enhancements to Fedora Workstation. Given the current industry trends, ongoing Fedora Workstation development is seeing a lot of attention around... AI, AI, AI...


  • #2
    AI is screwing gamers and other consumer GPU users, and it will for years to come; this is known. It'll infect other areas too, and I wonder how badly. At the very least it's made tech news incredibly dull; if I wanted investor news I'd be elsewhere.



    • #3
      Originally posted by geerge View Post
      AI is screwing gamers and other consumer GPU users, and it will for years to come; this is known. It'll infect other areas too, and I wonder how badly. At the very least it's made tech news incredibly dull; if I wanted investor news I'd be elsewhere.
      Yeah, it's really annoying. I can't stop hearing the term AI anymore.



      • #4
        I run Fedora 40 and use AI. It's not hard to do on Fedora:

        1. Install Docker
        2. Install Ollama, Open-Webui, and Stable-Diffusion containers
        3. Profit.

        I'm not understanding the difficulty.
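
        For anyone curious what step 2 buys you once the containers are running, here's a minimal sketch of talking to Ollama over its REST API. It assumes Ollama's default port 11434 and a model you've already pulled; "llama3" below is just a placeholder name.

        import json
        import urllib.request

        # Ask the local Ollama container for a completion.
        # 11434 is Ollama's default port; swap "llama3" for whatever model you pulled.
        payload = json.dumps({
            "model": "llama3",
            "prompt": "Explain ROCm in one sentence.",
            "stream": False,  # return a single JSON object instead of a stream
        }).encode("utf-8")

        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )

        with urllib.request.urlopen(req) as resp:
            print(json.loads(resp.read())["response"])

        Open-WebUI talks to that same Ollama endpoint, so if this responds, the web UI should just be a matter of pointing it at the container.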



        • #5
          Originally posted by Schaller's blog post
          Container Toolbx enables developers quick and easy access to their favorite development platforms
          Nah, but Distrobox does.

          Originally posted by Schaller's blog post
          Toolbx, our incredible developer focused containers tool, is going from strength to strength these days with the rewrite from the old shell scripts to Go starting to pay dividends. The next major feature that we are closing in on is full NVIDIA driver support with Toolbx.
          Congratulations, although it's very strange that Distrobox, being written in POSIX shell, already has ample documentation on how to integrate with NVIDIA. Well, maybe in another two years Toolbx can also catch up to having real users on real commercial hardware. Maybe even without a mafia campaign on Universal Blue distributions wanting to adopt Distrobox!! How is that going for you?



          • #6
            As long as they don't inextricably tie the AI stuff into the core operating system, and just make whatever form someone wants to use it an optional thing, then I'm not as concerned. Windows and macOS, on the other hand, I am worried about. I've been dabbling with Linux for a long time, it's more and more my default choice, and I'm finally starting to just do it; year of the Linux desktop and all of that.



            • #7
              Originally posted by ehansin View Post
              As long as they don't inextricably tie the AI stuff into the core operating system, and just make whatever form someone wants to use it an optional thing, then I'm not as concerned. Windows and macOS, on the other hand, I am worried about. I've been dabbling with Linux for a long time, it's more and more my default choice, and I'm finally starting to just do it; year of the Linux desktop and all of that.

              Understand that ever since we got integrated cache memory (let alone four levels of it), plus multi-core CPUs and GPUs, plus heterogeneous memory schemes, there has been a type of “AI”, in the form of things like branch prediction and scheduling, being facilitated in the kernel of every OS, including Linux. Compared to generative AI it's primitive, of course. But as the heterogeneity of systems grows, along with the complexity that brings at the component level and across racks and pods at scale, so too will the need for more advanced forms of AI to manage where data flows and how long it stays. In fact, AI is being deployed right now to develop the most effective layout of chip internals for any given task.

              I'm afraid Linux will not be immune to the inevitable march of AI being integrated into core kernel functions over time, particularly since Linux is at the heart not only of HPC and nearly 100% of the world's supercomputers, many of which partly or exclusively do AI, but of the Internet itself. It's as if we are seeing the birth of a silicon hive mind before our eyes, modeled, at least thematically, after the human mind with all its heterogeneous parts and complex neuronal networking.

              Or it could be the Matrix and if so then the movies are actually documentaries from the future and are therefore prophecies.
              Last edited by Jumbotron; 16 June 2024, 09:33 AM.



              • #8
                Originally posted by chocolate View Post
                Nah, but Distrobox does.

                Congratulations, although it's very strange that Distrobox, being written in POSIX shell, already has ample documentation on how to integrate with NVIDIA. Well, maybe in another two years Toolbx can also catch up to having real users on real commercial hardware. Maybe even without a mafia campaign on Universal Blue distributions wanting to adopt Distrobox!! How is that going for you?
                Yeah, there are many reasons Distrobox > Toolbox. Just adding "--nvidia" to the create command vs setting up the RPM Fusion repo inside the container and installing the drivers again inside that container was certainly one of them. It's telling that basically every other container focused project has chosen Distrobox over Toolbox, including big names like the immutable SUSE offshoots. Distrobox is still one of my favorite projects from the last few years.



                • #9
                  Originally posted by macemoneta View Post
                  I run Fedora 40 and use AI. It's not hard to do on Fedora:

                  1. Install Docker
                  2. Install Ollama, Open-Webui, and Stable-Diffusion containers
                  3. Profit.

                  I'm not understanding the difficulty.
                  Same here on CachyOS. All the dependencies for AMD, ROCm and whatnot are in the repos. The only problem is that, well, I have an AMD GPU, and not a lot of the things that work for NVIDIA work for AMD or Intel. It is what it is, and I'm not trying to complain about that today; I'm just glad that the fundamental bits and pieces are becoming more and more readily available and that the things that do work seem to work pretty well. Hopefully the tooling becoming more readily available will translate into more and more things supporting AMD and Intel.
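
                  For what it's worth, the ROCm build of PyTorch reuses the torch.cuda device API (backed by HIP), so a quick sanity check on an AMD card can look roughly like this, assuming you installed the ROCm wheel of torch:

                  import torch

                  # On ROCm builds of PyTorch the "cuda" API is backed by HIP,
                  # so these calls also cover AMD GPUs.
                  print("HIP runtime:", torch.version.hip)        # None on CUDA/CPU-only builds
                  print("GPU available:", torch.cuda.is_available())

                  if torch.cuda.is_available():
                      print("Device:", torch.cuda.get_device_name(0))
                      # Tiny smoke test: run a matmul on the GPU.
                      x = torch.randn(1024, 1024, device="cuda")
                      print("Matmul checksum:", (x @ x).sum().item())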



                  • #10
                    Originally posted by macemoneta View Post
                    I run Fedora 40 and use AI. It's not hard to do on Fedora:

                    1. Install Docker
                    2. Install Ollama, Open-Webui, and Stable-Diffusion containers
                    3. Profit.

                    I'm not understanding the difficulty.
                    So... doing AI on Fedora is easy by using something other than Fedora (except the kernel)? Easy as that...
