There's A New Libre GPU Effort Building On RISC-V, Rust, LLVM & Vulkan


  • #11
    I thought about this a while ago. I don't think we'll see any "widely used" open-source GPU in the near future. There is simply not the manpower to create and maintain such a thing. Most projects start somewhere but fail along the way.

    I think there are just two options:
    1. Like this project, but not as a separate chip: just provide a display output (e.g. DisplayPort) and run everything else on the CPU via LLVMpipe and vulkan-cpu. Granted, the CPU needs many cores that way (see the sketch below for what that path looks like from the application side).
    or
    2. Ask AMD or Intel to open-source one of their older GPUs.
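
    For a rough idea of option 1 from the application side, here is a minimal C sketch, assuming a Vulkan loader plus some software ICD are installed: it simply asks the loader for a CPU-type physical device, which is how an llvmpipe/vulkan-cpu style implementation reports itself. Error handling is mostly omitted.

    Code:
    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void) {
        /* A bare instance is enough to enumerate devices. */
        VkInstanceCreateInfo ci = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
        VkInstance instance;
        if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS)
            return 1;

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, NULL);
        VkPhysicalDevice devs[16];
        if (count > 16) count = 16;
        vkEnumeratePhysicalDevices(instance, &count, devs);

        for (uint32_t i = 0; i < count; i++) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(devs[i], &props);
            /* Software implementations report themselves as CPU devices. */
            if (props.deviceType == VK_PHYSICAL_DEVICE_TYPE_CPU)
                printf("software Vulkan device: %s\n", props.deviceName);
        }
        vkDestroyInstance(instance, NULL);
        return 0;
    }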

    • #12
      This inspires me to pursue a career as a chip designer. I would like to work on utilizing a transport-triggered architecture (TTA) ISA for both general-purpose CPUs and more specialized systems. Current efforts are far too focused on RISC.
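
      Since TTAs don't come up often: the idea is that the program encodes data moves between function-unit ports instead of opcodes, and writing a unit's trigger port is what fires the operation. Here is a toy C model of that; the unit and port names are invented purely for illustration.

      Code:
      #include <stdio.h>

      /* Toy function unit: an adder with an operand port and a result port. */
      typedef struct { int operand; int result; } AddFU;

      /* Writing the trigger port is what starts the computation. */
      static void add_trigger(AddFU *fu, int value) {
          fu->result = fu->operand + value;
      }

      int main(void) {
          AddFU add = {0, 0};
          /* A TTA "program" is a sequence of moves, not opcodes: */
          add.operand = 2;             /* MOVE 2          -> add.operand */
          add_trigger(&add, 40);       /* MOVE 40         -> add.trigger (fires the add) */
          printf("%d\n", add.result);  /* MOVE add.result -> output      */
          return 0;
      }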

      • #13
        Originally posted by c117152
        True. But that's most of the silicon logic anyhow considering the rest of the cores just do raw compute... no? I mean, there's a reason they insist on keeping the microcode closed and don't care about the rest being out there. I think?
        RISC-V CPUs in Nvidia GPUs are mostly used for specialized tasks, as far as I know, e.g. video decoding/encoding coprocessors. I'm not sure, but they might also use them for command processors now. The closed-down nature of the firmware/microcode might be due to DRM, among other things.

        Anyway, it's completely wrong to think of a GPU as just a bunch of compute cores. That's usually what the marketing focuses on and what matters most for GPGPU applications, but there's much more to it. When it comes to graphics, think of a GPU more as a fixed-function device with some stages in the processing pipeline that are programmable. Indeed, that is exactly what it looks like from the graphics API's point of view, which closely reflects the hardware. Without those crucial fixed-function parts, throughput easily drops by an order of magnitude. That's why Larrabee failed, even though it did include some minimal hardware features for efficient rendering.

        That's also why Nvidia is performing so much better in many games right now: they have much more capable fixed-function blocks, especially on the front end (i.e. geometry processing and rasterization).
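
        To make the fixed-function point concrete, here is a rough C sketch of how Vulkan's graphics pipeline creation mirrors that hardware split: you supply SPIR-V modules for the programmable stages, while stages like input assembly and rasterization are fixed-function blocks you can only configure. Most fields and all error handling are omitted; this is just the shape of the API.

        Code:
        #include <vulkan/vulkan.h>

        VkGraphicsPipelineCreateInfo sketch_pipeline_info(
                VkShaderModule vert, VkShaderModule frag,
                VkPipelineShaderStageCreateInfo stages[2],
                VkPipelineInputAssemblyStateCreateInfo *ia,
                VkPipelineRasterizationStateCreateInfo *rs)
        {
            /* Programmable stages: the parts you write shaders for. */
            stages[0] = (VkPipelineShaderStageCreateInfo){
                .sType = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO,
                .stage = VK_SHADER_STAGE_VERTEX_BIT, .module = vert, .pName = "main",
            };
            stages[1] = (VkPipelineShaderStageCreateInfo){
                .sType = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO,
                .stage = VK_SHADER_STAGE_FRAGMENT_BIT, .module = frag, .pName = "main",
            };
            /* Fixed-function stages: configured, never programmed. */
            *ia = (VkPipelineInputAssemblyStateCreateInfo){
                .sType = VK_STRUCTURE_TYPE_PIPELINE_INPUT_ASSEMBLY_STATE_CREATE_INFO,
                .topology = VK_PRIMITIVE_TOPOLOGY_TRIANGLE_LIST,
            };
            *rs = (VkPipelineRasterizationStateCreateInfo){
                .sType = VK_STRUCTURE_TYPE_PIPELINE_RASTERIZATION_STATE_CREATE_INFO,
                .polygonMode = VK_POLYGON_MODE_FILL,
                .cullMode = VK_CULL_MODE_BACK_BIT,
                .lineWidth = 1.0f,
            };
            return (VkGraphicsPipelineCreateInfo){
                .sType = VK_STRUCTURE_TYPE_GRAPHICS_PIPELINE_CREATE_INFO,
                .stageCount = 2, .pStages = stages,
                .pInputAssemblyState = ia, .pRasterizationState = rs,
                /* ...vertex input, viewport, multisample, blend state etc. omitted... */
            };
        }
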
        Last edited by brent; 28 September 2018, 05:44 AM.

        • #14
          Huh? A CPU-based GPU... Larrabee comes to mind, and that was an utter failure.

          • #15
            Originally posted by brent

            RISC-V CPUs in Nvidia GPUs are mostly used for specialized tasks, as far as I know, e.g. video decoding/encoding coprocessors. I'm not sure, but they might also use them for command processors now. The closed-down nature of the firmware/microcode might be due to DRM, among other things.

            Anyway, it's completely wrong to think of a GPU as just a bunch of compute cores. That's usually what the marketing focuses on and what matters most for GPGPU applications, but there's much more to it. When it comes to graphics, think of a GPU more as a fixed-function device with some stages in the processing pipeline that are programmable. Indeed, that is exactly what it looks like from the graphics API's point of view, which closely reflects the hardware. Without those crucial fixed-function parts, throughput easily drops by an order of magnitude. That's why Larrabee failed, even though it did include some minimal hardware features for efficient rendering.

            That's also why Nvidia is performing so much better in many games right now: they have much more capable fixed-function blocks, especially on the front end (i.e. geometry processing and rasterization).
            The way I understood it was that the graphics cores nowadays are just optimized RISC-V compute cores, with all these separate pieces glued in and spread around (possibly literally in the silicon, and as extensions in the ISA) for video decode and whatnot, going through the controller. So my impression was that, on the silicon, it was all raw RISC-V processing for the most part unless something special came up, right up until the recent generation where that AI denoiser for ray tracing was added. E.g., that second doc I linked says the inverse quantization transform (an H.264/HEVC feature) is done through the controller... So my impression was that it's mostly just regular silicon, with a few exceptions thrown over to the controller like in this case.

            Now you're telling me it's more CISC-y, with all these specialized logic blocks all over... I don't know. Last time I looked through Nvidia's patent portfolio I didn't see too many unique designs, and those are copyrightable... Meh. I guess I'm just not up to date enough to have an opinion on the subject.

            • #16
              Originally posted by c117152

              True. But that's most of the silicon logic anyhow considering the rest of the cores just do raw compute... no? I mean, there's a reason they insist on keeping the microcode closed and don't care about the rest being out there. I think?
              Their Falcon/RISC-V control processor, AFAIK, is just used to bring up the GPU itself and for some system-management-type purposes, not to actually do any of the graphics/compute work.

              https://www.phoronix.com/scan.php?pa...ext-Gen-Falcon
              Michael Larabel
              https://www.michaellarabel.com/

              • #17
                Damn, both RISC-V and Rust in the same title. Dead on announcement.

                • #18
                  Originally posted by Michael

                  Their Falcon/RISC-V control processor, AFAIK, is just used to bring up the GPU itself and for some system-management-type purposes, not to actually do any of the graphics/compute work.

                  https://www.phoronix.com/scan.php?pa...ext-Gen-Falcon
                  Probably the drivers don't even interact with it at all, except perhaps for power-management-type stuff or negotiating trust of certain things. It's basically doing the same job as the embedded controller in modern CPUs (ARM TrustZone on AMD; ARC, SPARC, or others on Intel). It's literally just an isolated little microcontroller in there that they can program to do menial tasks.
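
                  To make "menial tasks" concrete, here is a hypothetical C sketch of the kind of mailbox traffic a driver might have with such an embedded controller. The register offsets and command codes are invented for illustration; this is not NVIDIA's actual Falcon interface.

                  Code:
                  #include <stdint.h>

                  /* Invented mailbox layout -- purely illustrative, not Falcon's. */
                  #define MBOX_CMD            0x0000u  /* command register */
                  #define MBOX_DATA           0x0004u  /* payload register */
                  #define MBOX_STAT           0x0008u  /* status register  */
                  #define CMD_SET_POWER_LEVEL 0x01u
                  #define STAT_BUSY           0x01u

                  static volatile uint32_t *mmio;  /* controller registers, assumed already mapped */

                  static void mbox_write(uint32_t off, uint32_t val) { mmio[off / 4] = val; }
                  static uint32_t mbox_read(uint32_t off) { return mmio[off / 4]; }

                  /* Ask the controller to switch power levels, then poll until it finishes. */
                  void set_power_level(uint32_t level)
                  {
                      mbox_write(MBOX_DATA, level);
                      mbox_write(MBOX_CMD, CMD_SET_POWER_LEVEL);
                      while (mbox_read(MBOX_STAT) & STAT_BUSY)
                          ;  /* a real driver would sleep and time out here */
                  }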

                  • #19
                    Originally posted by cb88

                    Probably the drivers don't even interact with it at all, except perhaps for power-management-type stuff or negotiating trust of certain things. It's basically doing the same job as the embedded controller in modern CPUs (ARM TrustZone on AMD; ARC, SPARC, or others on Intel). It's literally just an isolated little microcontroller in there that they can program to do menial tasks.
                    Yep, that's my expectation of it as well; far from NVIDIA's use of RISC-V being anything that actually does graphics/compute work.
                    Michael Larabel
                    https://www.michaellarabel.com/

                    • #20
                      Originally posted by cb88
                      Probably the drivers don't even interact with it at all, except perhaps for power-management-type stuff or negotiating trust of certain things. It's basically doing the same job as the embedded controller in modern CPUs (ARM TrustZone on AMD; ARC, SPARC, or others on Intel). It's literally just an isolated little microcontroller in there that they can program to do menial tasks.
                      Shh, now oiaohm will come along to tell us how Western Digital is investing massively in RISC-V, clearly for use cases other than a stupid microcontroller.
