There's A New Libre GPU Effort Building On RISC-V, Rust, LLVM & Vulkan


  • #21
    Originally posted by Weasel View Post
    Shh, now oiaohm will come along to explain how Western Digital is investing massively in RISC-V for use-cases clearly other than a stupid microcontroller.
    Hard disk controllers *are* "stupid microcontrollers"... At best they do a little signal processing (probably mostly in hardware), error logging, and cache management.

    Comment


    • #22
      Originally posted by microcode View Post
      This is a similar setup to something that Esperanto Technologies said they had prototyped. There's really nothing about RISC-V that prevents you from using it as a GPU base ISA.

      What I think would be interesting is an architecture that is not completely like most GPUs: basically a set of full system cores with some graphics functionality, in addition to non-graphics cores. If your process uses the graphics functionality, it traps and is scheduled on a graphics-optimized core. These cores would have special features like MMU views for interpolation, texture swizzles and decoding, plus graphics-appropriate datatypes available to vector runs. You'd still want fixed functions for rasterization, fragment blending, etc., but maybe a middle ground has some value.
      Sounds like Intel Larrabee, the general-purpose (GPGPU) graphics card.
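
      As an aside on the "texture swizzles" part of that quote, here is a minimal Rust sketch of the address transform that usually means, assuming a Morton (Z-order) texture layout like many GPUs use. Nothing here is from the project in the article; it only illustrates the swizzle such a core (or a special MMU view) would apply.

      // Spread the low 16 bits of x so there is a zero bit between each bit.
      fn part1by1(mut x: u32) -> u32 {
          x &= 0x0000_ffff;
          x = (x | (x << 8)) & 0x00ff_00ff;
          x = (x | (x << 4)) & 0x0f0f_0f0f;
          x = (x | (x << 2)) & 0x3333_3333;
          x = (x | (x << 1)) & 0x5555_5555;
          x
      }

      // Linear (x, y) texel coordinates -> Morton-swizzled index.
      fn morton_index(x: u32, y: u32) -> u32 {
          part1by1(x) | (part1by1(y) << 1)
      }

      fn main() {
          // A 2x2 texel block maps to 4 consecutive indices, which is the
          // cache-locality win swizzled layouts are after.
          assert_eq!(morton_index(0, 0), 0);
          assert_eq!(morton_index(1, 0), 1);
          assert_eq!(morton_index(0, 1), 2);
          assert_eq!(morton_index(1, 1), 3);
      }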


      Comment


      • #23
        Originally posted by cb88 View Post
        Hard disk controllers *are* "stupid microcontrollers"... At best they do a little signal processing (probably mostly in hardware), error logging, and cache management.
        If you had done any research, or Googled it, you'd know that they're not just doing hard disk controllers with RISC-V, but I won't spoil the surprise for you.

        Comment


        • #24
          Originally posted by c117152 View Post
          Quite the understatement, considering one of the first, if not the first, RISC-V implementations was NVIDIA's NV-RISCV, as used in their GPUs from ~2016 onward: https://riscv.org/wp-content/uploads...V_Story_V2.pdf https://riscv.org/wp-content/uploads...ijstermans.pdf
          NVIDIA is using it in an embedded controller, not the shader cores.

          Comment


          • #25
            Originally posted by microcode View Post

            If you had done any research, or Googled it, you'd know that they're not just doing hard disk controllers with RISC-V, but I won't spoil the surprise for you.
            I did... what they are doing still falls into the class of microcontroller... They could already have been doing stuff like this with their current controllers.

            Comment


            • #26
              Originally posted by Michael View Post

              Their Falcon/RISC-V control processor, AFAIK, is just being used to bring up the GPU itself and for some system-management-type purposes, but it isn't actually doing any of the graphics/compute.

              https://www.phoronix.com/scan.php?pa...ext-Gen-Falcon
              I understand that's what folks around here are saying. And I did promise myself not to comment on something I just don't know much about... But those presentations I've linked raise performance issues and have block-hierarchy schematics with graphics functions on the cores... I REALLY don't see why a dumb microcontroller that just kick-starts the board and possibly regulates some power states would need 64-bit addressing, encryption and so on...

              I suppose I'm just reading something wrong.

              Originally posted by microcode View Post
              NVIDIA is using it in an embedded controller, not the shader cores.
              I'm pretty sure shader cores are CUDA cores nowadays. Anyhow, to recap: my thinking was that they're RISC-V, since the slides from NVIDIA's presentation specified performance issues they felt NV-RISCV would solve, including 64-bit memory and such. I also believed they have a RISC-V microcontroller. Michael & co. said it's just the microcontroller that's RISC-V. I admitted I'm not sure, and I guess it's possible (even likely) their cores are something else, but then the microcontroller must be doing a whole lot more than we give it credit for; otherwise I can't, for the life of me, understand why they'd raise performance concerns for a power-state governor.

              Anyhow, I won't continue with this discussion since I'm honestly not sure about any of it, and there are so many people saying otherwise that I'm probably wrong. Still, unless someone can explain those odd notes from the presentation or link more modern docs saying otherwise... Regardless, over & out.

              Comment


              • #27
                Originally posted by c117152 View Post

                I understand that's what folks around here are saying. And I did promise myself not to comment on something I just don't know much about... But those presentations I've linked raise performance issues and have block-hierarchy schematics with graphics functions on the cores... I REALLY don't see why a dumb microcontroller that just kick-starts the board and possibly regulates some power states would need 64-bit addressing, encryption and so on...

                I suppose I'm just reading something wrong.



                I'm pretty sure shader cores are CUDA cores nowadays. Anyhow, to recap: my thinking was that they're RISC-V, since the slides from NVIDIA's presentation specified performance issues they felt NV-RISCV would solve, including 64-bit memory and such. I also believed they have a RISC-V microcontroller. Michael & co. said it's just the microcontroller that's RISC-V. I admitted I'm not sure, and I guess it's possible (even likely) their cores are something else, but then the microcontroller must be doing a whole lot more than we give it credit for; otherwise I can't, for the life of me, understand why they'd raise performance concerns for a power-state governor.

                Anyhow, I won't continue with this discussion since I'm honestly not sure about any of it, and there are so many people saying otherwise that I'm probably wrong. Still, unless someone can explain those odd notes from the presentation or link more modern docs saying otherwise... Regardless, over & out.
                No, I think you're just vastly underestimating the duties of the embedded controller on a GPU, and missing the point of 64-bit addressing: the embedded controller uses the same memory as the rest of the GPU, and many desktop GPUs have well over 4GiB of memory these days, so the addresses are 64-bit (or at least, they're more than 32-bit).
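
                To put a hedged number on that (back-of-the-envelope arithmetic, not anything from NVIDIA's docs; the 8 GiB figure is just an example), a 32-bit pointer tops out at 4 GiB, so a controller sharing the card's memory space simply needs wider addresses:

                fn main() {
                    // Reach of a 32-bit address: 2^32 bytes = 4 GiB.
                    let max_32bit: u64 = (u32::MAX as u64) + 1;
                    // Hypothetical 8 GiB card; many desktop GPUs have more.
                    let vram_bytes: u64 = 8 * 1024 * 1024 * 1024;
                    println!("32-bit reach: {} GiB", max_32bit >> 30);
                    println!("VRAM:         {} GiB", vram_bytes >> 30);
                    // Hence the controller needs more than 32 address bits.
                    assert!(vram_bytes > max_32bit);
                }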

                The embedded controllers (there are several) on a desktop-class GPU are responsible for a LOT of things. The cryptographic functionality is at least involved in validating the firmware (making sure it's an authentic, signed NVIDIA firmware blob), if not also setting up DRM decoding (IIRC this is a functionality of NVIDIA's GPUs).
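
                For flavour, here is a minimal Rust sketch of that "validate the blob before jumping into it" pattern. The digest below is a toy FNV-1a stand-in for a real signature over a cryptographic hash; none of this is NVIDIA's actual secure-boot scheme, it only shows the shape of the check a boot controller performs.

                struct FirmwareImage<'a> {
                    code: &'a [u8],
                    expected_digest: u64, // in reality: a vendor signature over a hash
                }

                // Toy FNV-1a digest, standing in for a cryptographic hash.
                fn toy_digest(data: &[u8]) -> u64 {
                    data.iter().fold(0xcbf2_9ce4_8422_2325u64, |h, &b| {
                        (h ^ b as u64).wrapping_mul(0x0000_0100_0000_01b3)
                    })
                }

                fn validate_and_boot(img: &FirmwareImage) -> Result<(), &'static str> {
                    if toy_digest(img.code) != img.expected_digest {
                        return Err("firmware rejected: digest mismatch");
                    }
                    // A real controller would now lock down key material and jump to the image.
                    Ok(())
                }

                fn main() {
                    let blob = b"signed firmware payload";
                    let good = FirmwareImage { code: blob, expected_digest: toy_digest(blob) };
                    assert!(validate_and_boot(&good).is_ok());

                    let tampered = FirmwareImage { code: b"patched payload", expected_digest: good.expected_digest };
                    assert!(validate_and_boot(&tampered).is_err());
                }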

                There are microcontrollers with very complex tasks all around you. My camera's lens has several megabytes of 64-bit MIPS code to drive the autofocus motors and validate/receive firmware updates (among other things); it includes what seems to be a complete copy of zlib and a robust collection of diagnostic commands, complete with help pages (the camera communicates with the lens through something similar to UART/RS-232).

                P.S. "CUDA core" doesn't have any computer-science or general meaning; it's just NVIDIA's marketing term for a shader core (sometimes consisting of multiple contexts/threads).

                Comment


                • #28
                  By that definition, llvmpipe is a libre GPU.

                  Comment


                  • #29
                    I bet that if it could use OpenCL or the like, you could have a very fast coprocessor for HPC applications.

                    Comment


                    • #30
                      Originally posted by cb88 View Post
                      Hard disk controllers *are* "stupid microcontrollers"... At best they do a little signal processing (probably mostly in hardware), error logging, and cache management.
                      Western Digital is not after RISC-V chips only for hard drives.



                      Weasel is a pure fool. Western Digital can see the writing on the wall and has said so in press releases around RISC-V from the end of 2017 to now. When you start talking about systems with petabytes of RAM/solid-state storage, spinning-disc usage could become limited.

                      The Esperanto development Western Digital has directly invested in has no practical use in a hard drive/SSD. So this should have you asking where Western Digital is going.

                      Comment
