There's A New Libre GPU Effort Building On RISC-V, Rust, LLVM & Vulkan
Originally posted by microcode: This is a similar setup to something that Esperanto Technologies said they had prototyped. There's really nothing about RISC-V that prevents you from using it as a GPU base ISA.
What I think would be interesting is an architecture that is not completely like most GPUs: basically a set of full system cores with some graphics functionality, alongside non-graphics cores. If your process uses the graphics functionality, it traps and is scheduled on a graphics-optimized core. These cores would have special features such as MMU views that handle interpolation, texture swizzling, and decoding, plus graphics-appropriate datatypes available to vector units. You'd still want fixed-function hardware for rasterization, fragment blending, etc., but maybe a middle ground has some value.
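As a rough illustration of the kind of address swizzle such an MMU view might apply, here's a 2D Morton (Z-order) interleave sketched in Python. GPUs commonly use layouts like this so that nearby texels land in the same cache line; the function names and the 16-bit coordinate limit here are my own assumptions, not any vendor's actual layout:

```python
def part1by1(n: int) -> int:
    """Spread the low 16 bits of n so a zero bit sits between each pair."""
    n &= 0xFFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton_swizzle(x: int, y: int) -> int:
    """Interleave the bits of x and y: the swizzled linear address of texel (x, y)."""
    return part1by1(x) | (part1by1(y) << 1)

# A 2x2 texel block maps to 4 consecutive addresses, which is the point:
# (0,0) -> 0, (1,0) -> 1, (0,1) -> 2, (1,1) -> 3
```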
Originally posted by cb88: Hard disk controllers *are* "stupid microcontrollers" .... They at best do a little signal processing (probably mostly in hardware), logging of errors, and cache management.
Originally posted by c117152: Quite the understatement, considering one of the first, if not the first, RISC-V implementations was NVIDIA's NV-RISCV, used in their GPUs from around 2016: https://riscv.org/wp-content/uploads...V_Story_V2.pdf https://riscv.org/wp-content/uploads...ijstermans.pdf
Originally posted by microcode: If you had done any research, or Googled it, you'd know that they're not just doing hard disk controllers with RISC-V, but I won't spoil the surprise for you.
Originally posted by Michael: Their Falcon/RISC-V control processor, AFAIK, is just being used to bring up the GPU itself and for some system-management purposes, but it is not actually doing any of the graphics/compute.
https://www.phoronix.com/scan.php?pa...ext-Gen-Falcon
I suppose I'm just reading something wrong.
Originally posted by microcode: NVIDIA is using it in an embedded controller, not the shader cores.
Anyhow, I won't continue with this discussion since I'm honestly not sure about any of it, and there are so many people saying otherwise that I'm probably wrong. Still, unless someone can explain those odd notes from the presentation or link more modern docs saying otherwise... Regardless, over & out.
Originally posted by c117152: I understand that's what folks around here are saying. And I did promise myself not to comment on something I just don't know much about... But those presentations I've linked raise performance issues and have block-hierarchy schematics with graphics functions on the cores... I REALLY don't see why a dumb microcontroller that just kick-starts the board and possibly regulates some power states would need 64-bit, encryption, and so on...
I suppose I'm just reading something wrong.
I'm pretty sure shader cores are CUDA cores nowadays. Anyhow, to recap, my thinking was that they're RISC-V since the slides from NVIDIA's presentation specified performance issues they felt NV-RISCV would solve, including 64-bit memory and such. I also believed they have a RISC-V microcontroller. Michael & co. said it's just the microcontroller that's RISC-V. I admitted I'm not sure, but I guess it's possible (even likely) their cores are something else; the microcontroller must then be doing a whole lot more than we give it credit for, because otherwise I can't, for the life of me, understand why they'd raise performance concerns for a power-state governor.
Anyhow, I won't continue with this discussion since I'm honestly not sure about any of it, and there are so many people saying otherwise that I'm probably wrong. Still, unless someone can explain those odd notes from the presentation or link more modern docs saying otherwise... Regardless, over & out.
The embedded controllers (there are several) on a desktop-class GPU are responsible for a LOT of things. The cryptographic functionality is at least involved in validating the firmware (making sure it's an authentic, signed NVIDIA firmware blob), if not also setting up DRM decoding (IIRC this is a functionality of NVIDIA's GPUs).
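To make the verify-before-run flow concrete, here's a toy sketch of firmware-blob authentication of the kind such a controller performs at bring-up. Real GPUs use public-key signatures (e.g. RSA/ECDSA) checked against a root of trust burned into fuses; this stand-in uses an HMAC with a made-up shared key purely to illustrate the flow, so every name and key here is an assumption:

```python
import hmac
import hashlib

# Assumption: stands in for a fused public key / root of trust.
TRUSTED_KEY = b"example-root-of-trust-key"

def sign_blob(blob: bytes, key: bytes = TRUSTED_KEY) -> bytes:
    """Produce a MAC over the firmware image (stand-in for a real signature)."""
    return hmac.new(key, blob, hashlib.sha256).digest()

def verify_and_load(blob: bytes, signature: bytes) -> bool:
    """Refuse to 'execute' any firmware whose MAC doesn't match."""
    return hmac.compare_digest(sign_blob(blob), signature)

fw = b"\x7fFIRMWARE..."
sig = sign_blob(fw)
assert verify_and_load(fw, sig)                      # authentic blob accepted
assert not verify_and_load(fw + b"tampered", sig)    # modified blob rejected
```

The `compare_digest` call matters even in the toy version: it compares in constant time, so an attacker can't learn the expected MAC byte-by-byte from timing.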
There are microcontrollers with very complex tasks all around you. My camera's lens has several megabytes of 64-bit MIPS code to drive the autofocus motors and receive/validate firmware updates (among other things). It includes what seems to be a complete copy of zlib and a robust collection of diagnostic commands, complete with help pages (the camera communicates with the lens over something similar to UART/RS-232).
P.S. "CUDA core" doesn't have any computer science or general meaning, it's just NVIDIA's marketing term for a shader core (sometimes consisting of multiple contexts/threads).
Originally posted by cb88: Hard disk controllers *are* "stupid microcontrollers" .... They at best do a little signal processing (probably mostly in hardware), logging of errors, and cache management.
Weasel is a pure fool. Western Digital can see the writing on the wall and has said so in press releases around RISC-V from the end of 2017 to now. When you start talking about systems with petabytes of RAM/solid-state storage, spinning-disc usage could become limited.
The Esperanto development Western Digital has directly invested in has no practical use in a hard drive/SSD. So this should have you asking where Western Digital is going.