Raspberry Pi GPU Driver Turns Out To Be Crap


  • KeyJ
    replied
    Originally posted by dcw803 View Post
    But what if a company (e.g. Broadcom) designs a GPU where the GPU does nearly all the work (including GL)? Doesn't this necessarily imply that the ARM-side code is a thin shim?
    That's right. However, one doesn't simply build an OpenGL ES 2.0 GPU in hardware. For example, a large and important part of an OpenGL (ES) 2.0+ driver is the shader compiler that turns C-like high-level shader code ("GLSL" code) into GPU machine code. In theory, it might be possible to do that in hardware, but in practice, that wouldn't make any sense whatsoever. Nobody ever built a compiler for a C-like high-level language into silicon, and nobody ever will. There are simply no benefits to such an approach. Zero. None.

    So it's perfectly safe to assume that Broadcom's shader compiler is a piece of software. The problem here: the ARM-side userland doesn't contain the shader compiler. The shims simply pass the GLSL code over to the GPU side and have it compiled there. So there is definitely some non-trivial amount of meaningful driver code running on what Broadcom calls the "GPU" (which is actually more like a DSP). So, no, we don't have a full open source driver.
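    To make that concrete, here's roughly where the GLSL text crosses the API boundary in any GLES2 program (a minimal sketch in C; it assumes a current EGL context, and compile_fragment_shader() is just an illustrative helper name, not anything from Broadcom's code):

    /* Standard GLES2 calls that hand GLSL source to the driver. On the
       Pi, the open ARM-side code behind these entry points packs the
       source string into a message for the VideoCore; the compiler that
       consumes it runs in the closed firmware on the other side. */
    #include <stdio.h>
    #include <GLES2/gl2.h>

    static const char *frag_src =
        "precision mediump float;\n"
        "void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }\n";

    GLuint compile_fragment_shader(void)
    {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &frag_src, NULL); /* GLSL text handed over here */
        glCompileShader(shader);                    /* compiled... somewhere else */

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (!ok) {
            char log[512];
            glGetShaderInfoLog(shader, sizeof log, NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
        }
        return shader;
    }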

    Again, I'm not going to say that this is a bad design decision. You just shouldn't boast to have "fully functional, fully open source" drivers when the largest part of these drivers is a closed-source blob running on a DSP.



  • dcw803
    replied
    RPi GPU driver: design decisions!

    Originally posted by makomk View Post
    For instance Liz had this to say in reply to Luc Verhaegen's entirely accurate comment:


    She's redefined the entire OpenGL driver - including the shader compiler! - to be "microcode" and is pretending it's equivalent to the trivial closed-source microcode in competing hardware which actually has fully open source OpenGL drivers.
    But what if a company (e.g. Broadcom) designs a GPU where the GPU does nearly all the work (including GL)? Doesn't this necessarily imply that the ARM-side code is a thin shim?
    Eben Upton stated quite clearly that this was the case in one of his comments on raspberrypi.org, where he said:

    We happen to have a GPU which exposes a comparatively high level (GL-like) interface, such that many of our userland functions are message passing shims... These are design decisions on the part of the respective GPU teams, which have wide-ranging implications for the software and hardware structure of the devices which use the resulting cores. The VideoCore driver isn't structured this way to pull the wool over your eyes, it's structured this way because of a genuine judgment that this is the best structure given the resources we have on the chip, which includes a vector DSP to which we can offload much of the low-level register access.
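    To make "message passing shim" concrete, here is a rough sketch of the shape such a function takes. This is purely illustrative: the message ID, struct layout and mailbox_* transport functions are all made up for the example, not Broadcom's actual interface.

    /* Hypothetical message-passing shim -- NOT the real VideoCore
       protocol, just the general structure. The ARM side does no GL
       work itself; it serializes the call and lets the firmware on the
       other side do the real work. */
    #include <stdint.h>

    /* Assumed transport to the GPU-side firmware (invented here). */
    extern void mailbox_send(const void *msg, uint32_t len);
    extern void mailbox_recv(void *reply, uint32_t len);

    enum { MSG_GL_CLEAR_COLOR = 0x101 }; /* made-up message ID */

    struct gl_clear_color_msg {
        uint32_t id;
        float    r, g, b, a;
    };

    /* The "driver function" the application links against is a courier. */
    void shim_glClearColor(float r, float g, float b, float a)
    {
        struct gl_clear_color_msg m = { MSG_GL_CLEAR_COLOR, r, g, b, a };
        mailbox_send(&m, sizeof m);

        uint32_t status;
        mailbox_recv(&status, sizeof status); /* firmware did the actual work */
    }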
    I'd really like some of the critics to engage with Eben's point, ideally without name calling. In this situation, it's clearly easier to release the ARM-side GPU driver; fewer IP issues, for a start. This may explain why Broadcom were able to release this code at all, but that doesn't make it misleading to say that they've released the complete device driver!

    If you think it'll help, you can define the Broadcom GPU as (a) a bizarre design or (b) not a GPU at all, as some commenters seem to imply; I'm just not sure how that helps the discussion.
    The GPU exists, the Raspberry Pi team chose to use it, and now they've persuaded Broadcom to open up the admittedly thinner-than-you-might-expect GPU driver. Useful? Yes! Perfect? No. Misleading? No!

    - Just my opinion; your mileage may vary.



  • makomk
    replied
    Originally posted by brent View Post
    "Turns out to be crap"? It's more like it turned out to be exactly what the foundation said instead of being the unicorn people want to believe in. If you actually read and understand the announcement, it becomes quite clear what to expect from the source release. The vast majority of tech sites got it wrong.
    The Raspberry Pi Foundation themselves have been doing their best to confuse and mislead people into thinking that their source code release is more than it actually is, though. For instance Liz had this to say in reply to Luc Verhaegen's entirely accurate comment:

    No.

    There's some microcode in the Videocore - not something you should confuse with an ARM-side blob, which could actually prevent you from understanding or modifying anything that your computer does. That microcode is effectively firmware.
    She's redefined the entire OpenGL driver - including the shader compiler! - to be "microcode" and is pretending it's equivalent to the trivial closed-source microcode in competing hardware which actually has fully open source OpenGL drivers.



  • AJSB
    replied
    I'm completely sold on it!!!

    I watched some of those videos, and in one CoD:MW2 clip the player, in a scene with a fair amount of shooting, was getting dips to 15 fps at minimum, and usually 20-30 fps, sometimes a bit more...

    But he was playing at the native res of 1366x768! I intend to play at 800x600, which is less than half the pixels (1366x768 is about 1,049,000 pixels vs 480,000 for 800x600)!

    That should boost the FPS a lot, up to some very acceptable values.



  • GreatEmerald
    replied
    Yes, you are correct on both counts. And yes, the integrated HD 63xx GPUs use Evergreen chips, but at least those shouldn't be dropped in the near future, since AMD only did a support drop recently. And once that does happen, r600g should be good enough to handle most tasks.

    Yes, there are a few videos on YouTube showing those GPUs handling things like GTA. According to Game Debate, you should be able to play ET:QW on high settings using an HD 6320. On lowest settings, you should even be able to play CoD4:MW.



  • AJSB
    replied
    Originally posted by chithanh View Post
    The 6310/6320 are the marketing names for the GPUs in the E-350/E-450. They are basically the same as the 5450, minus Eyefinity support.

    Ah, OK, it's all marketing; glad to have that clarified...


    Well, I checked its performance, and it's clearly inferior to my current 9600GT 1GB DDR3 graphics card (correct me if I'm wrong).

    I also believe it's clearly superior to my previous "graphics card", the Intel graphics built into the motherboard chipset. I don't recall 100%, but before my current board I had an Asus P5B or something like that, with a P965 chipset, and AFAIK that is clearly inferior to an HD 5450 (again, correct me if I'm wrong). IIRC, I played W:ET and even ET:QW on it for some time...

    So, an HD 5450 might not be that bad for playing (simpler?) games. In fact, it even supports DX11, whereas my 9600GT only supports DX10 (not that it matters in the Linux world), as well as OpenGL 4.0, which my 9600GT doesn't (dunno if it supports OpenCL 1.1)...

    I confess that I was always an NVIDIA fanboy (though far from an Intel fanboy, even if I've always used their CPUs as far as PCs go), but now...

    I'm impressed...



  • chithanh
    replied
    The 6310/6320 are the marketing names for the GPUs in the E-350/E-450. They are basically the same as the 5450, minus Eyefinity support.



  • AJSB
    replied
    Originally posted by curaga View Post
    Much better Linux experience than with the recent Atoms.
    Do make sure to get the dual-core versions; the E-240 is a single core.



    It's an HD 5450, embedded into the CPU. Look for that card in reviews to see how it performs. It has XvBA video acceleration using the blob.



    Very well.
    E-350 and E-450 are the ones to go for, then...

    But I just checked the ASUS site, and they claim that on their mini-ITX boards the E-350 has an HD 6310 and the E-450 an HD 6320.



  • curaga
    replied
    Originally posted by AJSB View Post
    So, what can I expect from AMD APUs like the E-240, E-350 and E-450?
    Much better Linux experience than with the recent Atoms.
    Do make sure to get the dual-core versions; the E-240 is a single core.

    Good enough for 1080p video playback w/o "hiccups"?!?

    Good enough for simple games like Wolfenstein: Enemy Territory, Xonotic, Alien Arena, etc., even if I have to lower the resolution to 800x600 and cut the effects all the way down?

    ...or do I need something with more "punch"?!?


    I don't see OpenGL specs for these APUs... only specs talking about DX11...
    It's an HD 5450, embedded into the CPU. Look for that card in reviews to see how it performs. It has XvBA video acceleration using the blob (see the quick probe at the end of this post).


    How well are these APUs supported by the Linux kernel and the FOSS video drivers, especially for 3D graphics?
    Very well.
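    If you want to verify the acceleration yourself: the blob's XvBA path is exposed to applications through the xvba-video VA-API wrapper, so a small libva probe like this will list what the driver advertises (a rough sketch with error handling trimmed; build with gcc probe.c -lva -lva-x11 -lX11):

    /* Probe VA-API for supported decode profiles. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <va/va.h>
    #include <va/va_x11.h>

    int main(void)
    {
        Display *x = XOpenDisplay(NULL);
        if (!x) { fprintf(stderr, "no X display\n"); return 1; }

        VADisplay va = vaGetDisplay(x);
        int major, minor;
        if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS) {
            fprintf(stderr, "vaInitialize failed\n");
            return 1;
        }
        printf("VA-API %d.%d, vendor: %s\n", major, minor, vaQueryVendorString(va));

        int n = vaMaxNumProfiles(va);
        VAProfile *profiles = malloc(n * sizeof *profiles);
        if (vaQueryConfigProfiles(va, profiles, &n) == VA_STATUS_SUCCESS) {
            printf("%d decode profiles supported\n", n);
            for (int i = 0; i < n; i++)
                printf("  profile enum value %d\n", (int)profiles[i]);
        }

        free(profiles);
        vaTerminate(va);
        XCloseDisplay(x);
        return 0;
    }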



  • AJSB
    replied
    Thanks for the info! And indeed, I checked the net, and there are a bunch of mini-ITX boards with AMD APUs at very interesting prices and with very interesting I/O specs!


    So, what can I expect from AMD APUs like the E-240, E-350 and E-450?

    Good enough for 1080p video playback w/o "hiccups"?!?

    Good enough for simple games like Wolfenstein: Enemy Territory, Xonotic, Alien Arena, etc., even if I have to lower the resolution to 800x600 and cut the effects all the way down?

    ...or do I need something with more "punch"?!?


    I don't see OpenGL specs for these APUs... only specs talking about DX11...


    How well are these APUs supported by the Linux kernel and the FOSS video drivers, especially for 3D graphics?

