Intel's OpenCL Beignet Project Is Gaining Ground


  • phoronix
    started a topic Intel's OpenCL Beignet Project Is Gaining Ground

    Phoronix: Intel's OpenCL Beignet Project Is Gaining Ground

    Beignet is the controversial project to provide OpenCL/GPGPU support for modern Intel GPUs on Linux. Since the first Beignet release in April, this open-source Intel OpenCL project has been making lots of progress...

    http://www.phoronix.com/vr.php?view=MTQzOTI
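
    For anyone wanting to check whether Beignet is actually being picked up on their system, the sketch below is one way to probe for it: a plain-C OpenCL host program that lists every installed platform and its GPU devices using only standard OpenCL 1.x API calls. This is an illustrative sketch, not code from the Beignet project; the exact platform name Beignet reports can vary between releases.

        /*
         * Minimal OpenCL platform/device probe. Build with: cc probe.c -lOpenCL
         * If Beignet is installed and working, an Intel platform with a GPU
         * device should appear in the output.
         */
        #include <stdio.h>
        #include <CL/cl.h>

        int main(void)
        {
            cl_platform_id platforms[8];
            cl_uint nplat = 0;

            if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
                fprintf(stderr, "No OpenCL platforms found.\n");
                return 1;
            }

            for (cl_uint i = 0; i < nplat; ++i) {
                char pname[256] = {0};
                clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                                  sizeof(pname), pname, NULL);
                printf("Platform %u: %s\n", i, pname);

                cl_device_id devices[8];
                cl_uint ndev = 0;
                if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU,
                                   8, devices, &ndev) != CL_SUCCESS)
                    continue;

                for (cl_uint j = 0; j < ndev; ++j) {
                    char dname[256] = {0};
                    clGetDeviceInfo(devices[j], CL_DEVICE_NAME,
                                    sizeof(dname), dname, NULL);
                    printf("  GPU device: %s\n", dname);
                }
            }
            return 0;
        }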

  • duby229
    replied
    Originally posted by smitty3268:
    For example, there's the on screen HUD that Marek wrote a while back to display stats on the screen.
    Which is something I use all the time.


  • mrugiero
    replied
    Originally posted by duby229:
    I'm not going to, and I've explained why. Intel doesn't have any obligation to do anything other than what they think is best for themselves. And what they've done so far has mostly benefited everyone. I can't fault them for their commitment, so I won't.
    OK, but be aware that I'm not asking "why should Intel change their mind?"; I'm just interested in knowing.


  • duby229
    replied
    Originally posted by mrugiero:
    I'm not discussing any of that, so I don't know why you keep answering me on that subject; I thought my previous post clarified that. I just want to know about the other benefits using Gallium might bring, aside from the shared code itself. I'm not trying to deny that Intel deliberately tries not to share code.
    I'm not going to, and I've explained why. Intel doesn't have any obligation to do anything other than what they think is best for themselves. And what they've done so far has mostly benefited everyone. I can't fault them for their commitment, so I won't.


  • mrugiero
    replied
    Originally posted by duby229:
    You can take that as two examples of code sharing that Intel has chosen not to participate in. Don't misunderstand me: Intel has every right to want their OSS drivers to work with their OSS solutions. I'm fine with that. Plus, they do contribute a lot of code to a lot of projects. Nobody can really fault Intel for their OSS commitment.

    I do feel there is an argument to be made for Intel to port their OSS driver to Gallium, given the potential it would have for improving the whole stack. But that is really selfish of me to want.
    I'm not discussing any of that, so I don't know why you keep answering me on that subject; I thought my previous post clarified that. I just want to know about the other benefits using Gallium might bring, aside from the shared code itself. I'm not trying to deny that Intel deliberately tries not to share code.


  • smitty3268
    replied
    Originally posted by mrugiero:
    EDIT: Anyway, I want to know the other reasons to use Gallium that you thought of.
    There are various cool tech things built on top of Gallium that all Gallium drivers can take advantage of, but Intel's drivers cannot.

    For example, there's the on-screen HUD that Marek wrote a while back to display stats on the screen. Or there's the Direct3D9 backend. It's unlikely Intel will ever create something like that for their own driver because of legal reasons, but they could have taken advantage of the free community work. Instead, they'll be stuck with the same D3D -> OGL Wine translation that the binary drivers need.

    Further, if the Intel drivers merged into Gallium, a fair amount of cleanup could be done to the rest of the Mesa codebase, given that Intel's is essentially the only classic driver left, or at least the only modern one that is running shaders.
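
    To make the HUD example concrete: Gallium drivers read the GALLIUM_HUD environment variable at startup, so any GL application can be wrapped by a tiny launcher, while a classic (non-Gallium) driver such as Intel's simply ignores the variable. The launcher below is a hypothetical sketch, and the "fps,cpu" graph list is just an example value.

        /*
         * Hypothetical HUD launcher. Usage: ./hud-launch glxgears
         * Sets GALLIUM_HUD for the child process, then execs the real program.
         */
        #include <stdio.h>
        #include <stdlib.h>
        #include <unistd.h>

        int main(int argc, char **argv)
        {
            if (argc < 2) {
                fprintf(stderr, "usage: %s <gl-program> [args...]\n", argv[0]);
                return 1;
            }

            /* "fps,cpu" asks the HUD to overlay frame-rate and CPU-load graphs. */
            setenv("GALLIUM_HUD", "fps,cpu", 1);

            execvp(argv[1], &argv[1]);   /* only returns on failure */
            perror("execvp");
            return 1;
        }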


  • duby229
    replied
    Originally posted by mrugiero:
    I'm aware of how it does speed up development. My point is, it doesn't inherently lead to better performance, and that's what I was correcting in the quote. It usually leads to better performance because of the faster development, caused, as you said, by the shared code. I already stated in a previous post such facts about the use of Gallium, and why I think they avoid it (since I'm not an Intel developer/executive, I can't do much more than speculate about it, but my guess is they don't want to benefit their competitors through shared code, even if that means more work for them).

    EDIT: Anyway, I want to know the other reasons to use Gallium that you thought of.
    You can take that as two examples of code sharing that Intel has chosen not to participate in. Don't misunderstand me: Intel has every right to want their OSS drivers to work with their OSS solutions. I'm fine with that. Plus, they do contribute a lot of code to a lot of projects. Nobody can really fault Intel for their OSS commitment.

    I do feel there is an argument to be made for Intel to port their OSS driver to Gallium, given the potential it would have for improving the whole stack. But that is really selfish of me to want.


  • mrugiero
    replied
    Originally posted by duby229:
    The reason that Gallium speeds up development is that it allows a lot more code sharing. Intel could use the existing VDPAU state tracker instead of writing a VA-API state tracker and save time. But that is also exactly the same reason they don't want to use Gallium.

    EDIT: Or, more pertinent to this thread, they could use Clover instead of writing Beignet.
    I'm aware of how it does speed up development. My point is, it doesn't inherently lead to better performance, and that's what I was correcting in the quote. It usually leads to better performance because of the faster development, caused, as you said, by the shared code. I already stated in a previous post such facts about the use of Gallium, and why I think they avoid it (since I'm not an Intel developer/executive, I can't do much more than speculate about it, but my guess is they don't want to benefit their competitors through shared code, even if that means more work for them).

    EDIT: Anyway, I want to know the other reasons to use Gallium that you thought of.


  • duby229
    replied
    Originally posted by mrugiero:
    That was never the point of Gallium, just a possible side effect. The point of using Gallium is to speed up development. Better performance *might* happen because of this (you spend less time reinventing the wheel and more time optimizing your code), while the theoretical maximum probably comes from using specific code for each driver; it's just that it would take forever.
    The reason that Gallium speeds up development is that it allows a lot more code sharing. Intel could use the existing VDPAU state tracker instead of writing a VA-API state tracker and save time. But that is also exactly the same reason they don't want to use Gallium.

    EDIT: Or, more pertinent to this thread, they could use Clover instead of writing Beignet.
    Last edited by duby229; 08-19-2013, 05:22 PM.
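
    One point worth making explicit here: whether the implementation underneath is Clover or Beignet, applications call the same OpenCL entry points, so the duplication being discussed lives entirely on the driver side. The hedged sketch below shows the kind of host code that would stay identical either way: a trivial kernel built from source and run on the first GPU device found (error checking omitted for brevity; the kernel and buffer sizes are made up for illustration).

        /* The same host code runs unchanged on Clover or Beignet; only the
         * OpenCL implementation underneath differs. */
        #include <stdio.h>
        #include <CL/cl.h>

        static const char *src =
            "__kernel void scale(__global float *buf, float factor) {"
            "    size_t i = get_global_id(0);"
            "    buf[i] *= factor;"
            "}";

        int main(void)
        {
            cl_platform_id plat;
            cl_device_id dev;
            clGetPlatformIDs(1, &plat, NULL);
            clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

            cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
            cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

            cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
            clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
            cl_kernel k = clCreateKernel(prog, "scale", NULL);

            float data[64];
            for (int i = 0; i < 64; ++i)
                data[i] = (float)i;

            cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                        sizeof(data), data, NULL);
            float factor = 2.0f;
            clSetKernelArg(k, 0, sizeof(buf), &buf);
            clSetKernelArg(k, 1, sizeof(factor), &factor);

            size_t global = 64;
            clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
            clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

            printf("data[10] = %f\n", data[10]);   /* expect 20.0 */

            clReleaseMemObject(buf);
            clReleaseKernel(k);
            clReleaseProgram(prog);
            clReleaseCommandQueue(q);
            clReleaseContext(ctx);
            return 0;
        }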


  • uid313
    replied
    Originally posted by archibald:
    They don't agree that it's the best way to write their drivers. If you look for posts by Kayden on here, he's gone into detail about it.
    Then maybe he should have proposed how to fix Gallium3D, or proposed something better than Gallium3D.

    I think a unified graphics driver architecture is a good idea.
    Windows has the Windows Display Driver Model (WDDM).
