OpenCL 1.1 Specification Released


  • bridgman
    replied
    That's more like it.

  • V!NCENT
    replied
    Originally posted by bridgman:
    Yeah, like that...

    ... although I have more hair.
    Yeah, like this much, right?

  • bridgman
    replied
    I'm not 100% sure myself. All the interesting questions seem to come up on the weekends, when I can't wander over and pick brains.

  • curaga
    replied
    Oh, ok. I misunderstood that as having the UVD interfaces be implemented in shaders for that card.

  • bridgman
    replied
    The shader code is *only* post-processing, AFAIK; I don't think we decode on shaders in the proprietary stack today - it's either CPU (r600) or UVD (everything else).

    My recollection is that the code starts in a high-level shader language (probably HLSL) but is then hand-tweaked in some places.
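
For a concrete idea of what such a post-processing pass does (the actual Catalyst shader code is closed, and nothing below is taken from it), here is a minimal sketch: an unsharp-mask sharpen over a decoded frame's luma plane, written in plain C rather than HLSL purely for illustration. The names sharpen_luma and clamp_u8 and the amount parameter are invented for this sketch.

    /* Illustrative only: a naive unsharp-mask sharpen over a frame's luma
     * plane, the kind of post-processing pass being discussed.  In a real
     * driver this sort of filtering runs as GPU shader code, not CPU-side C. */
    #include <stdint.h>

    static inline uint8_t clamp_u8(int v)
    {
        return (uint8_t)(v < 0 ? 0 : v > 255 ? 255 : v);
    }

    /* src/dst are w*h luma planes; amount is the strength in 1/256 units
     * (e.g. 64 gives roughly a 0.25 sharpening factor). */
    void sharpen_luma(const uint8_t *src, uint8_t *dst, int w, int h, int amount)
    {
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int c = src[y * w + x];

                /* 3x3 box blur with edge clamping */
                int sum = 0;
                for (int dy = -1; dy <= 1; dy++) {
                    for (int dx = -1; dx <= 1; dx++) {
                        int sx = x + dx, sy = y + dy;
                        if (sx < 0) sx = 0;
                        if (sy < 0) sy = 0;
                        if (sx >= w) sx = w - 1;
                        if (sy >= h) sy = h - 1;
                        sum += src[sy * w + sx];
                    }
                }
                int blur = sum / 9;

                /* unsharp mask: original + amount * (original - blurred) */
                dst[y * w + x] = clamp_u8(c + (amount * (c - blur)) / 256);
            }
        }
    }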

  • curaga
    replied
    Bridgman, I think that would make a lot of sense. If AMD opened up the shader code for R600 (feel free to remove post-processing), it would be another big sweep of good FOSS PR.

    Surely it would also be rather fast for a qualified dev to remove any secret post-processing; sure, there's bound to be a legal review after that, but for shader code it should be lighter than for actual specs.
    Much faster than writing one from the ground up, to be sure.

    Could you tell whether it's written in something standard (GLSL, OpenCL, ...) or in something ATI-specific? Even if it only ran on R600+ GPUs, it would make a great headline, wink wink.

  • chithanh
    replied
    Originally posted by bridgman:
    I haven't actually seen any shader-assisted H.264 decode implementations in public yet, have you?
    There is also a shader-based implementation for the Xbox 360's Xenos GPU.

  • bridgman
    replied
    Anyway, the bottom line here is that running an r600 against a UVD-based GPU (say, an rv670, to keep everything else reasonably close) on Windows *might* be an interesting way to see whether UVD actually contributes to video quality the way some people are suggesting.

  • bridgman
    replied
    s/gogogogogo/nononono/

    There was a discussion going on about whether hardware-based decoding (e.g. UVD) produced higher quality than CPU-based decoding. My position was that the quality would generally be the same, and that it was processing further downstream (but before Xv/GL) that made the difference.

    Put differently, I was saying that the apparent quality difference between UVD decode and CPU decode came down to the fact that the proprietary drivers, which typically had the serious post-processing, also used hardware decode since it was available to the developers.

    The post-processing is considered "secret sauce" and it's highly unlikely we would open up that code. On the other hand, implementing it just requires video knowledge, not any more hardware knowledge than we have already released, so there's no reason something similar could not be implemented in the open drivers.
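
To make the ordering described above concrete, here is a rough sketch of the playback path: decode (UVD or CPU), then post-processing, then presentation via Xv/GL. Every function and type name below (play_one_frame, decode_with_uvd, decode_on_cpu, postprocess, present_frame, struct frame) is invented for illustration and is not a real driver entry point.

    /* Sketch of the playback path being described: the decoded pixels are
     * essentially the same whether UVD or the CPU produced them; the visible
     * quality difference tracks the post-processing stage, which sits after
     * decode and before the frame reaches Xv or GL. */
    #include <stdbool.h>

    struct frame {
        unsigned char *planes[3];   /* Y, U, V */
        int width, height;
    };

    static void decode_with_uvd(struct frame *f)     { (void)f; /* fixed-function decode */ }
    static void decode_on_cpu(struct frame *f)       { (void)f; /* software decode       */ }
    static void postprocess(struct frame *f)         { (void)f; /* denoise, sharpen, ... */ }
    static void present_frame(const struct frame *f) { (void)f; /* hand off to Xv or GL  */ }

    void play_one_frame(bool have_uvd)
    {
        struct frame f = {0};

        if (have_uvd)
            decode_with_uvd(&f);    /* UVD-equipped GPUs          */
        else
            decode_on_cpu(&f);      /* r600 path: decode on CPU   */

        postprocess(&f);            /* where the "secret sauce" lives */
        present_frame(&f);          /* before/into Xv or GL           */
    }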

  • bridgman
    replied
    Yeah, like that...

    ... although I have more hair.
