NVIDIA Publicly Releases Its OpenCL Linux Drivers

  • smitty3268
    replied
    You seem to have completely missed Ranguvar's point. It doesn't really matter if you can run the OpenCL code on the CPU if the CPU is always going to be 10x faster than any GPU implementation. You might as well just stick with your current CPU-bound code that you already know is working and efficient, instead of porting to a new platform that probably isn't going to be as efficient on a CPU anyway, both because of its GPU-centric nature and because of your extensive optimizations to the old codebase.

    Now I don't know if that 10x-faster condition is really accurate or not... There was the Badaboom h.264 CUDA encoder that seemed to run quite fast, but the problem was that the quality was horrible, on the level of Theora, so it's not really fair to compare its speed against something like x264. The question then is whether the poor quality was necessary to run fast on a GPU or whether they just have a bad implementation. I suspect both might be true.

    A good GPU isn't necessarily faster than a CPU in anything that is single-threaded. GPUs have hundreds of shader processors all running in parallel, but if a workload can only keep one or two of them busy at a time, it's going to get smoked by even an old CPU.



  • BlackStar
    replied
    Originally posted by Ranguvar View Post
    Incorrect. If you know how video encoding/decoding works, and how GPUs work, you know that GPUs are _not_ CPUs. There are some things they do _very_ well, like the Fast Fourier Transform, that make them far better at decoding video. However, the whole GPU encoding mess is smoke and hand-waving.
    OpenCL is GPU/CPU agnostic. You can instruct a specific kernel to run on either, provided your drivers support this. Check the CL_DEVICE_TYPE_* enumeration.

    Thanks for the free lecture, though.
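
    To make the device-type point above concrete, here is a minimal sketch in plain C against the standard OpenCL host API (not the OpenTK bindings used elsewhere in this thread); the helper name create_context is made up and error handling is trimmed:
    Code:
    /* Pick a CPU or GPU device at run time via the CL_DEVICE_TYPE_* values. */
    #include <stdio.h>
    #include <CL/cl.h>

    static cl_context create_context(cl_device_type type)
    {
        cl_platform_id platform;
        cl_device_id device;
        cl_int err;

        clGetPlatformIDs(1, &platform, NULL);

        /* Ask the platform for one device of the requested type. */
        if (clGetDeviceIDs(platform, type, 1, &device, NULL) != CL_SUCCESS)
            return NULL; /* this driver doesn't expose that device type */

        cl_context_properties props[] = {
            CL_CONTEXT_PLATFORM, (cl_context_properties)platform,
            0
        };
        return clCreateContext(props, 1, &device, NULL, NULL, &err);
    }

    int main(void)
    {
        /* The same kernel source could be built and queued on either context. */
        cl_context gpu = create_context(CL_DEVICE_TYPE_GPU);
        cl_context cpu = create_context(CL_DEVICE_TYPE_CPU);

        printf("GPU: %s, CPU: %s\n", gpu ? "available" : "unavailable",
                                     cpu ? "available" : "unavailable");

        if (gpu) clReleaseContext(gpu);
        if (cpu) clReleaseContext(cpu);
        return 0;
    }
    Which device types actually show up depends on the driver, which is the "provided your drivers support this" part.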

    Originally posted by Ranguvar View Post
    No current GPU can come close to a good CPU running x264. The x264 devs have said that there are some small parts of the encoding process that could benefit from the GPU, but it isn't much. Unfortunately, most (all?) of the magazine and e-mag benchmarks have used a very bad CPU encoder like Apple's, an ancient build of x264, or very crazy settings.

    GPU decoding is what will really be nice -- not encoding, sadly. Unless you're talking about advanced post-processing before encoding, as the GPU can help a ton with FFT noise/grain removal.
    I distinctly remember a couple of CUDA-based x264 encoders being hyped around last spring/summer, with numbers like "40% faster than the fastest quad core" being thrown around. I haven't really followed the news since, but a quick Google search reveals that the actual products don't live up to the hype.

    I still see potential in GPU-assisted encoding, however. Low-end machines will probably benefit more than high-end ones here (the latter can already handle x264 encoding in real time).

    Other than that, you need to remember that we are just at the beginning of the GPGPU era. This is more or less new ground: the necessary tools are only now becoming available (no, Brook and CUDA aren't nearly good enough for widespread adoption; compute shaders and OpenCL, on the other hand, are).
    Last edited by BlackStar; 09-29-2009, 06:52 PM.



  • Ranguvar
    replied
    Originally posted by BlackStar View Post
    Now, OpenCL-based encoding and we are talking.
    Incorrect. If you know how video encoding/decoding works, and how GPUs work, you know that GPUs are _not_ CPUs. There are some things they do _very_ well, like the Fast Fourier Transform, that make them far better at decoding video. However, the whole GPU encoding mess is smoke and hand-waving. No current GPU can come close to a good CPU running x264. The x264 devs have said that there are some small parts of the encoding process that could benefit from the GPU, but it isn't much. Unfortunately, most (all?) of the magazine and e-mag benchmarks have used a very bad CPU encoder like Apple's, an ancient build of x264, or very crazy settings.

    GPU decoding is what will really be nice -- not encoding, sadly. Unless you're talking about advanced post-processing before encoding, as the GPU can help a ton with FFT noise/grain removal.



  • Ant P.
    replied
    Originally posted by Veerappan View Post
    I have no special knowledge of the matter, but I'm guessing that the answer is "not really". The most I could see the Nouveau developers using these drivers for is as a reference platform for compatibility testing of the Nouveau driver implementation: write a feature in Nouveau, see if it works using an OpenGL/OpenCL program, and if it doesn't, check whether the same program is also broken on Nvidia's driver.

    I wouldn't be surprised if the EULA for the Nvidia drivers has a statement that you aren't allowed to use the drivers to reverse engineer Nvidia proprietary info, yadda, yadda, yadda..
    Hence my post on the first page. The EULA seems worded specifically to stonewall any effort by the nouveau devs to make a free OpenCL-capable driver.



  • BlackStar
    replied
    First try: crash!
    Code:
    // Runs on AMD's implementation :/
    CL.CreateContextFromType((ContextProperties*)null, DeviceTypeFlags.DeviceTypeDefault, IntPtr.Zero, IntPtr.Zero, &error);
    Anyone see what's wrong with that code?
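
    One guess, sketched in plain C against the raw API rather than the OpenTK bindings: the null properties pointer. Which platform a NULL property list selects is implementation-defined, so spelling out CL_CONTEXT_PLATFORM may be what NVIDIA's implementation wants here (untested; error handling trimmed):
    Code:
    /* Select a platform explicitly instead of passing NULL properties. */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platform;
        cl_int err;

        clGetPlatformIDs(1, &platform, NULL);

        cl_context_properties props[] = {
            CL_CONTEXT_PLATFORM, (cl_context_properties)platform,
            0
        };

        cl_context ctx = clCreateContextFromType(props, CL_DEVICE_TYPE_DEFAULT,
                                                 NULL, NULL, &err);
        printf("clCreateContextFromType returned %d\n", err);

        if (ctx) clReleaseContext(ctx);
        return 0;
    }
    If it still dies with an explicit platform, the problem is somewhere else.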



  • Veerappan
    replied
    Originally posted by bartek View Post
    So, will this help the people working on the Nouveau open-source nvidia drivers?
    I have no special knowledge of the matter, but I'm guessing that the answer is "not really". The most I could see the Nouveau developers using these drivers for is as a reference platform for compatibility testing of the Nouveau driver implementation: write a feature in Nouveau, see if it works using an OpenGL/OpenCL program, and if it doesn't, check whether the same program is also broken on Nvidia's driver.

    I wouldn't be surprised if the EULA for the Nvidia drivers has a statement that you aren't allowed to use the drivers to reverse engineer Nvidia proprietary info, yadda, yadda, yadda..



  • bartek
    replied
    Nouveau nVidia drivers

    So, will this help the people working on the Nouveau open-source nvidia drivers?



  • V!NCENT
    replied
    Originally posted by yogi_berra View Post
    And in two years AMD drivers will go on beer runs! Something to look forward to...
    "Oh V!NCENT! Can I please swap my nVidia card for your ATI card because mine isn't officialy supported anymore! Q_Q"



  • _txf_
    replied
    Originally posted by mirza View Post
    Someone really smart could do h.264 decoder-on-steroids now.
    Seeing as you need to be using an Nvidia driver anyway, you might as well use VDPAU. It works today and is much more efficient.

    What would be cool is if someone added OpenCL acceleration to h.264 encoding.



  • BlackStar
    replied
    Originally posted by mirza View Post
    Someone really smart could do h.264 decoder-on-steroids now.
    Unfortunately, you'd need an OpenCL-capable GPU for this - and OpenCL-capable GPUs tend to have dedicated (and faster!) video decoding blocks already. Given the shared buffers between OpenCL/OpenGL, this would probably be a viable approach for video decoding on Gallium drivers (unless there are plans to add a dedicated video decoding API/tracker - no idea).

    Now, OpenCL-based encoding and we are talking.
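
    To illustrate the shared-buffer point above, here is a rough plain-C sketch of the cl_khr_gl_sharing route, where a decode kernel writes straight into a GL buffer object that the display path then draws. The helper name and the surrounding GLX setup are assumptions, not working decoder code; error handling is trimmed:
    Code:
    /* Wrap an existing GL buffer object so a CL kernel can fill it in place. */
    #include <GL/glx.h>
    #include <CL/cl.h>
    #include <CL/cl_gl.h>

    /* Assumes a GLX context is already current on this thread. */
    cl_mem share_gl_buffer(cl_platform_id platform, GLuint gl_pbo,
                           cl_context *ctx, cl_command_queue *queue)
    {
        cl_int err;
        cl_context_properties props[] = {
            CL_GL_CONTEXT_KHR,   (cl_context_properties)glXGetCurrentContext(),
            CL_GLX_DISPLAY_KHR,  (cl_context_properties)glXGetCurrentDisplay(),
            CL_CONTEXT_PLATFORM, (cl_context_properties)platform,
            0
        };

        cl_device_id device;
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

        *ctx   = clCreateContext(props, 1, &device, NULL, NULL, &err);
        *queue = clCreateCommandQueue(*ctx, device, 0, &err);

        /* The returned cl_mem aliases the GL buffer: whatever the kernel
         * writes is already in GPU memory for the GL side to display. */
        return clCreateFromGLBuffer(*ctx, CL_MEM_WRITE_ONLY, gl_pbo, &err);
    }

    /* Around each kernel launch:
     *   clEnqueueAcquireGLObjects(queue, 1, &buf, 0, NULL, NULL);
     *   ...enqueue the decode kernel...
     *   clEnqueueReleaseGLObjects(queue, 1, &buf, 0, NULL, NULL);
     */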

