
AMD Releases OpenCL ATI GPU Support For Linux


  • #11
    I don't think so. IIRC the HD4200 picks up display and UVD improvements from the HD4xxx family but the 3D engine is from the HD3xxx generation.

    • #12
      Originally posted by RealNC View Post
      Reading the released specs of NVidia's G300, there is "native support for execution of C++ on GPU." That means running "real C++ applications" on the GPU.
      lol

      Does ATI allow that too now with OpenCL or is this something entirely different?
      Yes, it's something different entirely. Something called vapourware.

      • #13
        I can't make sense of that video. NVidia announced C++ support. You claim this won't happen? Sounds a bit strange to me, so I suppose this video is by people who spread too much FUD or something.

        • #14
          It will happen, but it might happen a little or a lot late. Basically, yes, Nvidia designed the cards and, yes, they will run C++; what's being alleged is that Nvidia is having a ton of trouble actually producing them, and is trying to cover that up and failing at it.

          • #15
            Originally posted by BlackStar View Post
            lol

            Yes, it's something different entirely. Something called vapourware.
            Nvidia fakes Fermi boards at GPU Technology Conference
            WHAT DO YOU DO when you have a major conference planned to introduce a card, but you don’t have a card? You fake it, and Nvidia did just that. Updated 3x


            I looked at the presentation from NV's website, and I really question the "proof" that Fermi exists. It looks very much like a pre-rendered animation they are playing, where they have dropped some of the frames in "the slow one".

            Update 3 in the article backs up the theory that those were just pre-rendered animations and mock-ups as well.

            • #16
              Back to the original question, the C++ support NVidia is claiming is like a C++ version of CUDA. It really doesn't have anything to do with OpenCL support.

              And given that NVidia had to make changes to their upcoming hardware to support it, that AMD hasn't announced anything similar in response, and that NVidia seems to be focusing more on those types of features than AMD is, I assume it's not in the R800 cards.

              To be honest, I'm not entirely sure that's a very big deal. It seems to me like the vast majority of code you'd want to run on a GPU could easily be written in C anyway, although I'm sure that eventually it will become a wanted feature. AMD will probably add support for it around that time, maybe in a generation or two.
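
              To make the distinction concrete, here is a rough sketch of what "a C++ version of CUDA" means in practice. This is my own hypothetical example, not code from NVidia's or AMD's SDKs: device code gets C++ features like templates and classes, while OpenCL 1.0 kernels are written in a restricted C99 dialect and driven through a C API, so the two really are separate things.

              Code:
              #include <cuda_runtime.h>
              #include <cstdio>

              // Hypothetical sketch: a templated kernel, the sort of C++ feature
              // that "C++ for CUDA" refers to. OpenCL C kernels (as of 1.0) are
              // limited to a C99-based dialect instead.
              template <typename T>
              __global__ void axpy(T a, const T* x, T* y, int n)
              {
                  int i = blockIdx.x * blockDim.x + threadIdx.x;
                  if (i < n)
                      y[i] = a * x[i] + y[i];
              }

              int main()
              {
                  const int n = 1024;
                  float *x = 0, *y = 0;
                  cudaMalloc((void**)&x, n * sizeof(float));
                  cudaMalloc((void**)&y, n * sizeof(float));
                  cudaMemset(x, 0, n * sizeof(float));
                  cudaMemset(y, 0, n * sizeof(float));

                  // The same template instantiates for float, double, and so on
                  // at compile time.
                  axpy<float><<<(n + 255) / 256, 256>>>(2.0f, x, y, n);
                  cudaDeviceSynchronize();

                  cudaFree(x);
                  cudaFree(y);
                  printf("done\n");
                  return 0;
              }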

              • #17
                Does this depend on the FGLRX driver and, if so, on a specific version of it?

                • #18
                  Why? Do you suddenly need OpenCL to live? I'd expect another year to pass before OpenCL is used by widespread software. If your card can do what you need right now, keep it.

                  • #19
                    I can't make sense of that video. NVidia announced C++ support. You claim this won't happen? Sounds a bit strange to me, so I suppose this video is by people who spread too much FUD or something.
                    As the previous posters said, the issue is that Nvidia is having trouble producing the actual cards. The video shows the (failed) damage control they are attempting: the features sound great on paper, but they are useless without actual hardware to run them - and the hardware is nowhere to be found (edit: the card and the videos they showed are obviously fake).

                    Every indication is that they won't release sooner than Q1 or Q2 of 2010.

                    Personally, I don't doubt that we'll see some form of C++ running on those GPUs. However, I believe this will be *far* from what C++ looks like on the CPU - I doubt we'll see stuff like multiple inheritance (or even single virtual inheritance), partial template specialization or any of the other features that make C++ more than C-with-classes. Don't expect to take your favorite C++ application (e.g. Firefox) and compile it for the GPU anytime soon (before Larrabee, at least).

                    Note that DX11 already supports C-with-classes, so Nvidia won't be bringing anything new to the table...
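
                    To put "C-with-classes" in concrete terms, here is a quick sketch of my own (a hypothetical example, not vendor sample code): a plain struct with non-virtual member functions already maps cleanly to device code; it's the heavier features mentioned above that I don't expect to survive the trip to the GPU.

                    Code:
                    #include <cuda_runtime.h>
                    #include <cstdio>

                    // "C-with-classes": a plain value type with a non-virtual
                    // member function is straightforward to run on the GPU.
                    struct Vec3 {
                        float x, y, z;
                        __host__ __device__ float dot(const Vec3& o) const {
                            return x * o.x + y * o.y + z * o.z;
                        }
                    };

                    __global__ void dots(const Vec3* a, const Vec3* b, float* out, int n)
                    {
                        int i = blockIdx.x * blockDim.x + threadIdx.x;
                        if (i < n)
                            out[i] = a[i].dot(b[i]);
                    }

                    // Multiple inheritance, virtual inheritance, partial template
                    // specialization: those are the parts of C++ I would not
                    // expect in device code any time soon.

                    int main()
                    {
                        const int n = 256;
                        Vec3 *a = 0, *b = 0;
                        float *out = 0;
                        cudaMalloc((void**)&a, n * sizeof(Vec3));
                        cudaMalloc((void**)&b, n * sizeof(Vec3));
                        cudaMalloc((void**)&out, n * sizeof(float));
                        cudaMemset(a, 0, n * sizeof(Vec3));
                        cudaMemset(b, 0, n * sizeof(Vec3));

                        dots<<<1, n>>>(a, b, out, n);
                        cudaDeviceSynchronize();

                        cudaFree(a); cudaFree(b); cudaFree(out);
                        printf("ok\n");
                        return 0;
                    }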

                    • #20
                      Eh, screw C++. I don't need a fully general-purpose GPU; I just want the apps that make sense to accelerate to be accelerated. Like video decoding...
