NVIDIA Adding Experimental Vulkan Support For Executing CUDA Binaries


  • #41
    Originally posted by Qaridarium View Post
    as soon as WebGPU Compute hits the market, CUDA will be gone...
    Few people run massive compute applications in a browser. Most run on bare metal.



    • #42
      Originally posted by birdie View Post

      CUDA can and will be reverse-engineered and made to work with AMD and Intel GPUs if NVIDIA ever ceases to exist. This has already happened to Glide (the proprietary OpenGL-like API from 3dfx) and Mantle (the proprietary graphics API from AMD). There's nothing magical about CUDA; it's just that, given how well CUDA works across tons of NVIDIA GPUs, no one bothers.
      So what?
      This looks like a reply to my comment, since it quotes me, but it doesn't address any of the points I brought up.
      Last edited by pracedru; 12 May 2021, 06:23 AM.



      • #43
        Originally posted by Qaridarium View Post

        Dude, I have been monitoring all computer tech for like 25 years...
        You're monitoring crap. NVIDIA's profits have been increasing recently and show no signs of stopping. In fact, unlike AMD, NVIDIA has almost always been extremely profitable.

        Originally posted by Qaridarium View Post
        I have been right in predicting the future so many times I cannot even count them anymore.
        After the fact?

        Originally posted by Qaridarium View Post
        I told you that you think I am wrong because you have no imagination of the future. You focus on what is right now and on the past; I'll give you that that's a skill too.

        But your ability to extrapolate the future is plain and simple non-existent.
        Care to show your predictions? No, nothing?

        Originally posted by Qaridarium View Post
        All I said is true: you pay 1100€ more to get an NVIDIA card; that alone should make you wonder.
        That AMD is already faster in FPS per watt should make you wonder.
        How does this relate to the discussion? Have you checked the recent Steam stats? NVIDIA is again gaining market share from AMD and is selling five times more RTX cards than AMD sells its 6000-series cards. https://www.hardwaretimes.com/nvidia...n-the-rx-6800/

        Originally posted by Qaridarium View Post
        That there is some kind of ray-tracing implementation in which AMD is much faster than NVIDIA should make you wonder.
        Of course NVIDIA will push every game developer on earth to make sure the implementation that favors NVIDIA is used,
        but the chance that developers ignore the huge success of the PlayStation 5 and Xbox is ZERO...
        This is pure lies, and you understand crap about ray tracing. Only NVIDIA has a hardware implementation of BVH traversal; AMD does it in shaders, and in most hardcore RTX tests the RX 6000 cards are up to three times slower than NVIDIA's: https://www.pcgameshardware.de/Raytr...marks-1371125/ https://www.hardwareluxx.de/index.ph...g-zukunft.html
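        To make concrete what "BVH traversal in shaders" means, here is a minimal CUDA sketch (the flattened `BvhNode` layout, the names, and the one-node scene are hypothetical, invented purely for illustration): a stack-based loop of axis-aligned-box slab tests running as ordinary ALU code. Dedicated RT cores execute essentially this loop in fixed-function hardware; without them, every box test costs regular shader-core cycles.

        ```cuda
        // Hypothetical sketch: software BVH traversal as a plain CUDA kernel.
        #include <cstdio>

        struct Aabb { float lo[3], hi[3]; };

        struct BvhNode {            // invented flattened-BVH layout
            Aabb bounds;
            int  left, right;       // child indices; left < 0 marks a leaf
            int  primIndex;         // primitive index stored in leaves
        };

        // Slab test: does the ray (origin o, inverse direction invD) hit the box?
        __device__ bool hitAabb(const Aabb& b, const float* o, const float* invD) {
            float t0 = 0.0f, t1 = 1e30f;
            for (int a = 0; a < 3; ++a) {
                float tN = (b.lo[a] - o[a]) * invD[a];
                float tF = (b.hi[a] - o[a]) * invD[a];
                if (tN > tF) { float t = tN; tN = tF; tF = t; }
                if (tN > t0) t0 = tN;
                if (tF < t1) t1 = tF;
                if (t0 > t1) return false;
            }
            return true;
        }

        __global__ void traverse(const BvhNode* nodes, const float* origin,
                                 const float* invDir, int* hitPrim) {
            int stack[64], sp = 0;      // explicit traversal stack
            stack[sp++] = 0;            // push the root node
            *hitPrim = -1;
            while (sp > 0) {            // the loop RT cores replace with hardware
                const BvhNode& n = nodes[stack[--sp]];
                if (!hitAabb(n.bounds, origin, invDir)) continue;
                if (n.left < 0) *hitPrim = n.primIndex;           // leaf: record hit
                else { stack[sp++] = n.left; stack[sp++] = n.right; }
            }
        }

        int main() {
            // One-leaf BVH around the unit cube, one ray shot straight at it.
            BvhNode root = {{{0, 0, 0}, {1, 1, 1}}, -1, -1, 42};
            float origin[3] = {-1.0f, 0.5f, 0.5f};
            float invDir[3] = {1.0f, 1e30f, 1e30f};   // direction (1,0,0), 1/0 clamped
            BvhNode* dN; float *dO, *dD; int *dH, h;
            cudaMalloc((void**)&dN, sizeof root);
            cudaMalloc((void**)&dO, sizeof origin);
            cudaMalloc((void**)&dD, sizeof invDir);
            cudaMalloc((void**)&dH, sizeof h);
            cudaMemcpy(dN, &root, sizeof root, cudaMemcpyHostToDevice);
            cudaMemcpy(dO, origin, sizeof origin, cudaMemcpyHostToDevice);
            cudaMemcpy(dD, invDir, sizeof invDir, cudaMemcpyHostToDevice);
            traverse<<<1, 1>>>(dN, dO, dD, dH);
            cudaMemcpy(&h, dH, sizeof h, cudaMemcpyDeviceToHost);
            printf("hit primitive: %d\n", h);          // expect 42
            return 0;
        }
        ```

        Compiled with `nvcc bvh.cu`, this prints `hit primitive: 42`; the point is only what the inner loop looks like when no fixed-function hardware runs it for you.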

        Originally posted by Qaridarium View Post
        NVIDIA has been on Samsung's 8 nm node for a long, long time, and RDNA 3 on 5 nm will be their downfall.
        Making it 40-60% faster at the same price and the same power consumption is absolutely no problem on 5 nm.
        What??? You think AMD somehow prevents NVIDIA from having its GPUs built in TSMC's fabs? Did you know, by the way, that TSMC already produces NVIDIA's enterprise A100 GPUs? No?

        Originally posted by Qaridarium View Post
        The Linux kernel removing NVIDIA-closed-source-only stuff means NVIDIA will have a hard time doing closed-source drivers on Linux. This alone means NVIDIA will no longer be a good option for Linux.
        What??? NVIDIA doesn't care about an OS with a 1% market share whose users generally don't even play games. Also, where the fook are the quotes and citations? I've never heard of Linux kernel guys removing any NVIDIA code from the kernel. You're making stuff up, and quite egregiously so.

        Originally posted by Qaridarium View Post
        as soon as WebGPU Compute hit the market CUDA will be gone...
        I've blacklisted you. There's not a single proven fact in your entire post, and most of it applies only to an alternative universe.
        Last edited by birdie; 12 May 2021, 10:02 AM.



        • #44
          WebGPU compute, ahahahah. Entire graphics APIs like Vulkan and DX12 have moved to a level so low it literally should never be exposed to the web.

          Qaridarium is such a troll I can't read his posts without laughing. NVIDIA used Samsung because, if it hadn't, there would be 5-6 times fewer GPUs on the market, when there already aren't enough GPUs on the market. That was realistically a genius move by NVIDIA: they got the capacity to sell a lot of hardware, probably at Samsung's cheaper production prices. Imagine a world where a 3090 costs $10k - yup, that is what would happen if NVIDIA were using TSMC, due to the lack of capacity.

          Linux won't remove the NVIDIA bits in the kernel, because that would be a self-destructive move. Oh great, you want your system to have even less market share on desktops, and a lot less in supercomputers as well.

          The reason CUDA is popular is that it is easy to develop for, has had good support for many years (since the 8800 GTX), and has a ton of libraries written for it, often with NVIDIA's own backing. Neither OpenCL nor Vulkan compute is comparably easy; both have fewer libraries, OpenCL has a shaky history with versions, and Vulkan compute is relatively new and definitely the hardest.
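          To make "easy to develop for" concrete, here is a minimal sketch, a standard SAXPY example rather than anything from this thread: a complete CUDA program is one kernel plus a one-line launch, with unified memory standing in for explicit host/device copies. The equivalent OpenCL host code (platform, device, context, queue, program, and kernel-argument setup) typically runs several times this length before the first kernel executes.

          ```cuda
          // Minimal complete CUDA program: y = a*x + y (SAXPY).
          #include <cstdio>

          __global__ void saxpy(int n, float a, const float* x, float* y) {
              int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
              if (i < n) y[i] = a * x[i] + y[i];
          }

          int main() {
              const int n = 1 << 20;
              float *x, *y;
              cudaMallocManaged((void**)&x, n * sizeof(float));  // unified memory: no copies
              cudaMallocManaged((void**)&y, n * sizeof(float));
              for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

              saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);    // one-line kernel launch
              cudaDeviceSynchronize();

              printf("y[0] = %.1f\n", y[0]);                     // expect 4.0
              cudaFree(x); cudaFree(y);
              return 0;
          }
          ```

          Build with `nvcc saxpy.cu` and it runs unchanged across generations of NVIDIA cards; that single-toolchain breadth is a big part of the library ecosystem mentioned above.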



          • #45
            Originally posted by birdie View Post

            **BUT ANDROID IS CLOSED SO IT DOESN'T COUNT** (my paraphrase of what I read)
            Android went the route of licensing its technology to third parties, and at the time it was much more open than its competitors while providing the most viable competition to the market leader (the iPhone). That is why it succeeded where others, most notably Windows Mobile, failed.

            For a company, the closed route is bound to hurt long-term, even if it brings great short-term benefits. That does not mean you cannot capture a niche market or two. That does not mean you cannot live comfortably on that niche, like Apple does.

            But if you look at other things, like the IP protocol (which won because of open access and easy implementation) and the PC market in the nineties, there is a reason the PC won over everything else: open standards and platforms. Not completely open, true, but open enough to allow an ecosystem to live on top of those computers.

            With CUDA, there is no chance for an ecosystem to grow on top of it that is not reliant on a single vendor, the almighty NVIDIA.



            • #46
              Originally posted by zxy_thf View Post
              Can you list the market share of CUDA in general computing?
              No, you dare not. CUDA by itself is the punch-in-your-face counterexample to your "open" BS.
              Windows did the same 20 years ago, and it still dominates the desktop and gaming markets.
              Cope with the fact that Steam's Linux share never reaches 1%.
              25 years ago, 3dfx and their Glide API ruled all games. 20 years ago, OpenGL and Direct3D had killed them off completely. 30 years ago, Windows allowed anyone to develop for the platform as long as they paid for the OS, and all of a sudden you did not need to buy expensive extra developer licenses anymore. Windows in effect created a mostly open ecosystem with a fence around it: pass the fence and play however much you want. The Linux ecosystem doesn't even have a fence. That is why Linux beat the competition on servers and embedded devices, and why Android still uses it as its kernel.

              Originally posted by zxy_thf View Post
              Again, it's nice to have a functioning brain. Until you do, you won't have any idea what's happening in GPU computing.
              Your "open" propaganda never yields users of the open system, because anyone who has used GPU computing will soon realize NVIDIA is pain-free, AMD is frustrating (ROCm on Navi when?), and Intel is nowhere to be bought (Xe cards when?).

              Novices who believed the open propaganda tried it and headed to NVIDIA. End of story.
              Ad hominem attacks? That is your masterful plan to convince me that your beloved CUDA is not a giant limitation? Just because a proprietary technology is better than the non-proprietary alternatives at the moment does not mean it will stay that way; it may cease to be so at any point. Today, tomorrow, in five or ten years. Intel was dominant for years, and now AMD is leading the pack. NVIDIA had the best Linux drivers, and now AMD provides even better ones. MUCH better ones. Open is not a guarantee of success, but it is a prerequisite for a truly open ecosystem.

              Originally posted by zxy_thf View Post
              "Being Open" by itself never contributes to user experiences or marketshare directly.
              It's the ecosystem contributing to the marketshare, and if you have every waken up from your Android wet dream you would instantly realize it's manufactures' cheap phones, NOT Google, is the determining factor.
              Who f-ing care whether the system releases its source code or not when buying a phone? Is everyone a software engineer on this planet?
              Actually, I care. If I buy an expensive piece of equipment, I want to own said equipment. If I want to light it on fire or paint it blue, I should be able to. AMD allows this to a far greater extent than NVIDIA does. I guess you're one of those people who think Right to Repair is useless, too.

              Originally posted by zxy_thf View Post
              So far, it's very safe to say NVIDIA is the sole pillar of the ecosystem, and the remaining players are hopelessly building their own.
              An ecosystem with a single pillar is a walled garden. Beautiful, Zen even - yet one careless move and the walls come crumbling down, destroying everything within said garden. Such gardens rarely last long, though, as with every rule ever made, there are exceptions.



              • #47
                Originally posted by wertigon View Post

                Android went the route of licensing its technology to third parties, and at the time it was much more open than its competitors while providing the most viable competition to the market leader (the iPhone). That is why it succeeded where others, most notably Windows Mobile, failed.
                Windows Mobile failed for completely different reasons: bad marketing, bad API compatibility (Microsoft completely broke it when updating from WM 7 to 8.5, and then to 10), bad usability, and bad application support. It had nothing to do with WM's openness. Android 12, on the other hand, can run most Android 4.0 applications, and even older ones, though that requires recompilation and minor tweaks.

                Again,
                • The Android Linux kernel is extremely closed.
                • The Android userspace without the proprietary components is barely usable.
                It's cringeworthy to see how you keep navigating around the fact that Android is not free software; you've now resorted to "licensing," as if Microsoft didn't license WM to OEMs. Hint: they did.

                Originally posted by wertigon View Post
                For a company, the closed route is bound to hurt long-term, even if it brings great short-term benefits. That does not mean you cannot capture a niche market or two. That does not mean you cannot live comfortably on that niche, like Apple does.

                But if you look at other things, like the IP protocol (which won because of open access and easy implementation) and the PC market in the nineties, there is a reason the PC won over everything else: open standards and platforms. Not completely open, true, but open enough to allow an ecosystem to live on top of those computers.
                There's no historical precedent and no validity to anything you're claiming here.

                Originally posted by wertigon View Post
                With CUDA, there is no chance for an ecosystem to grow on top of it that is not reliant on a single vendor, the almighty NVIDIA.
                CUDA has been growing in usage a lot faster than all the competing open computing standards, e.g., OpenCL and Vulkan compute. You have zero facts to prove anything you're saying. We now have at least five people in this thread who, against all the evidence that "openness" means nothing for the success of a computing platform, continue to argue the opposite without providing any arguments.

                Normally such people are called buffoons; I'm just curious why their concentration here on Phoronix is so high. Perhaps it's the very marginal state of Linux: people who choose it believe they are smarter than everyone else and keep sharing their "wisdom" and entrepreneurship "skills" without ever having run a major semiconductor company.
                Last edited by birdie; 12 May 2021, 10:00 AM.



                • #48
                  Originally posted by piotrj3 View Post
                  WebGPU compute, ahahahah. Entire graphics APIs like Vulkan and DX12 have moved to a level so low it literally should never be exposed to the web.

                  Qaridarium is such a troll I can't read his posts without laughing. NVIDIA used Samsung because, if it hadn't, there would be 5-6 times fewer GPUs on the market, when there already aren't enough GPUs on the market. That was realistically a genius move by NVIDIA: they got the capacity to sell a lot of hardware, probably at Samsung's cheaper production prices. Imagine a world where a 3090 costs $10k - yup, that is what would happen if NVIDIA were using TSMC, due to the lack of capacity.

                  Linux won't remove the NVIDIA bits in the kernel, because that would be a self-destructive move. Oh great, you want your system to have even less market share on desktops, and a lot less in supercomputers as well.

                  The reason CUDA is popular is that it is easy to develop for, has had good support for many years (since the 8800 GTX), and has a ton of libraries written for it, often with NVIDIA's own backing. Neither OpenCL nor Vulkan compute is comparably easy; both have fewer libraries, OpenCL has a shaky history with versions, and Vulkan compute is relatively new and definitely the hardest.
                  I'm so thankful we have rational people here. It's like a breath of fresh air.



                  • #49
                    Originally posted by birdie View Post
                    Android is not free software
                    And it doesn't matter that it is not free software, since its APIs were free to develop for and it cost nothing to put Android on a new phone. Low barrier to entry = huge adoption rate. Meanwhile, WM didn't sell because... carriers thought the relationship between Skype and Microsoft was toxic for them.


                    Originally posted by birdie View Post
                    There's no historical precedence and no validity to anything you're claiming here.
                    ... Can you please tell me, then, why we started using the IP protocol over IPX, AppleTalk, etc.? Denying the historical precedent doesn't make you right, you know.

                    Again, a low barrier to entry leads to a fast adoption curve, if the tech is useful. A walled garden is the antithesis of that. And open source is a low barrier to entry. It does not guarantee success, but it is a prerequisite.

                    Originally posted by birdie View Post
                    CUDA has been growing in usage a lot faster than all the competing open computing standards, e.g., OpenCL and Vulkan compute. You have zero facts to prove anything you're saying. We now have at least five people in this thread who, against all the evidence that "openness" means nothing for the success of a computing platform, continue to argue the opposite without providing any arguments.
                    Actually, we have provided the evidence. Put it another way: would Linux have seen the adoption it did if it had *not* been an open kernel?

                    Originally posted by birdie View Post
                    Normally such people are called buffoons; I'm just curious why their concentration here on Phoronix is so high. Perhaps it's the very marginal state of Linux: people who choose it believe they are smarter than everyone else and keep sharing their "wisdom" and entrepreneurship "skills" without ever having run a major semiconductor company.
                    How come the pro-closed-source camp is the one resorting to name-calling?



                    • #50
                      Originally posted by tildearrow View Post
                      Few people run massive compute applications in a browser. Most run on bare metal.
                      Did you read what I wrote? You can run WebGPU outside of the browser; they just use the browser to standardize the API.

                      Once that is done, they'll use it everywhere.

