NVIDIA Adding Experimental Vulkan Support For Executing CUDA Binaries
Originally posted by birdie View Post
CUDA can and will be reverse-engineered and made to work with AMD and Intel GPUs if NVIDIA ceases to exist. This has already happened to Glide (a proprietary OpenGL-like API from 3dfx) and Mantle (a proprietary graphics API from AMD). There's nothing magical about CUDA; however, given that CUDA works so well with tons of NVIDIA GPUs, no one bothers.
This looks like a comment on my comment, since it is quoting me, but it doesn't seem to address any of the points that I brought up.
Last edited by pracedru; 12 May 2021, 06:23 AM.
Originally posted by Qaridarium View Post
dude, I have monitored all computer tech for over 25 years...
Originally posted by Qaridarium View Post
I was right in predicting the future so many times I can't even count them anymore.
Originally posted by Qaridarium View Post
I told you that you think I am wrong because you have no imagination of the future.
You focus on what is right now and on the past. I'll give you that; that's a skill too.
But your ability to extrapolate the future is plain and simple non-existent.
Originally posted by Qaridarium View Post
All I said is true: you pay €1100 more to get an NVIDIA card, and that alone should make you wonder.
That AMD is already faster in FPS per watt should make you wonder.
Originally posted by Qaridarium View Post
That there is a raytracing implementation in which AMD is much faster than NVIDIA should make you wonder.
Of course NVIDIA will push every game developer on earth to make sure the implementation that favors NVIDIA is used,
but the chance that developers ignore the huge success of the PlayStation 5 and Xbox is ZERO...
Originally posted by Qaridarium View Post
NVIDIA has been on Samsung's 8nm node for a long, long time, and RDNA3 on 5nm will be their downfall.
Making it 40-60% faster at the same price and the same power consumption is absolutely no problem on 5nm.
Originally posted by Qaridarium View Post
The Linux kernel removing NVIDIA-closed-source-only stuff means NVIDIA will have a hard time doing closed-source drivers on Linux. This alone means that NVIDIA will no longer be a good option for Linux.
Originally posted by Qaridarium View Post
As soon as WebGPU compute hits the market, CUDA will be gone...
Last edited by birdie; 12 May 2021, 10:02 AM.
WebGPU compute, ahahahah. Meanwhile, entire graphics APIs like Vulkan or DX12 have moved so low-level that they literally shouldn't ever be exposed to the web.
Qaridarium is such a troll I can't read his posts without laughing. Nvidia used Samsung because if Nvidia hadn't used Samsung there would be 5-6 times fewer GPUs on the market, when there already aren't enough GPUs on the market. That is realistically a genius move by Nvidia, as they got the capacity to sell a lot of hardware, probably at Samsung's cheaper production prices. Imagine a world where a 3090 costs $10k; yup, that's what would happen if Nvidia were using TSMC, due to no capacity.
Linux won't remove the Nvidia bits in the kernel because that would be a self-killing move. Oh great, you want your system to have even less market share on desktops, and a lot less in supercomputers as well.
The reason CUDA is popular is that it is easy to develop for, has had good support for many years (since the 8800 GTX), and has a ton of libraries written for it, often with Nvidia's own backing. Neither OpenCL nor Vulkan compute is comparably easy to use: both have fewer libraries, OpenCL has a shaky past with versions, and Vulkan compute is relatively new and definitely the hardest.
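To illustrate the "easy to develop for" point, here is a minimal sketch (not from the thread) of a complete CUDA vector-add program. A single source file compiled with nvcc covers allocation, kernel launch, and synchronization; the equivalent OpenCL program additionally needs explicit platform, device, context, queue, and kernel-compilation setup before any work can run.

```cuda
#include <cstdio>

// GPU kernel: each thread adds one pair of elements.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1024;
    float *a, *b, *c;
    // Unified memory is accessible from both host and device,
    // so no explicit host<->device copies are needed here.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; i++) { a[i] = float(i); b[i] = 2.0f * i; }

    // Launch enough 256-thread blocks to cover all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[10] = %f\n", c[10]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Whether this counts as "easy" is of course the argument of the thread, but the low ceremony compared to OpenCL's host-side boilerplate is the usual basis for the claim.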
Originally posted by birdie View Post
**BUT ANDROID IS CLOSED SO IT DOESN'T COUNT** (paraphrased to what I read)
As a company, the closed route is bound to hurt you long-term, even though you may gain great short-term benefits. That does not mean you cannot capture a niche market or two, and it does not mean you cannot live comfortably on that niche, like Apple does.
But look at other things, like the IP protocol (which won because of open access and easy implementation) and the PC market in the nineties: there is a reason the PC won over everything else, and that is open standards and platforms. Not completely open, true, but open enough to allow an ecosystem to live on top of those computers.
With CUDA there is no chance for an ecosystem to grow on top of it that is not reliant on a single vendor, the almighty Nvidia.
Originally posted by zxy_thf View Post
Can you list the market share of CUDA in general computing?
No, you dare not. CUDA by itself is the punch-in-your-face example against your "open" BS.
Windows did the same 20 years ago, and it still dominates the desktop and gaming markets.
Cope with the fact that Steam's Linux share has never reached 1%.
Originally posted by zxy_thf View Post
Again, it's nice to have a functioning brain. Until you do, you won't have any idea what's happening in GPU computing.
Your "open" propaganda never yields users for the open system, because anyone who has used GPU computing will soon realize NVIDIA is pain-free, AMD is frustrating (ROCm on Navi when?), and Intel is nowhere to be bought (Xe cards when?).
Novices who believed the open propaganda tried it and then headed to NVIDIA. End of story.
Originally posted by zxy_thf View Post
"Being open" by itself never contributes to user experience or market share directly.
It's the ecosystem that contributes to the market share, and if you ever woke up from your Android wet dream you would instantly realize that it's manufacturers' cheap phones, NOT Google, that are the determining factor.
Who f-ing cares whether the system releases its source code or not when buying a phone? Is everyone on this planet a software engineer?
Originally posted by zxy_thf View Post
By far, it's very safe to say NVIDIA is the sole pillar of the ecosystem, and the remaining players are hopelessly building their own.
Originally posted by wertigon View Post
Android went with the route of licensing their technology to third parties, and at the time was much more open than their competitors while providing the most viable competition to the market leader (iPhone). That is why they succeeded where others, most notably Windows Mobile, failed.
Again:
- The Android Linux kernel is extremely closed.
- The Android userspace without proprietary components is barely usable.
Originally posted by wertigon View Post
As a company, the closed route is bound to hurt you long-term, even though you may gain great short-term benefits. That does not mean you cannot capture a niche market or two, and it does not mean you cannot live comfortably on that niche, like Apple does.
But look at other things, like the IP protocol (which won because of open access and easy implementation) and the PC market in the nineties: there is a reason the PC won over everything else, and that is open standards and platforms. Not completely open, true, but open enough to allow an ecosystem to live on top of those computers.
Originally posted by wertigon View Post
With CUDA there is no chance for an ecosystem to grow on top of it that is not reliant on a single vendor, the almighty Nvidia.
Normally such people are called buffoons. I'm just curious why their concentration here on Phoronix is so high. Perhaps it's due to the very marginal state of Linux: people who choose it believe they are smarter than everyone else and continue to share their "wisdom" and entrepreneurial "skills" without ever having run a major semiconductor company.
Last edited by birdie; 12 May 2021, 10:00 AM.
Originally posted by birdie View Post
Android is not free software
Originally posted by birdie View Post
There's no historical precedent and no validity to anything you're claiming here.
Again, low barriers to entry lead to a fast adoption curve, if the tech is useful. A walled garden is the antithesis of that. And open source is a low barrier to entry. It does not guarantee success, but it is a prerequisite.
Originally posted by birdie View Post
CUDA has been growing in usage a lot faster than all the competing open computing standards, e.g. OpenCL or Vulkan compute. You have zero facts to prove anything that you're saying. We now have at least five people in this thread who, against all the evidence that "openness" means nothing for the success of a computing platform, continue to argue the opposite without providing any arguments.
Originally posted by birdie View Post
Normally such people are called buffoons. I'm just curious why their concentration here on Phoronix is so high. Perhaps it's due to the very marginal state of Linux: people who choose it believe they are smarter than everyone else and continue to share their "wisdom" and entrepreneurial "skills" without ever having run a major semiconductor company.
Originally posted by tildearrow View Post
Few people run massive compute applications in a browser. Most run on bare metal.
After this is done, they will use it everywhere.
Phantom circuit Sequence Reducer Dyslexia