GNOME Shell + Mutter 40 Beta Released With Wayland Improvements, Overview Redesign
oiaohm this is not an answer because by the time I read the documents you linked this discussion will be old enough that going on would be meaningless, but thank you very much for all the sources!
Originally posted by JackLilhammers: No offense, but some of your sentences make little sense. Are you using a translator?
Originally posted by JackLilhammers: About CUDA being closed source, of course being able to validate a piece of software would be better.
Open source is still not a guarantee of anything, but it's definitely the better practice.
About the performance, CUDA still has superior performance when it's supported.
Also, at the moment there's just more documentation and stuff for CUDA.
Just search CUDA and ROCm on Google and count the results...
[Linked article: Now that the dust from Nvidia's unveiling of its new Ampere AI chip has settled, let's take a look at the AI chip market behind the scenes and away from the spotlight.]
The performance argument is not straightforward. AMD is doing proper integration with FPGA, and so is Intel with its offering. Both AMD and Intel own an FPGA firm these days; Nvidia does not.
We are seeing a lot of the historic CUDA market disappear into FPGAs and custom ASICs. Remember that with a custom ASIC, the parties using it have 100 percent validation all the way down into the silicon of what is there. FPGAs and custom ASICs can also deliver the same performance for less power compared to throwing generic GPU units at the problem. CUDA's superior performance is questionable these days, like it or not.
Originally posted by JackLilhammers: I kind of understand what you're saying. Of course integrated GPUs are dominant, but Nvidia is not in that market.
It's like saying that there are more city cars than racing cars...
Originally posted by JackLilhammers: I don't think so. What I'm saying is that IMHO if hypothetically Linux were to ditch Nvidia support, there would be way more Nvidia users ditching Linux than Linux users ditching Nvidia.
That's because people usually don't buy their hardware for Linux, but come to Linux from Windows.
There is a big class of people who are not using the Linux desktop because their employer will not allow it.
Originally posted by JackLilhammers: Such as?
I'm taking your word for it but I'm curious.
Again, assuming you're correct, I'd like to know on what ground you say this.
This is the Australian requirement; the USA, UK, China, etc. requirements are about the same.
Security Control: 0428; Revision: 6; Updated: Sep-18; Applicability: O, P, S, TS; Priority: Must
Systems are configured with a session or screen lock that:
1. activates after a maximum of 15 minutes of user inactivity, or if manually activated by the user
2. completely conceals all information on the screen
I could go on with a list of over 1000 points that the X11 protocol does not allow you to do, yet which you are required to do to operate under government computer-security requirements. There are a huge number of users who cannot use Linux, not because they don't want to, but because if they used it they would breach security requirements in contracts they have signed. Yes, those users are normally using Intel/AMD integrated GPUs.
So dropping Nvidia support loses one set of users, but working Wayland support gains another set. And the EGLStreams path Nvidia offered also fails to meet the screen-lock requirements of the Australian rules, so Nvidia on Wayland is not usable while staying on the right side of a contract to handle various government tasks.
Yes, government requirements force users onto particular software, so while government desktops are Windows-dominated because Linux has had poor desktop security from X11, this is going to keep on happening.
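For what it's worth, on a Wayland compositor this particular control can be met with stock tooling. A minimal sketch (my illustration, not from the standard: it assumes Sway with swayidle and swaylock installed; the keybinding and colour are arbitrary choices):

```
# ~/.config/sway/config (fragment)
# Lock after 900 s (15 min) of inactivity; a solid-colour swaylock
# surface completely conceals all information on the screen.
exec swayidle -w \
    timeout 900 'swaylock -f -c 000000' \
    before-sleep 'swaylock -f -c 000000'
# Manual activation by the user:
bindsym $mod+l exec swaylock -f -c 000000
```

Under Wayland, the locker owns the whole output, so no client content leaks through, which is the property X11 cannot guarantee.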
Originally posted by JackLilhammers: Is it? Do you have any numbers?
I wasn't able to find any real data, and tech related news don't talk about egpus becoming the norm...
The socket here, which the writer did not identify, is a server socket for 8x PCIe 3.0. But we are going to see more of this. People want thin and light laptops, and thin and light laptops don't match up with really powerful GPUs inside that need lots of cooling. So Asus and others are moving in the direction of external GPUs, and some custom laptops already have external PCIe ports. Basically, external GPUs are going to become more common.
No offense, but some of your sentences make little sense. Are you using a translator?
That said, I'll answer what I understood.
About CUDA being closed source, of course being able to validate a piece of software would be better.
Open source is still not a guarantee of anything, but it's definitely the better practice.
About the performance, CUDA still has superior performance when it's supported.
Also, at the moment there's just more documentation and stuff for CUDA.
Just search CUDA and ROCm on Google and count the results...
Originally posted by oiaohm: Linux users are not always that average. The reality here is that average laptop users don't buy their laptops; their boss buys them, and of course the boss buys custom laptops in big batches. Custom laptops are a good laugh: every year, for every retail laptop sold, roughly ten custom laptops are sold to enterprises, so custom laptops are in fact the dominant laptops made. It's also a fact that over 80% of all laptops made have only integrated AMD or integrated Intel graphics. A discrete GPU inside is rare.
It's like saying that there are more city cars than racing cars...
Originally posted by oiaohm: That is a bogus argument.
That's because people usually don't buy their hardware for Linux, but come to Linux from Windows.
Originally posted by oiaohm: There are governments, and companies contracted by governments, that due to the security rules cannot use X11 on a desktop.
I'm taking your word for it but I'm curious.
Originally posted by oiaohm: Like it or not, in many of these areas with high security requirements, if you want to run CUDA on a desktop you have to run Windows, because X11 under Linux is not allowed: it does not have a high enough security rating. Yes, Nvidia provided a headless mode for CUDA on Linux so that CUDA on Linux could still be used in those markets. So there are huge potential markets where the Linux desktop cannot be used purely because of X11 problems.
Originally posted by oiaohm: External GPUs are becoming more the norm in the feature list of custom enterprise laptops, particularly now that we are seeing external 8x PCIe.
I wasn't able to find any real data, and tech related news don't talk about egpus becoming the norm...
Please, please, write a little better. Otherwise it's really hard to answer.
Originally posted by JackLilhammers: Can you?
I mean, maybe you can, but most of the times, the users of something do not fully understand the inner working of the something they use and work with.
You can drive a car without knowing how to fix it.
Yes, if you don't know how to fix a car you employ a mechanic to fix it. For the mechanic to be able to do his job, you need right-to-repair provisions so that he can get the tools and information to validate that your car is working right. So you get the car validated by paying someone, but for them to be able to validate it, the maker has to provide something.
"Next, can you in fact properly validate that CUDA is doing its maths right? Closed source is a double-sided problem."
Let's say I want to pay a PhD in mathematics to properly validate my CUDA setup; the reality is they cannot, because the closed-source blobs get in the way. If I want to pay a PhD in mathematics to properly validate my ROCm setup, they in fact can.
So for both software and cars you can employ the right person and get validation, but this requires the right level of access from the maker, and Nvidia does not provide what is required to make a validated solution. In reality, Nvidia has mostly been chosen for compute due to having massively higher performance than the other options; without the performance advantage, the question of validation comes up.
Yes, the way Nvidia recently crippled cryptocurrency mining shows that Nvidia could also do that to your CUDA workloads without notice. So your processing is not validated, and it can be sabotaged by the next firmware update to the card, which you may not be able to reverse.
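To make the validation point concrete, here is a sketch of mine (not from the thread) of the kind of black-box cross-check a hired mathematician could run: compare a candidate accelerated routine against a trusted open reference on random inputs. A check like this catches wrong answers from either a closed or an open stack, but only source access lets you audit how the result is produced:

```python
import random

def matmul(a, b):
    """Naive, trusted reference matrix multiply (pure Python)."""
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][t] * b[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

def validate(candidate_matmul, trials=5, tol=1e-9, seed=0):
    """Black-box check: compare a candidate matmul (e.g. a GPU kernel)
    against the reference on random inputs, within a tolerance."""
    rng = random.Random(seed)
    for _ in range(trials):
        a = [[rng.uniform(-1, 1) for _ in range(8)] for _ in range(6)]
        b = [[rng.uniform(-1, 1) for _ in range(7)] for _ in range(8)]
        want = matmul(a, b)
        got = candidate_matmul(a, b)
        if any(abs(g - w) > tol
               for gr, wr in zip(got, want)
               for g, w in zip(gr, wr)):
            return False
    return True

# A faithful implementation passes the cross-check.
print(validate(matmul))  # True
```

A sabotaged routine (say, one that silently perturbs results after a firmware update) would fail the same check, which is the scenario described above.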
Originally posted by JackLilhammers: Also, last time I checked, average users don't buy custom laptops.
Originally posted by JackLilhammers: Some Linux users will move away from Nvidia, sure, but it's likely that way more users will move away from Linux.
There are governments, and companies contracted by governments, that due to the security rules cannot use X11 on a desktop. So for every Nvidia-on-Linux user who disappears, the number of users on Intel graphics who would then be able to use Linux desktops and laptops well and truly outnumbers them.
Like it or not, in many of these areas with high security requirements, if you want to run CUDA on a desktop you have to run Windows, because X11 under Linux is not allowed: it does not have a high enough security rating. Yes, Nvidia provided a headless mode for CUDA on Linux so that CUDA on Linux could still be used in those markets. So there are huge potential markets where the Linux desktop cannot be used purely because of X11 problems.
Originally posted by JackLilhammers: Like custom laptops, external GPUs are not the norm.
Originally posted by pal666: gamers don't stream. streamers stream. and on our planet every videocard can stream
Of course all decent hardware can stream, but NVENC is the better option at the moment.
Originally posted by pal666: you are again confusing gamers with nvidiots. gamers with brains don't use software upscaling. and software upscaling is not hardware dependent, it will look the same on any hardware
It's software, but it runs on dedicated hardware.
Also, I have a feeling that we won't see their neural networks and AI-based algorithms on different hardware. Can't imagine why...
Btw, if someone comes up with a better implementation of AI-based upscaling, fantastic! Kudos to them!
Until then, Nvidia has a competitive advantage, and denying it won't make it disappear.
Originally posted by pal666: lol, most users are using android linux without novideo.
Originally posted by pal666: belief in novideo specialty is what makes you an nvidiot
Originally posted by pal666: because average human is not an nvidiot, unlike you
Originally posted by JackLilhammers: Some Linux users will move away from Nvidia, sure, but it's likely that way more users will move away from Linux.
Originally posted by JackLilhammers: Sure, because gamers don't stream. Ping me when you get back to the Earth. You know, the planet with Twitch...
Originally posted by JackLilhammers: Yes, DLSS2 is software upscaling, but it's awesome and gamers use it.
Originally posted by JackLilhammers: Also, you keep insulting people for no reason at all, but whatever...
Last edited by pal666; 26 February 2021, 07:17 PM.
Originally posted by oiaohm: Qaridarium, if I am using gamescope, I do not need AMD Super Resolution.
[Link: ValveSoftware/gamescope on GitHub, the SteamOS session compositing window manager.]
Using the gamescope method, it's possible to use a custom per-application filter from Valve, made for the game, that is not dependent on which video card drivers I have installed, other than not Nvidia at this stage.
The reality here is that if raytracing tanks my framerate, and I have an AMD card and I am on Linux, I don't need AMD Super Resolution to upscale the output of the game.
Why does game output scaling have to be a vendor-unique thing in the first place? There is no need for that to be the case.
DLSS2 is about AI upscaling, to get better quality than boring linear-interpolation output scaling.
Also, the AMD Super Resolution technique, unlike Nvidia's DLSS2, is a software-only solution; the only hardware that matters for making it fast is the Infinity Cache. AMD Super Resolution also doesn't use deep-learning AI upscaling, but much simpler algorithms.
And you are right, there is no need for vendor lock-in walled-garden bullshit from Nvidia...
And you are right that on Linux, with open-source software and Wayland, there are already solutions for this, like the gamescope you mentioned.
But I don't understand why you are anti-AMD here; it is 100% clear that AMD Super Resolution will be open source too.
Originally posted by oiaohm: Next, can you in fact properly validate that CUDA is doing its maths right? Closed source is a double-sided problem.
Can you?
I mean, maybe you can, but most of the time the users of something don't fully understand the inner workings of the thing they use and work with.
You can drive a car without knowing how to fix it.
Originally posted by oiaohm: This is also not 100 percent true any more. You can get laptops custom-supplied with an AMD Radeon RX 6700M.
Also, last time I checked, average users don't buy custom laptops.
Originally posted by oiaohm: The reality here is that if Nvidia is not going to provide proper Wayland-compatible drivers going forwards, Linux users will have to walk away.
Originally posted by oiaohm: Also please note that a lot of high-performance GPU usage in laptops is moving away from a dedicated GPU in the laptop to an external GPU, due to cooling issues. An external GPU in a box is just a desktop GPU, and simple to change.
Last edited by JackLilhammers; 26 February 2021, 09:57 AM.
Originally posted by pal666: they are selling points for nvidiots, not for gamers. nvenc has no relation to gaming and isn't better than competition, dlss is software upscaling for imbeciles, current raytracing is "pay to tank both fps and image quality"
Yes, DLSS2 is software upscaling, but it's awesome and gamers use it.
I'd say you need a little reality check.
Also, you keep insulting people for no reason at all, but whatever...