GNOME Shell + Mutter 40 Beta Released With Wayland Improvements, Overview Redesign


  • JackLilhammers
    replied
    oiaohm, this is not an answer, because by the time I read the documents you linked this discussion will be old enough that going on would be meaningless, but thank you very much for all the sources!



  • oiaohm
    replied
    Originally posted by JackLilhammers View Post
    No offense, but some of your sentences make little sense. Are you using a translator?
    Sorry, this is native Australian English from a person who started out with non-functional-level dyslexia; treatment only gets you so far. Add in a lot of knowledge in some of these fields and I tend to shorthand things in ways that can make reading hard, I know.

    Originally posted by JackLilhammers View Post
    About CUDA being closed source, of course being able to validate a software would be better.
    Open source is still not a guarantee of anything, but it's definitely a better practice.
    About the performance, CUDA still has superior performance when it's supported.
    Also, at the moment there's just more documentation and stuff for CUDA.
    Just search CUDA and ROCm on Google and count the results...
    It's not that straightforward a Google search.



    The performance argument is not that straightforward. AMD is doing proper integration with FPGAs, and so is Intel with their offering. Both AMD and Intel own an FPGA firm these days; Nvidia does not.

    We are seeing a lot of the historic CUDA market disappear into FPGAs and custom ASICs. Remember that with a custom ASIC the parties using it have 100 percent validation of what is there, all the way down into the silicon. FPGAs and custom ASICs can also use less power for the same performance compared to throwing generic GPU units at the problem. CUDA's superior performance is questionable these days, like it or not.


    Originally posted by JackLilhammers View Post
    I kind of understand what you're saying. Of course integrated GPUs are dominant, but Nvidia is not in that market.
    It's like saying that there are more city cars than racing cars...
    No, this is where you have missed it.

    Originally posted by JackLilhammers View Post
    I don't think so. What I'm saying is that IMHO if hypothetically Linux were to ditch Nvidia support, there would be way more Nvidia users ditching Linux than Linux users ditching Nvidia.
    That's because people usually don't buy their hardware for Linux, but come to Linux from Windows.
    No mistake. Step back and look at the big picture. The majority of those coming from Windows to Linux will not have Nvidia but Intel or AMD integrated graphics, and this is the way it's been for quite some time. Even on a laptop with an Nvidia GPU, the Nvidia GPU is rarely hooked up to the display outputs directly.

    There is also a big case of people not using the Linux desktop because their employer will not allow it.


    Originally posted by JackLilhammers View Post
    Such as?
    I'm taking your word for it but I'm curious.

    Again, assuming you're correct, I'd like to know on what grounds you say this.
    This comes from going through the computer security requirements of countries like the USA, Australia, UK... for different government contracts. But there is one huge elephant in the room.


    This is the Australian requirement; the USA, UK, China... requirements are about the same.
    Security Control: 0428; Revision: 6; Updated: Sep-18; Applicability: O, P, S, TS; Priority: Must
    Systems are configured with a session or screen lock that:
    1. activates after a maximum of 15 minutes of user inactivity, or if manually activated by the user
    2. completely conceals all information on the screen
    I edited that slightly to make it readable. You cannot in fact implement a correctly functional screen locker with the X11 protocol. A Wayland compositor can provide a screen locker that actually works: one that completely conceals the information on the screen, including from all possible screen capture applications, not just what is physically displayed.
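
    A minimal sketch of why that is, assuming a stock X server with no extra access-control extensions: any client that can connect to the display can read back the whole screen with plain core-protocol calls, and nothing in X11 lets a locker forbid it. The file name and output are illustrative only; build with something like: cc capture.c -lX11

    ```c
    /* Illustrative only: any X11 client on the same display can do this,
     * whether or not a screen locker window happens to be mapped on top. */
    #include <X11/Xlib.h>
    #include <X11/Xutil.h>
    #include <stdio.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);      /* connect to the X server */
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }
        Window root = DefaultRootWindow(dpy);
        XWindowAttributes attr;
        XGetWindowAttributes(dpy, root, &attr);

        /* Read back the entire root window. The core protocol performs no
         * permission check here, so a locker cannot stop other clients
         * from capturing screen contents. */
        XImage *img = XGetImage(dpy, root, 0, 0, attr.width, attr.height,
                                AllPlanes, ZPixmap);
        if (img) {
            printf("captured %dx%d screen image\n", img->width, img->height);
            XDestroyImage(img);
        }
        XCloseDisplay(dpy);
        return 0;
    }
    ```

    Under Wayland, screen contents only leave the compositor through interfaces the compositor chooses to expose, which is why a compositor-side lock can actually enforce the "completely conceals" requirement.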

    I could go on with a list of over 1000 points the X11 protocol does not allow you to do that you need in order to operate to government computer security requirements. There are a huge number of users who cannot use Linux not because they don't want to, but because if they did they would breach the security requirements in contracts they have signed. Yes, those users are normally using Intel/AMD integrated GPUs.

    So dropping Nvidia support loses one set of users, but working Wayland support gains another set. Yes, the EGLStreams path Nvidia offered up also fails to meet the screen lock requirements of the Australian rules, so Wayland on Nvidia is not usable if you want to stay on the right side of a contract to handle different government tasks.

    Yes, government requirements force users onto particular software, so as long as governments remain Windows dominated because Linux has had crappy desktop security from X11, this is going to keep happening.

    Originally posted by JackLilhammers View Post
    Is it? Do you have any numbers?
    I wasn't able to find any real data, and tech related news don't talk about egpus becoming the norm...


    The socket here that the writer did not identify is a server socket for 8x PCIe 3.0. But we are going to see more of this. People want thin and light laptops, and thin and light laptops don't match up with really powerful GPUs inside that need lots of cooling. So Asus and others are moving in the direction of external GPUs, and some custom laptops have external PCIe ports on them. Basically, external GPUs are going to become more common.



  • JackLilhammers
    replied
    No offense, but some of your sentences make little sense. Are you using a translator?
    That said, I'll answer what I understood.

    About CUDA being closed source, of course being able to validate a software would be better.
    Open source is still not a guarantee of anything, but it's definitely a better practice.
    About the performance, CUDA still has superior performance when it's supported.
    Also, at the moment there's just more documentation and stuff for CUDA.
    Just search CUDA and ROCm on Google and count the results...

    Originally posted by oiaohm View Post
    Linux users are not always that average. The reality here is that average laptop users don't buy their laptops; their boss buys them, and of course their boss buys custom laptops in big batches. Custom laptops are a good laugh: every year, for every retail laptop sold, there are 10 custom laptops sold to enterprises. Custom laptops are the dominant laptops made, so custom laptops are in fact the norm. It's also a fact that over 80% of all laptops made have either integrated AMD or integrated Intel graphics only. A dedicated GPU inside is rare.
    I kind of understand what you're saying. Of course integrated GPUs are dominant, but Nvidia is not in that market.
    It's like saying that there are more city cars than racing cars...

    Originally posted by oiaohm View Post
    That is a bogus argument.
    I don't think so. What I'm saying is that IMHO if hypothetically Linux were to ditch Nvidia support, there would be way more Nvidia users ditching Linux than Linux users ditching Nvidia.
    That's because people usually don't buy their hardware for Linux, but come to Linux from Windows.

    Originally posted by oiaohm View Post
    There are governments, and companies contracted by governments, that due to security rules cannot use X11 on a desktop.
    Such as?
    I'm taking your word for it but I'm curious.

    Originally posted by oiaohm View Post
    Like it or not, in many of these areas with high security requirements, if you want to run CUDA on a desktop you have to run Windows, because X11 under Linux is not allowed: it does not have a high enough security rating. Yes, Nvidia providing a headless mode for CUDA on Linux was so that CUDA on Linux could still be used in those markets. So there are huge potential markets where the Linux desktop cannot be used purely because of X11's problems.
    Again, assuming you're correct, I'd like to know on what grounds you say this.

    Originally posted by oiaohm View Post
    External GPUs are becoming more the norm in the feature list of custom enterprise laptops, particularly now that we are seeing 8x PCIe external ports.
    Is it? Do you have any numbers?
    I wasn't able to find any real data, and tech related news don't talk about egpus becoming the norm...

    Please, please, write a little better. Otherwise it's really hard to answer.



  • oiaohm
    replied
    Originally posted by JackLilhammers View Post
    Can you?
    I mean, maybe you can, but most of the times, the users of something do not fully understand the inner working of the something they use and work with.
    You can drive a car without knowing how to fix it.
    There are tools, part of what was made for seL4, that allow you to validate right down into the silicon. Yes, it is possible to fully validate ROCm from AMD.

    Yes, if you don't know how to fix a car you employ a mechanic to fix it. But for the mechanic to be able to do his job you need right-to-repair provisions so he can get the tools and information needed to validate that your car is working right. So you can get the car validated by paying someone, but for them to be able to validate it they need access to something from the maker.

    "Next CUDA can you in fact proper validate that it doing its maths right? Closed source is doubled sided problem."

    Let's say I want to pay a PhD in mathematics to properly validate my CUDA setup; the reality is they cannot, because the closed-source blobs get in the way. If I want to pay a PhD in mathematics to properly validate my ROCm setup, they in fact can.

    So with both software and cars you can employ the right person and get validation, but this requires the right level of access from the maker. Nvidia does not provide what is required to build a validated solution. The reality is that Nvidia has mostly been chosen for compute due to having massively higher performance than the other options, but without the performance advantage the question of validation comes up.

    Yes, the way Nvidia recently crippled cryptocurrency mining shows that Nvidia could also do that to your CUDA workload without notice. So your processing is not validated and can be sabotaged by the next firmware update to the card, which you may not be able to reverse.

    Originally posted by JackLilhammers View Post
    Also, last time I checked, average users don't buy custom laptops.
    Linux users are not always that average. The reality here is that average laptop users don't buy their laptops; their boss buys them, and of course their boss buys custom laptops in big batches. Custom laptops are a good laugh: every year, for every retail laptop sold, there are 10 custom laptops sold to enterprises. Custom laptops are the dominant laptops made, so custom laptops are in fact the norm. It's also a fact that over 80% of all laptops made have either integrated AMD or integrated Intel graphics only. A dedicated GPU inside is rare.

    Originally posted by JackLilhammers View Post
    Some Linux users will move away from Nvidia, sure, but it's likely that way more users will move away from Linux.
    That is a bogus argument.

    There are governments, and companies contracted by governments, that due to security rules cannot use X11 on a desktop. So for every user running Nvidia on Linux who disappears, the users on Intel graphics who would then be able to use Linux on desktops and laptops well and truly outnumber them.

    Like it or not, in many of these areas with high security requirements, if you want to run CUDA on a desktop you have to run Windows, because X11 under Linux is not allowed: it does not have a high enough security rating. Yes, Nvidia providing a headless mode for CUDA on Linux was so that CUDA on Linux could still be used in those markets. So there are huge potential markets where the Linux desktop cannot be used purely because of X11's problems.

    Originally posted by JackLilhammers View Post
    Like custom laptops, external gpus are not the norm.
    External GPUs are becoming more the norm in the feature list of custom enterprise laptops, particularly now that we are seeing 8x PCIe external ports.



  • JackLilhammers
    replied
    Originally posted by pal666 View Post
    gamers don't stream. streamers stream. and on our planet every videocard can stream
    And what are those who stream their gaming sessions? Chimeras?
    Of course any decent hardware can stream, but NVENC is the better option at the moment.

    Originally posted by pal666 View Post
    you are again confusing gamers with nvidiots. gamers with brains don't use software upscaling. and software upscaling is not hardware dependent, it will look the same on any hardware
    You're right, gamers would never, ever, turn on a feature that could improve quality or performance, or both.
    It's software, but runs on dedicated hardware.
    Also, I have a feeling that we won't see their neural networks and AI-based algorithms on different hardware. Can't imagine why...
    Btw, if someone comes up with a better implementation of AI-based upscaling, fantastic! Kudos to them!
    Until then, nvidia has a competitive advantage and denying it won't make it disappear.

    Originally posted by pal666 View Post
    lol, most users are using android linux without novideo.
    By the same reasoning most users aren't gamers.

    Originally posted by pal666 View Post
    belief in novideo specialty is what makes you an nvidiot
    Originally posted by pal666 View Post
    because average human is not an nvidiot, unlike you
    You're such an articulate arguer



  • pal666
    replied
    Originally posted by JackLilhammers View Post
    Some Linux users will move away from Nvidia, sure, but it's likely that way more users will move away from Linux.
    lol, most users are using android linux without novideo. because average human is not an nvidiot, unlike you



  • pal666
    replied
    Originally posted by JackLilhammers View Post
    Sure, because gamers don't stream. Ping me when you get back to the Earth. You know, the planet with Twitch...
    gamers don't stream. streamers stream. and on our planet every videocard can stream
    Originally posted by JackLilhammers View Post
    Yes, DLSS2 is software upscaling, but it's awesome and gamers use it.
    you are again confusing gamers with nvidiots. gamers with brains don't use software upscaling. and software upscaling is not hardware dependent, it will look the same on any hardware
    Originally posted by JackLilhammers View Post
    Also, you keep insulting people for no reason at all, but whatever...
    belief in novideo specialty is what makes you an nvidiot
    Last edited by pal666; 26 February 2021, 07:17 PM.



  • qarium
    replied
    Originally posted by oiaohm View Post
    Qaridarium, if I am using gamescope I do not need AMD Super Resolution.
    (Link: ValveSoftware/gamescope on GitHub, the SteamOS session compositing window manager.)

    Using the gamescope method, it is possible to use a custom per-application filter from Valve for the game, one that does not depend on what video card drivers I have installed, other than not Nvidia at this stage.
    The reality here is that if ray tracing tanks my framerate and I have an AMD card and I am on Linux, I don't need AMD Super Resolution to upscale the output of the game.
    Why does game output scaling have to be a vendor-unique thing in the first place? There is no need for that to be the case.
    Right, but DLSS2 is not just about output scaling. Linear interpolation output scaling is very boring and does not need any special hardware.
    DLSS2 is about AI upscaling, to get better quality than boring linear interpolation output scaling.
    Also, the AMD Super Resolution technique, unlike Nvidia's DLSS2, is only a software solution; the only hardware that matters for making it fast is the "Infinity Cache". And AMD Super Resolution doesn't use deep-learning AI upscaling, but much simpler algorithms.
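
    To make the contrast concrete, here is a minimal sketch of the plain bilinear ("linear interpolation") scaling being dismissed as boring: a handful of multiplies per output pixel, with no special hardware and no trained network involved. The helper name and the tiny 2x2 test image are made up for illustration.

    ```c
    #include <stdio.h>

    /* Upscale a grayscale image src (sw x sh) into dst (dw x dh)
     * using plain bilinear interpolation. */
    static void upscale_bilinear(const unsigned char *src, int sw, int sh,
                                 unsigned char *dst, int dw, int dh)
    {
        for (int y = 0; y < dh; y++) {
            float fy = (float)y * (sh - 1) / (dh > 1 ? dh - 1 : 1);
            int y0 = (int)fy;
            int y1 = (y0 + 1 < sh) ? y0 + 1 : y0;
            float wy = fy - y0;
            for (int x = 0; x < dw; x++) {
                float fx = (float)x * (sw - 1) / (dw > 1 ? dw - 1 : 1);
                int x0 = (int)fx;
                int x1 = (x0 + 1 < sw) ? x0 + 1 : x0;
                float wx = fx - x0;
                /* Blend the four neighbouring source pixels. */
                float top = src[y0 * sw + x0] * (1 - wx) + src[y0 * sw + x1] * wx;
                float bot = src[y1 * sw + x0] * (1 - wx) + src[y1 * sw + x1] * wx;
                dst[y * dw + x] = (unsigned char)(top * (1 - wy) + bot * wy + 0.5f);
            }
        }
    }

    int main(void)
    {
        unsigned char src[4] = { 0, 64, 128, 255 };   /* 2x2 source image  */
        unsigned char dst[16];                        /* 4x4 upscaled copy */
        upscale_bilinear(src, 2, 2, dst, 4, 4);
        for (int i = 0; i < 16; i++)
            printf("%3d%c", dst[i], (i % 4 == 3) ? '\n' : ' ');
        return 0;
    }
    ```

    DLSS-style upscalers replace that interpolation step with a trained network running on dedicated matrix hardware, which is where the vendor-specific part comes in.
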
    And you are right, there is no need for vendor lock-in walled-garden bullshit from Nvidia...
    And you are right that on Linux, with open source software and Wayland, there are already solutions for this, like you said: gamescope.

    But I don't understand why you are anti-AMD here; it is 100% clear that AMD Super Resolution will be open source too.





  • JackLilhammers
    replied
    Originally posted by oiaohm View Post
    Next, CUDA: can you in fact properly validate that it is doing its maths right? Closed source is a double-sided problem.
    Can you?
    I mean, maybe you can, but most of the time the users of something do not fully understand the inner workings of the thing they use and work with.
    You can drive a car without knowing how to fix it.

    Originally posted by oiaohm View Post
    This is also not 100 percent true any more. You do get custom-supplied laptops with an AMD Radeon RX 6700M.
    I don't think it ever was 100%. Let's say the vast majority.
    Also, last time I checked, average users don't buy custom laptops.

    Originally posted by oiaohm View Post
    The reality here is that if Nvidia is not going to provide proper Wayland-compatible drivers going forward, Linux users will have to walk away.
    Some Linux users will move away from Nvidia, sure, but it's likely that way more users will move away from Linux.

    Originally posted by oiaohm View Post
    Also please note that a lot of high-performance GPU usage on laptops is moving away from a dedicated GPU in the laptop to an external GPU, due to cooling issues. An external GPU in a box is just a desktop GPU and simple to change.
    Like custom laptops, external GPUs are not the norm.
    Last edited by JackLilhammers; 26 February 2021, 09:57 AM.



  • JackLilhammers
    replied
    Originally posted by pal666 View Post
    they are selling points for nvidiots, not for gamers. nvenc has no relation to gaming and isn't better than competition, dlss is software upscaling for imbeciles, current raytracing is "pay to tank both fps and image quality"
    Sure, because gamers don't stream. Ping me when you get back to the Earth. You know, the planet with Twitch...
    Yes, DLSS2 is software upscaling, but it's awesome and gamers use it.
    I'd say you need a little reality check.

    Also, you keep insulting people for no reason at all, but whatever...

