Valve Working On Explicit Sync Support For "NVK" NVIDIA Vulkan Driver

  • #81
    Originally posted by mSparks View Post
    Exclusives are the software developer's choice.
    I can quite imagine Nvidia gouging Nintendo to the point that they move to AMD. Nvidia will be well aware they are the primary reason for the Switch's success, and they are as ruthless as they are smart.
    The question then is who Nvidia would give exclusive access to the world's best GPU software instead; Valve/SteamOS seems like a very likely contender.
    RDNA3 had 17% higher raytracing performance compared to RDNA2, and RDNA4 has 40% higher raytracing performance compared to RDNA3...

    So what is the benefit of Nvidia hardware for Valve? Keep in mind FSR 4.0 will use the matrix AI acceleration cores on RDNA3/RDNA4.
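Taken at face value, those two generational figures compound. A quick sketch (the percentages are the ones quoted in this thread, not independently measured):

```python
# Compound ray-tracing uplift implied by the quoted generational gains:
# +17% from RDNA2 to RDNA3 and +40% from RDNA3 to RDNA4.
rdna2_to_rdna3 = 1.17
rdna3_to_rdna4 = 1.40

total_uplift = rdna2_to_rdna3 * rdna3_to_rdna4
print(f"RDNA2 -> RDNA4 ray tracing: ~{total_uplift:.2f}x")
```

So if both figures hold, RDNA4 would sit at roughly 1.64x RDNA2's ray-tracing performance.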
    Phantom circuit Sequence Reducer Dyslexia



    • #82
      Originally posted by Gusar View Post
      Hell no. The Switch is successful because of Nintendo's IP - Mario, Zelda, Pokemon - and a whole bunch of simpler-looking platformers/metroidvanias/pixel-look RPGs, aka games people like to play on the go.
      There's the occasional big game like Witcher 3, Nier Automata or Crisis Core, but big sacrifices had to be made to get them running on the Switch in terms of texture quality, resolution or overall art style (Witcher replaces the realistic environments of other platforms with a painterly, fairytale look on the Switch).
      Pokemon sells because it's Pokemon, not because it runs on a Tegra. In fact, Scarlet/Violet were criticized a lot for their crappy look and poor performance. They're awesome gameplay-wise though; I've spent 92 hours in Violet.
      Really, performance is the factor least responsible for the Switch's success; no one cares that it runs a Tegra. Others are entirely correct when they say the Switch was considered weak when it was new and is now considered super weak. Basically, the Switch is successful *despite* its hardware, not because of it.
      H100 is enterprise-grade AI hardware. Bringing it up in a discussion about handhelds is... basically nuts. It's two completely different worlds.
      You are absolutely right... and mSparks most likely has stock market shares of Nvidia; he will never admit this.

      He claims Nvidia is superior because of raytracing performance, but RDNA4 has 40% higher raytracing performance than RDNA3... he will never admit that this literally makes Nvidia obsolete for gamers.



      • #83
        Originally posted by illwieckz View Post
        That's exact proof that Nvidia's Tegra cannot really compete, and that it is a solution you should not look at by default, only as a second option if your need is specific, if your need is a niche.
        On gaming consoles AMD has been king forever; they are very hard to beat.
        To beat AMD on gaming consoles, a maker has to tailor a solution for a niche, which is what Nvidia did with the Tegra.
        All powerful current consoles (Xbox Series, PS5) are using AMD APUs.
        Previous generations (Xbox One, PS4) were already using AMD APUs.
        In older generations the Xbox 360 already had an ATI Radeon GPU.
        On the Nintendo side, the Wii U had an ATI Radeon; even the GameCube had an ATI Radeon.
        That story goes back a very long time. ATI had the experience, and AMD bought ATI for their Fusion project, the Fusion project being exactly the project of making those APUs.
        AMD has been working on this APU strategy on purpose for 20 years.
        The exceptions? The first Xbox had an Nvidia GPU. Why? Because ATI attempted a poker bluff by quoting an inflated price to milk Microsoft, and lost the bet. The PS3 had an Nvidia GPU; I acknowledge I don't know why (Nvidia isn't that bad anyway). Then came the Switch.
        Let's talk about the Switch. It is more a tablet-range device than other consoles, so tablet-range hardware fits its needs better. At the time Nintendo was working on the Switch, Nvidia happened to be right in the spot with a product ready for it: its ARM-based Tegra solution.

        Nvidia in gaming consoles is just about fitting specific niches at specific times, like the Nintendo Switch, very exotic hardware like the PS3, or funny commercial moves like the first Xbox.
        To make a game console today, one should first look at AMD by default, then check whether by chance some other brand happens to have a niche product that fits the specific need better. It may happen; it has happened multiple times. But that choice only comes when you define your product in enough detail that it becomes a niche which may, by luck, fit an Nvidia niche.
        Both Nvidia and Intel are late on this topic. Intel is closing the gap on the graphics side with their Arc solutions, and Nvidia is closing the gap on the CPU side with their Tegra solutions, but they are the late competitors. On the console gaming side, the only chances Nvidia and Intel had, and have, are niche products or specific commercial moves: namely the Switch and the first Xbox.
        Valve is a PC-gaming company, like Microsoft was a PC-software company when they released their first Xbox. For their first console, Microsoft went with an OS derived from the PC platform they knew well (NT) and PC-like hardware, with an Intel Pentium CPU and an Nvidia GeForce GPU (it could have been an ATI one). Valve is doing exactly the same as Microsoft with their Steam hardware: a well-known PC OS (Linux) and PC components (an AMD APU). So if Valve were to go for another brand than AMD, there is a better chance Valve would go for an Intel CPU with integrated Arc. Pairing an Nvidia GPU with an AMD or Intel CPU would be wasted effort (and thus money), and since Nvidia has no Intel-compatible CPU, this cannot be an option.
        Even confidential products like the Atari VCS went the AMD APU way. Because AMD is king at making game consoles, it is the default choice: they have been the APU maker for 20 years, they are the Fusion project dudes, this is their strategy. AMD should be everyone's first choice when making a game console; only afterwards, when defining the product and the niche, may others like Nvidia get their chance, but only as a second step.
        The only situation where Nvidia stands a chance in a Valve context is a product with a removable GPU, basically the Steam Box line they tried to build up at first, and that they may revive one day. But the Tegra is not a dedicated GPU to plug next to a separate CPU; the Tegra is not a solution for a Steam Box.
        Both Intel and AMD can be options for a Steam Deck or a Steam Box, whether as APUs, CPUs, or GPUs.
        The Nvidia Tegra can't be an option for a Steam Deck or a Steam Box.
        Non-Tegra Nvidia can only be an option as the GPU of a Steam Box, paired with an AMD or Intel CPU.
        If Valve is planning some Nvidia-based product, it can't be based on the Tegra, unless Nvidia partners with Intel for the CPU cores… But because Intel is seriously attempting to compete with Nvidia on performance graphics chips, the odds of a mixed Intel + Nvidia APU are very, very low. Intel once partnered with AMD for embedded graphics in their CPUs, so one should never say never, but I strongly doubt Valve is in a position to force Intel to let Nvidia embed an Intel CPU in their Tegras, really.
        So basically, if your product is not a Switch, the Tegra is not for you. Valve has no reason to make something lighter than a Steam Deck. The only option for Valve to put an Nvidia chip in a product is a dedicated GPU in something larger than a Steam Deck, basically a Steam Box. A laptop line could also work for an Nvidia GPU in a Valve product (basically a mobile Steam Box).
        A laptop with a dedicated GPU is in fact dead with the AMD Strix Halo mega-APU and its 2,560-shader GPU.



        • #84
          Originally posted by qarium View Post

          You are absolutely right... and mSparks most likely has stock market shares of Nvidia; he will never admit this.

          He claims Nvidia is superior because of raytracing performance, but RDNA4 has 40% higher raytracing performance than RDNA3... he will never admit that this literally makes Nvidia obsolete for gamers.


          Experience the critically acclaimed and award-winning Portal™ reimagined with ray tracing in this free DLC for Portal owners. Start thinking with portals, with RTX on.



          • #85
            Originally posted by mSparks View Post

            I'm not sure what point you think you are making, but it is certainly no longer that Nvidia GPUs can't compete with AMD.

            it almost seems to be sinking in that AMD finally being able to compete today with the Nvidia chip in the 2017 Switch doesn't make AMD competitive.
            My point is that saying "AMD isn't usable or competitive" is simply not true.
            The problem stems from Cycles benchmarks that were sometimes made well over a decade ago. Blender stores the render settings in the project file, and new features aren't automatically enabled when opening older projects. This also applies to Cycles benchmarks, meaning that most benchmarks run with settings that are no longer the default for a new project and, as a result, don't have features enabled (like OIDN and adaptive sampling) that would improve render times a lot but aren't reflected in the samples/min that Opendata measures.

            Opendata in and of itself has also always been questionable: the samples/min figure is an estimated value and sometimes differs wildly even on the same card, with the RTX 4090 scoring anywhere between 12213.68 and 6034.64 (HALF!), and no, not the laptop version (though that is down there too):
            [Attachment: Screenshot from 2024-05-11 08-55-50.png]
            Not to mention, as said before, the estimated samples/min is never indicative of actual render speeds. If Opendata is to be believed, the RTX 4070 Ti (non-Super) would have around 57% of the performance of an RTX 4090, but in nearly every actual render benchmark the RTX 4070 Ti only renders about 30% slower than it.
            And again, there are multiple cases of older cards beating their Ti or Super variants. Even with Blender 4.1 you can find the RTX 2080 (2329.41) outperforming the 2080 Super (2305.92), and as stated before, with the Blender 4.0 Opendata benchmarks this was the case for the RTX 3090 vs. 3090 Ti and the 4080 vs. the 4080 Super. That makes no sense: while it is the same silicon, the Ti and Super variants have more shader units, tensor cores and RT cores, and as a result always a bit more performance (yet in Opendata they somehow score less, even with over 200 benchmarks submitted for each).
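The arithmetic behind both of these complaints can be sketched directly from the numbers quoted above (illustrative only; the scores are the ones cited in this thread, not re-measured):

```python
# Run-to-run spread on the same card: the two RTX 4090 scores quoted above.
high_score = 12213.68
low_score = 6034.64
spread = low_score / high_score
print(f"Low 4090 score is {spread:.0%} of the high one")  # about half

# Samples/min vs. real render time: a card at 57% of another's samples/min
# should naively take 1/0.57 of the time, i.e. roughly 75% longer, which is
# far from the ~30% slower observed in actual render benchmarks.
samples_ratio = 0.57
naive_slowdown = 1 / samples_ratio - 1
print(f"Naive slowdown predicted from samples/min: {naive_slowdown:.0%}")
```

The gap between the predicted ~75% and the observed ~30% is exactly why samples/min alone is a poor proxy for real render times.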

            And again, this is all apart from the fact that Cycles performance isn't everything, nor does it carry over to the other two render engines in Blender: Eevee (the default render engine) and Workbench (the render engine used by the viewport). There are a ton of use cases for Blender that don't rely on Cycles at all, like 3D sculpting, game character design and 3D-print design. Other use cases like still renders and animation renders can be done in Eevee, and much faster (and tend to be recommended to be done in Eevee as well), with Eevee Next possibly replacing Cycles for even more projects. And as linked before, AMD's Eevee performance is about what you would expect for the price, all the while you need to worry less about VRAM.

            That isn't to say AMD is a clear winner, more that depending on your use case, AMD might be the better option. Someone who makes miniatures for 3D printing is better off with AMD due to the much higher price/performance in the viewport, while not relying on Cycles performance at all (which, to reiterate, is what most tech reviewers and Opendata measure). Someone who does archviz using Cycles would be better off with Nvidia. The focus on purely Cycles benchmarks doesn't give the whole picture and would understandably make someone think that Blender only benefits from Nvidia, but the truth is that it depends on your use case inside Blender.



            • #86
              Originally posted by tenchrio View Post
              My point is that saying "AMD isn't usable or competitive" is simply not true.
              And no one said any such thing, afaik.
              The PS5 and Xbox, both AMD, are competing fairly well now...

              against the Nintendo Switch...

              Nvidia silicon from 2017...

              However, there is a huge difference between "can't compete" and "hasn't been trying."

              Just because Mike Tyson hasn't been in a boxing match for a very long time doesn't mean he wouldn't kick your ass with a single flick of his finger.

              You only need to compare Nvidia from 2017 with Nvidia in 2024 to know what the outcome of a fight between the PS5/Xbox and a mass-market gaming device based on post-2024 Nvidia would be.




              • #87
                Originally posted by mSparks View Post

                And no one said any such thing, afaik.
                The PS5 and Xbox, both AMD, are competing fairly well now...

                against the Nintendo Switch...

                Nvidia silicon from 2017...

                However, there is a huge difference between "can't compete" and "hasn't been trying."

                Just because Mike Tyson hasn't been in a boxing match for a very long time doesn't mean he wouldn't kick your ass with a single flick of his finger.

                You only need to compare Nvidia from 2017 with Nvidia in 2024 to know what the outcome of a fight between the PS5/Xbox and a mass-market gaming device based on post-2024 Nvidia would be.
                Isn't the Switch's success more down to Nintendo?
                Nintendo has always been able to make consoles with weaker hardware outsell their competitors.
                The Nintendo Wii also outsold the PS3 and Xbox 360, and it didn't even have Nvidia hardware.

                Raw performance has never mattered in the console race; consoles with better hardware lose all the time, either due to a lack of good games or simply being a lot more expensive for the consumer. And in the current generation a lot of games for the PS5 and Xbox are being ported to PC, with the PS5 still having an actual use case, as games arrive there earlier and it can still run a lot of the older PS4 exclusives, while Microsoft is closing game studios. The Switch, meanwhile, has a ton of exclusives and is a portable console; it's a bit of an apples-and-oranges comparison.

                Nvidia made their own set of Shield tablets (all with earlier iterations of the Tegra chips) and the Nvidia Shield TV, which uses the same Tegra X1 chip as the Nintendo Switch. None of them were anywhere near as successful as the Nintendo Switch itself, despite sporting similar hardware and being able to easily emulate older Nintendo games up to and including the Wii. And mind you, these were released before the Nintendo Switch and marketed by Nvidia themselves.

                They actually did some pretty amazing things for it, like porting Half-Life 2 and Portal to Android just for the Shield, but as far as I can find, no sales numbers were ever reported for the Nvidia Shield products. The Shield tablets didn't get a refresh after the K1 in 2015 (makes sense, as they would have had to compete with the Switch in 2017 and they weren't exactly a huge success to begin with), and the Shield TV, which was the first device to hit the market with the Tegra X1, hasn't had a refresh since 2019 (it was first released in 2015 and got refreshes in 2017 and 2019, but nothing since). The Switch 2 has been announced but there is no hardware yet; if Nvidia had a chip ready, why not use it for their Shield TV products like they did before?

                So yeah, Nvidia tried, and Nintendo took their two-year-old hardware (again, the 2015 Shield TV already had a Tegra X1) and made it into a best-selling game console.

                The Tegra chips are quite exciting, as they are ARM chips with strong GPU performance, but you are overestimating the role Nvidia plays in the Switch's success. If it weren't for Nintendo, the hardware wouldn't sell nearly as much, which was already evident when Nvidia tried their hand at the market themselves. If AMD made the next Switch, I doubt anyone would care about the hardware change; just about every one of Nintendo's handhelds has had great sales numbers, even those that were off to a rocky start at first (e.g. the 3DS).



                • #88
                  Originally posted by tenchrio View Post
                  Isn't the Switch's success more on Nintendo?
                  I'd say it's 100% based on Nintendo and their choices.
                  Their latest choice was to partner with Nvidia, and that kept them competitive through two generations of offerings from Sony and Microsoft, absolutely demolishing the PS4.



                  • #89
                    Originally posted by mSparks View Post

                    I'd say it's 100% based on Nintendo and their choices.
                    Their latest choice was to partner with Nvidia, and that kept them competitive through two generations of offerings from Sony and Microsoft, absolutely demolishing the PS4.
                    Whereas I'd say it had nothing to do with the GPU vendor chosen and everything to do with it being a uniquely designed dockable portable with detachable controls for on-the-go local multiplayer, and a big library of exclusive titles, including franchises that are especially well-suited to local multiplayer like Super Smash Bros.



                    • #90
                      Originally posted by ssokolow View Post
                      exclusive titles including franchises that are especially well-suited to local multiplayer like Super Smash Bros.
                      Which only exist successfully because Nintendo's developer kit is the best, and the vast majority of it is written by Nvidia.
                      AMD's developer kit is at least as buggy for consoles as it is for PC, with Sony and Microsoft having to do most of the work.

