NVIDIA, Intel Post New Windows 10 Graphics Drivers For WSL2 Linux App Support

  • #21
    Originally posted by omer666 View Post
    They were so lame that Nvidia bought them for 55 million dollars in order to use their patents and learn how to make a decent GPU.
    Actually, back then Nvidia pulled a really lousy trick by claiming true-color support (3dfx only did hi-color), even though their cards didn't actually have the muscle to render at decent frame rates in true color. Buyers were all like "16M colors must surely be better than 65k colors" and mostly bought Nvidia. That's what put 3dfx under.
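
    For context: the hi-color vs. true-color gap is just bits per pixel, and the frame-rate cost follows from the extra memory traffic. A minimal C sketch of the two packings - illustrative only, not any vendor's actual code:

        #include <stdint.h>
        #include <stdio.h>

        /* 16-bit "hi-color" (RGB565): 5+6+5 bits -> 65,536 colors. */
        static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
        {
            return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
        }

        /* 32-bit "true color" (ARGB8888): 8 bits per channel -> 16,777,216
         * colors, plus 8 bits of transparency (the "24bpp + alpha" case). */
        static uint32_t pack_argb8888(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
        {
            return ((uint32_t)a << 24) | ((uint32_t)r << 16)
                 | ((uint32_t)g << 8)  |  (uint32_t)b;
        }

        int main(void)
        {
            printf("16bpp: %u colors\n", 1u << 16);   /* 65,536     */
            printf("24bpp: %u colors\n", 1u << 24);   /* 16,777,216 */
            printf("white, RGB565:   0x%04X\n", (unsigned)pack_rgb565(255, 255, 255));
            printf("white, ARGB8888: 0x%08X\n", (unsigned)pack_argb8888(255, 255, 255, 255));
            return 0;
        }

    Every true-color pixel moves twice the framebuffer data of a hi-color one, which is exactly the bandwidth those early cards didn't have.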



    • #22
      Aahhh... "developers developers developers..."



      • #23
        Originally posted by omer666 View Post

        I really am not convinced at all. Back in the Riva days, 3dfx was the clear winner.
        It's when the GeForce 256 and the Radeon got released supporting T&L (and 3dfx failed to retaliate) that things changed.
        But according to Tim Sweeney himself, 3dfx still had a higher texture rate than the GeForce 256, at least - but that's another story.
        Anyway, with the GeForce 2 and 3, Nvidia caught up with ATI, but did not quite match the Radeon 8500.
        Then the GeForce 4 got crushed by the Radeon 9000 series. That's when they decided to buy 3dfx and release the "FX" series. But the poor FX 5800 still got crushed by ATI's Radeon 9800, which was a freaking monster. At that time, Radeon GPUs already had optimised tessellation and FSAA.
        I remember when one of the PC magazines (PC Gamer maybe - it's been years) ran the Voodoo Banshee as its cover story, claiming it was going to be 32-bit this and that, with AGP, SBA support and the like. When the product launched, everything they had highlighted on the cover turned out to be a lie.

        NVIDIA had single-chip solutions before 3Dfx. The Riva 128 had anisotropic filtering long before 3Dfx ever did, as well as 32-bit colour (24bpp + transparency). The GF256 had much better throughput overall than the Voodoo cards of the time, with faster RAM, better precision, and far better power efficiency. And at the risk of countering your unverified statement, the GF256 was up to 50% faster than the Voodoo 3 3500 according to Wikipedia's references. NVIDIA shipped standard OpenGL support starting with the Riva cards, whereas 3Dfx had their custom miniGL-to-Glide implementation that needed special game support.

        Just FYI: the Voodoo 2 was the only card that was faster than NVIDIA's chips of the time, but the Riva 128 surpassed it shortly after, and every NVIDIA card since has stayed ahead. Wikipedia has a whole list of references detailing that.
        Last edited by Giovanni Fabbro; 17 June 2020, 06:00 PM.



        • #24
          Originally posted by omer666 View Post

          I really am not convinced at all. Back in the Riva days, 3dfx was the clear winner.
          It's when the GeForce 256 and the Radeon got released supporting T&L (and 3dfx failed to retaliate) that things changed.
          But according to Tim Sweeney himself, 3dfx still had a higher texture rate than the GeForce 256, at least - but that's another story.
          Anyway, with the GeForce 2 and 3, Nvidia caught up with ATI, but did not quite match the Radeon 8500.
          Then the GeForce 4 got crushed by the Radeon 9000 series. That's when they decided to buy 3dfx and release the "FX" series. But the poor FX 5800 still got crushed by ATI's Radeon 9800, which was a freaking monster. At that time, Radeon GPUs already had optimised tessellation and FSAA.
          Yes!! Thank you! I was starting to think everybody had revisionist memory until your post. At least somebody has a sane memory.



          • #25
            Originally posted by Giovanni Fabbro View Post

            I remember when one of the PC magazines (PC Gamer maybe - it's been years) ran the Voodoo Banshee as its cover story, claiming it was going to be 32-bit this and that, with AGP, SBA support and the like. When the product launched, everything they had highlighted on the cover turned out to be a lie.

            NVIDIA had single-chip solutions before 3Dfx. The Riva 128 had anisotropic filtering long before 3Dfx ever did, as well as 32-bit colour (24bpp + transparency). The GF256 had much better throughput overall than the Voodoo cards of the time, with faster RAM, better precision, and far better power efficiency. And at the risk of countering your unverified statement, the GF256 was up to 50% faster than the Voodoo 3 3500 according to Wikipedia's references. NVIDIA shipped standard OpenGL support starting with the Riva cards, whereas 3Dfx had their custom miniGL-to-Glide implementation that needed special game support.

            Just FYI: the Voodoo 2 was the only card that was faster than NVIDIA's chips of the time, but the Riva 128 surpassed it shortly after, and every NVIDIA card since has stayed ahead. Wikipedia has a whole list of references detailing that.
            Calling any of nVidia's OpenGL "standard" is flat out wrong. It's just wrong.



            • #26
              Originally posted by Giovanni Fabbro View Post

              I remember when one of the PC magazines (PC Gamer maybe - it's been years) ran the Voodoo Banshee as its cover story, claiming it was going to be 32-bit this and that, with AGP, SBA support and the like. When the product launched, everything they had highlighted on the cover turned out to be a lie.

              NVIDIA had single-chip solutions before 3Dfx. The Riva 128 had anisotropic filtering long before 3Dfx ever did, as well as 32-bit colour (24bpp + transparency). The GF256 had much better throughput overall than the Voodoo cards of the time, with faster RAM, better precision, and far better power efficiency. And at the risk of countering your unverified statement, the GF256 was up to 50% faster than the Voodoo 3 3500 according to Wikipedia's references. NVIDIA shipped standard OpenGL support starting with the Riva cards, whereas 3Dfx had their custom miniGL-to-Glide implementation that needed special game support.
              Well, my memory might be playing tricks on me, but I'm pretty sure the Riva 128 wasn't up to Voodoo 2 performance. If we're only talking single-card performance, it's true that Nvidia took the lead with the TNT, but SLI still allowed 3dfx to beat them in many cases.
              OpenGL performance from Nvidia has always been stellar, no doubt about that. But at the time, many more games used Glide.
              Last edited by omer666; 17 June 2020, 06:12 PM.



              • #27
                Originally posted by duby229 View Post

                Calling any of nVidia's OpenGL "standard" is flat out wrong. It's just wrong.
                Back when OpenGL was just a pipe-dream, yes, NVIDIA had the most complete hardware support for it. 3Dfx, by contrast, required games to go through its "miniGL" layer on top of Glide. Until then, OpenGL was mostly relegated to CPU-only implementations and professional graphics applications. Before Quake, there wasn't really any 3D hardware rendering in games.
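
                The miniGL idea is easy to sketch: ship a driver that exports only the handful of GL entry points a specific game (GLQuake, say) actually calls, and translate each one to the vendor layer underneath. A hypothetical C sketch - the glide_* functions are stand-ins, not the real Glide API:

                    #include <stdio.h>

                    /* Minimal GL types so the sketch is self-contained. */
                    typedef unsigned int GLenum;
                    typedef float        GLfloat;

                    /* Hypothetical stand-ins for the vendor layer (the real Glide API differed). */
                    static void glide_queue_vertex(float x, float y, float z)
                    {
                        printf("vendor layer got vertex (%.1f, %.1f, %.1f)\n", x, y, z);
                    }

                    static void glide_flush_triangles(void)
                    {
                        printf("vendor layer drew the batched triangles\n");
                    }

                    /* A miniGL driver exports real GL symbol names, but only the ones the
                     * target game uses; every other GL function is simply absent, which is
                     * why miniGL drivers only worked with specific titles. */
                    void glBegin(GLenum mode)                        { (void)mode; }
                    void glVertex3f(GLfloat x, GLfloat y, GLfloat z) { glide_queue_vertex(x, y, z); }
                    void glEnd(void)                                 { glide_flush_triangles(); }

                    int main(void)   /* what a GLQuake-style draw looks like to the shim */
                    {
                        glBegin(4 /* GL_TRIANGLES */);
                        glVertex3f(0.0f, 0.0f, 0.0f);
                        glVertex3f(1.0f, 0.0f, 0.0f);
                        glVertex3f(0.0f, 1.0f, 0.0f);
                        glEnd();
                        return 0;
                    }

                A full OpenGL driver has to cover the whole specification; the game-specific shim above is the gap between the two approaches.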



                • #28
                  Originally posted by omer666 View Post

                  Well, my memory might be playing tricks on me, but I'm pretty sure the Riva 128 wasn't up to Voodoo 2 performance. If we're only talking single-card performance, it's true that Nvidia took the lead with the TNT, but SLI still allowed 3dfx to beat them in many cases.
                  OpenGL performance from Nvidia has always been stellar, no doubt about that. But at the time, many more games used Glide.
                  Yeah, the Voodoo 2 was the only one - I remembered that and added it to the original comment, but it wasn't the top card for long. Quake supported hardware OpenGL and 32-bit textures via GLQuake pretty much from the start, so once the R128 came out, the Voodoo 2 was done. 3Dfx just couldn't keep up after that, and with hardware limitations leaving their cards several generations behind the competition, the company had a very short heyday. ATI wasn't really a player until 3Dfx dropped off the map. Their Rage 128-based stuff was just plain junk compared to 3Dfx and NVIDIA, as were Rendition's and Matrox's offerings.

                  I actually think NVIDIA bought 3Dfx primarily for the SLI patents, so they could force ATI to pay royalties to license the technique. I'm not sure ATI ever did, but they no doubt had to spend a whopping amount on R&D for CrossFire in order to compete.
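
                  Worth spelling out what 3dfx's SLI actually meant: "scan-line interleave", with each card rasterizing alternate lines of the frame. A toy C sketch of the split, illustrative only:

                      #include <stdio.h>

                      #define NUM_CARDS 2
                      #define HEIGHT    480

                      /* Stand-in for handing one scanline to one card's rasterizer. */
                      static void rasterize_line(int card, int y)
                      {
                          (void)card; (void)y;   /* card draws scanline y into its framebuffer */
                      }

                      int main(void)
                      {
                          /* Scan-line interleave: card 0 takes even lines, card 1 odd lines,
                           * roughly halving each chip's fill-rate burden per frame. */
                          for (int y = 0; y < HEIGHT; y++)
                              rasterize_line(y % NUM_CARDS, y);

                          printf("frame split across %d cards, %d lines each\n",
                                 NUM_CARDS, HEIGHT / NUM_CARDS);
                          return 0;
                      }

                  NVIDIA later reused the acronym for "Scalable Link Interface", which divides work by whole frames or screen regions rather than by scanline.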



                  • #29
                    Originally posted by Giovanni Fabbro View Post

                    Back when OpenGL was just a pipe-dream, yes, NVIDIA had the most complete hardware support for it. 3Dfx, by contrast, required games to go through its "miniGL" layer on top of Glide. Until then, OpenGL was mostly relegated to CPU-only implementations and professional graphics applications. Before Quake, there wasn't really any 3D hardware rendering in games.
                    But I think you're failing to mention that the early versions of OpenGL were intended for things like CAD. It wasn't until much later that it was optimized for gaming. And I'm at least reasonably certain Glide support in games came first.



                    • #30
                      Originally posted by Giovanni Fabbro View Post

                      Yeah, the Voodoo 2 was the only one - I remembered that and added it to the original comment, but it wasn't the top card for long. Quake supported hardware OpenGL and 32-bit textures via GLQuake pretty much from the start, so once the R128 came out, the Voodoo 2 was done. 3Dfx just couldn't keep up after that, and with hardware limitations leaving their cards several generations behind the competition, the company had a very short heyday. ATI wasn't really a player until 3Dfx dropped off the map. Their Rage 128-based stuff was just plain junk compared to 3Dfx and NVIDIA, as were Rendition's and Matrox's offerings.

                      I actually think NVIDIA bought 3Dfx primarily for the SLI patents, so they could force ATI to pay royalties to license the technique. I'm not sure ATI ever did, but they no doubt had to spend a whopping amount on R&D for CrossFire in order to compete.
                      Revisionist crap....
