EVGA - Long-Time NVIDIA Partner - Ending Graphics Card Production

  • Originally posted by mdedetrich View Post
    While NVIDIA has a history of making good products, it is also disingenuous to ignore their dark side. I have read from numerous sources that NVIDIA is a complete shit company to deal with (i.e. they can be quite abusive), whereas AMD has historically been more incompetent/short on resources, at least until recently.
    You don't get as big as NVIDIA without trampling on some people. If you watch the Gamers Nexus video and believe what EVGA said, it's not even about the financials (which you are insinuating); it's actually about respect. There is also a bucketload of things NVIDIA is doing even aside from AIBs, e.g. with the massive power-consumption increase of their new 4000 series they are pushing that entire problem onto PSU manufacturers. Seasonic has reportedly said the new 4000 series will have power spikes of 1kW+, which would normally trip a PSU into shutting down on a suspected short circuit, but that is now considered "normal" by NVIDIA and is up to the PSU people to solve. Also note that EVGA was an exclusive NVIDIA partner and GPUs represented ~80% of their business, so the fact that EVGA is doing this is really telling compared to someone like ASUS. They basically decimated their own business because of how bad dealing with NVIDIA was, and that's saying something. It's also pretty public by now that NVIDIA's CEO doesn't think board partners provide any value (they would ideally like to act like Apple and control the whole supply chain, but they can't), which goes back to the respect part.
    That's the general gist of what I got about NVIDIA: they expect everyone else to solve the problems they create, which they have historically gotten away with because they are so big. Things are changing slowly with real competition from AMD, but that takes time. Similar reasons are why Apple moved away from NVIDIA: in one specific generation NVIDIA really screwed up their GPUs and tried to offload all the problems onto Apple, and Apple at the time was already large enough to say "fuk it, we don't want to deal with this shit anymore".
    It looks like RDNA3/the Radeon 7900 XT beats the shit out of NVIDIA on 5nm, with a rumored 256MB of Infinity Cache, 756MB of stacked 3D cache, up to 24GB of VRAM, clock speeds up to 3.5GHz, something like 100 TFLOPS of compute, and matrix/AI cores to calculate FSR 3.0 (similar to DLSS 2.x)...
    and all of that at a 450W TDP, while NVIDIA is trying to go to an 800W TDP...


    • Originally posted by theriddick View Post
      FSR1 was terrible, FSR 2.0 had artifacting, and FSR 2.1 is OK now, but the later DLSS methods still outpace it. FSR2 is available in fewer games atm and has had less time to bake in the oven.

      Between FSR 2.1 and DLSS 2.x, the only technical advantage NVIDIA has is the AI/matrix/FMA cores that calculate it; FSR 2.1 consumes shader compute on the same hardware, and that makes a performance difference of ~7%...

      NVIDIA loses this advantage with RDNA3/the 7900 XT, because then AMD also calculates FSR 3.0 on AI/matrix/FMA cores.
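
      To put rough numbers on that ~7% claim, here is a minimal sketch; the frame-time values below are hypothetical, chosen only to show where such a figure could come from:

      ```python
      # Hypothetical frame-time budget at 4K ~60 fps; these numbers are made
      # up to illustrate the arithmetic, not measured on real hardware.
      frame_ms = 16.7     # total frame time at ~60 fps
      fsr_pass_ms = 1.2   # assumed cost of the FSR 2 pass on the shader ALUs

      # Idealized model: DLSS runs its upscale on dedicated tensor cores, so
      # the shader pipeline loses nothing, while FSR 2 spends shader time.
      overhead = fsr_pass_ms / frame_ms
      print(f"FSR shader-time overhead: {overhead:.1%}")  # ~7.2%
      ```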

      "FSR2 is limited to not as many games atm"

      this is always right because nvidia has some kind of monopole.

      Originally posted by theriddick View Post
      Ray tracing is now in MANY games and works quite well on RTX cards coupled with DLSS 2.x. It seems you've just not played anything with decent RT support. I really don't need to advertise it; there are plenty of videos on YT showing RT on/off and its worth and use...
      I am sure he means something very different. There are real-world studies on whether people can see the difference between RT on and off, and most people cannot; only the very best experts in the group can spot the difference.

      That means the visual difference is so small that most people don't need it.

      The difference is also small because engines currently only use RT for effects that are cheap to compute; the effects that would make a big visual difference are too expensive in calculation power and are not implemented yet.

      Just run a proper study: sit 1000 people in front of a KVM switch, flip between the RT and non-RT versions of a game, and record the results.

      Most people cannot say which version is the RT one.
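
      A minimal sketch of how such a blind test could be scored; the participant counts are hypothetical, and only the Python standard library is used:

      ```python
      from math import comb

      def binom_p_value(correct: int, trials: int, chance: float = 0.5) -> float:
          """One-sided exact binomial test: probability of seeing at least
          `correct` right answers out of `trials` if viewers were guessing."""
          return sum(
              comb(trials, k) * chance**k * (1 - chance)**(trials - k)
              for k in range(correct, trials + 1)
          )

      # Hypothetical outcome: 1000 viewers, 527 correctly identify the RT version.
      p = binom_p_value(527, 1000)
      print(f"p-value = {p:.3f}")  # ~0.047: barely distinguishable from guessing
      ```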


      • Originally posted by coder View Post
        Actually, some people are pretty good at estimating the BOM of a GPU. I remember when Vega launched and some very knowledgeable outsiders figured they must've been selling them either at cost or at a slight loss, because they under-performed and therefore didn't slot into the intended price bracket. Crypto eventually saved the day, but that debacle is why I thought Koduri's departure from AMD might've been a "mutual decision".
        I wish I knew what the price floor was on a 3080 (it uses the same chip as a 3090), because I'd buy one if they went low enough. I don't need it, but it's at times like these that I generally try to do my upgrades.
        I know the Vega 64 underperformed at its release date, but 4-5 years later I can say it aged well, because with FSR the performance of the Vega 64 is good...
        I tested it with many titles, like Cyberpunk or Ashes of the Singularity, at 4K 60Hz.

        Without FSR the Vega 64 has too much shader power sitting unused, because other parts of the chip, like the texture units, are not big enough to keep the pipelines fed... but with FSR the idle shaders get used, and the result is good.
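
        For reference, the reduction in shading work is easy to compute from FSR 2's published per-axis render-scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); a small sketch:

        ```python
        # FSR 2 per-axis render-scale factors, as documented by AMD.
        MODES = {"Quality": 1.5, "Balanced": 1.7,
                 "Performance": 2.0, "Ultra Performance": 3.0}

        out_w, out_h = 3840, 2160  # 4K output

        for mode, scale in MODES.items():
            rw, rh = round(out_w / scale), round(out_h / scale)
            shaded = (rw * rh) / (out_w * out_h)
            print(f"{mode:>17}: renders {rw}x{rh} = {shaded:.0%} of native pixels")
        ```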

        Also, at its 2017 release the compute capability had no big use outside of mining, but today you can use Blender 3.3 with ROCm/HIP.

        About the price floor for a 3080 or other high-end cards like the 6900: I am pretty sure we are at the lowest price point now. EVGA of course also only made this announcement because the market for high-priced GPUs is collapsing right now.


        • Originally posted by user1 View Post
          After my last discussion with him, I'm now more convinced than ever that he's the biggest Nvidia fanboy and shill I've ever met. He goes on to make an excuse for pretty much every single action Nvidia takes, even when the action is obviously wrong and anti-competitive to any person with common sense. And then he tries to prove, as hard as humanly possible, that he "does not defend Nvidia". It's an ugly circus of cognitive dissonance and mental gymnastics.
          Btw, Birdie, I know you BL'd me, but if you're reading this: I researched that HWU controversy a bit more, and it doesn't seem HWU even broke the review guidelines in that specific review. After the backlash from the community and other tech reviewers, Nvidia even made a U-turn on their decision to stop sending HWU GPUs, and wrote them an apology letter.
          So next time, why don't you at least research stuff a bit better if you want to shill for your beloved corp. Otherwise stop lying and STFU.
          This ugly circus of cognitive dissonance from birdie is not only about NVIDIA; he defends everything that is evil: Intel, Microsoft, NVIDIA...

          He supports anything that is a monopoly... He only dislikes Apple because Apple does not have a monopoly, and also because Apple is a threat to Intel, Microsoft, and NVIDIA...

          The joke about him is that he hates to lose, so he bans everyone who becomes a threat and might make him lose. I am banned too.

          I think he has the longest block list any user has ever had on phoronix.com.


          • Originally posted by drakonas777 View Post
            He is definitely biased towards NVIDIA and Intel. He might be a fanboy, but from my subjective point of view I think he is angry at AMD because AMD is not as budget-friendly and cost-effective as it used to be; he despises the idea of AMD becoming a premium brand, I guess LOL
            He also claims he uses Linux 99% of the time, but he always defends Microsoft's actions and loves Windows.


            • Originally posted by theriddick View Post
              FSR1 was terrible, FSR 2.0 had artifacting, and FSR 2.1 is OK now, but the later DLSS methods still outpace it. FSR2 is available in fewer games atm and has had less time to bake in the oven.
              Under Linux, FSR does not require the title to support it, thanks to Valve's Steam Deck and gamescope. Pretty much, if using FSR this way trips an anti-cheat problem under Linux, the game was not going to work under Linux anyhow.
              https://www.ryzencpu.com/2022/02/ste...r-amd-fsr.html. Yes, a title supporting FSR itself can work out lighter on processing than the gamescope route.

              There are methods to force FSR 2.1 onto programs that don't support FSR under Windows as well. The problem is that game anti-cheat systems can hate you for it and ban you, because you used something to make the game use FSR under Windows. The advantage of the Valve route is that third-party scaling (i.e. not from the game developer) is part of the Steam Deck platform, and anti-cheat systems that support the Steam Deck are expected to accept it if they want the extra Valve marketing. On Windows, third-party scaling is not part of the platform, so game anti-cheat systems can stuff you around, and there is no marketing advantage for game developers in supporting third-party scaling on Windows.
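
              As a minimal sketch of the gamescope route: the FSR flag is an assumption that varies between gamescope versions (older builds used -U, newer ones use -F fsr), so check `gamescope --help` on your build.

              ```python
              # Build a gamescope command line that forces FSR upscaling onto a
              # game, regardless of whether the game itself supports FSR.
              render = (1920, 1080)   # internal render resolution
              output = (3840, 2160)   # upscaled output resolution

              args = [
                  "gamescope",
                  "-w", str(render[0]), "-h", str(render[1]),
                  "-W", str(output[0]), "-H", str(output[1]),
                  "-U",                # enable FSR upscaling (older gamescope syntax)
                  "--", "%command%",   # Steam launch-options placeholder for the game
              ]
              # Paste the printed line into the game's Steam launch options.
              print(" ".join(args))
              ```
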
              Last edited by oiaohm; 18 September 2022, 10:15 PM.



              • Originally posted by qarium View Post
                That means the visual difference is so small that most people don't need it.
                Yes, some games make only light use of RT. But it depends on the game: Control, for example, used RT reflections well because it was mostly indoors, where the player could easily see the benefit. The gains shown in the latest Metro remaster are also good, because it uses RT for global illumination.



                • Originally posted by mdedetrich View Post
                  Another video from MLID (a well-known and accurate leaker with contacts at all the major chip companies, including NVIDIA) on the EVGA exit: https://youtu.be/5uK_VUxi5Zo?t=1734.

                  Apparently EVGA had close to 40% of the NVIDIA GPU market share in North America, which is quite massive if true.
                  A known liar who makes stuff up all the time and loves to delete his old videos, which turn out to be blatantly false more often than not. An "insider" with no insider knowledge.

                  A laughing stock on r/AMD, r/Intel, and r/NVIDIA.

                  I knew people on Phoronix had zero information about this whole situation, and it's been proven beyond a reasonable doubt: 11 pages of wild speculation and baseless accusations from absolute no-ones who have never started a single successful business. Meanwhile, NVIDIA is worth $330 billion.

                  Yeah, users of a non-existent pseudo-OS barking at something a million times bigger than all their lives combined. LMAO. What a cringe.



                  • Originally posted by birdie View Post

                    A known liar who makes stuff up all the time and loves to delete his old videos, which turn out to be blatantly false more often than not. An "insider" with no insider knowledge.

                    A laughing stock on r/AMD, r/Intel, and r/NVIDIA.

                    I knew people on Phoronix had zero information about this whole situation, and it's been proven beyond a reasonable doubt: 11 pages of wild speculation and baseless accusations from absolute no-ones who have never started a single successful business. Meanwhile, NVIDIA is worth $330 billion.

                    Yeah, users of a non-existent pseudo-OS barking at something a million times bigger than all their lives combined. LMAO. What a cringe.
                    Well, to be frank, you are asking for something that is impossible to provide, and only someone with zero clue about how business works would say such a stupid thing. Intel used the exact same anti-competitive tactics, and there was no evidence beyond reasonable doubt until they were taken to court, where people are forced to testify under oath and cannot lie or stay silent. The main point is that it was obvious Intel was doing these things before they were taken to court, but at the time no one would go on record saying so outside a courtroom, because doing so would harm them (that's how mafia tactics work: you get put into a catch-22 you cannot get out of).

                    In any case, since it's clear you are a bit of an NVIDIA fanboy: NVIDIA losing their golden boy (EVGA was NVIDIA's best AIB, with the best warranty and the highest market share in Western markets) means that, to be frank, NVIDIA royally fuked up. NVIDIA losing some no-name Chinese AIB would not be a big issue. Losing your best AIB? I am sorry, those kinds of things only happen when, at minimum, both parties fuk up. I am not saying EVGA is totally innocent, but claiming NVIDIA did nothing wrong is extremely naive, the kind of thinking that comes from people who will keep defending something even after it has become quite clear it is not defensible.
                    Last edited by mdedetrich; 19 September 2022, 07:02 AM.



                    • Originally posted by birdie View Post

                      A known liar who makes stuff up all the time and loves to delete his old videos, which turn out to be blatantly false more often than not. An "insider" with no insider knowledge.

                      A laughing stock on r/AMD, r/Intel, and r/NVIDIA.

                      I knew people on Phoronix had zero information about this whole situation, and it's been proven beyond a reasonable doubt: 11 pages of wild speculation and baseless accusations from absolute no-ones who have never started a single successful business. Meanwhile, NVIDIA is worth $330 billion.

                      Yeah, users of a non-existent pseudo-OS barking at something a million times bigger than all their lives combined. LMAO. What a cringe.
                      Except for the fact that the absolute majority of his leaks and speculation have come true, and he is one of the best leakers today, if not the best. So yeah, cry me a river LOL

                      You don't have to be successful in business or software development to make a good argument. Regular illogical nonsense from you.

