EVGA - Long-Time NVIDIA Partner - Ending Graphics Card Production


  • Originally posted by mdedetrich View Post

    Well, to be frank, you are asking for something that is impossible to provide, and only someone who has zero clue about how business works would say such a stupid thing. Intel was using the exact same anti-competitive tactics, and there was no evidence beyond reasonable doubt until they got taken to court, where people are forced to say things under oath and cannot lie or choose to say nothing.

    In any case, since it's clear you are a bit of an NVidia fanboy: NVidia losing their golden boy (EVGA was NVidia's best AIB, with the best warranty and the highest market share in western markets) means that, to be frank, NVidia royally fucked up. NVidia losing some no-name Chinese AIB is not a big issue. Losing your best AIB? I am sorry, those kinds of things only happen when both parties fuck up. I am not saying EVGA is totally innocent, however claiming that NVidia did nothing wrong is extremely naive and is the kind of thinking that comes from people who will constantly defend something even when it has become quite clear it's not defensible.
    This news piece is not about anti-competitive tactics. Are you alright, mate? Or are you just here to lambast NVIDIA because Intel was anti-competitive in the early 00s? How does that even relate to NVIDIA?

    I never said NVIDIA had done nothing wrong. And like I've said a dozen times already, it's beyond weird that out of all NVIDIA partners it's only EVGA that is not profitable. Can you stop making stuff up, for Christ's sake? If you know nothing, it's pertinent to STFU rather than join the chorus of haters.

    Also, please do me a favour and rewatch the goddamn video. It made it perfectly clear that this was a single person's decision based on very dubious grounds. End of bloody story. This is not how you run a business.

    The day when ASUS, MSI, Gigabyte, Palit or any other NVIDIA partner comes out and officially confirms NVIDIA has ripped them off on 3080/3090 chip/GDDR memory pricing is the day you can start shitting on the company. OK, amigo? I love to work with proven facts. Phoronix and r/Linux users, on the other hand, love to work with feelings. Look where it's taken Linux in the past 30 years. No-fucking-where. A set of semi-broken packages which people love to call an "operating system". It is not.

    And God, I'm so bloody tired of hearing absolute no-ones say I defend Intel, NVIDIA and Microsoft. Too effing bad not a single one of them has ever shown a single post of mine where I did that.

    If NVIDIA is ready to drop EVGA it means the company will be totally fine. Trust me.



    • A nobody complains about comments from nobodies LOL



      • Originally posted by birdie View Post
        I never said NVIDIA had done nothing wrong. And like I've said a dozen times already, it's beyond weird that out of all NVIDIA partners it's only EVGA that is not profitable. Can you stop making stuff up, for Christ's sake? If you know nothing, it's pertinent to STFU rather than join the chorus of haters.
        Dude, are you dense? It has already been posted multiple times in this thread why it uniquely affected EVGA; let me repeat it for you, since you have issues reading:
        • EVGA ONLY sells NVidia GPUs; they do not sell AMD GPUs. This means that, unlike all of the other AIBs, they cannot easily absorb/reuse costs. For example, since the 3000 series had such ridiculous power/thermal usage, AIBs had to redesign coolers/blowers/boards to handle that. If you make both AMD and NVidia GPUs, you can reuse that effort on your AMD cards to help lower the cost. EVGA couldn't do that; they only sold NVidia GPUs.
        • EVGA had by far the best warranty of any of the AIBs. This is expensive and costs them money, especially if NVidia happens to release a GPU generation that has problems, which the 3000 series did (in the early days there were lots of problems with power delivery because of how much power the 3000 series draws).
        • Most importantly, EVGA makes almost no money from NVidia GPUs. 70% of their revenue is GPUs, but they make more profit from selling PSUs, which are only ~25% of their revenue. That alone shows you how lopsided their profit margins are.
        If I am in EVGA's situation, where the profit margins for graphics cards are so thin (reportedly ~5%), I am not that big a company (unlike ASUS), which means I can't spread my costs as easily, and NVidia is a pain in the ass to deal with because I only get actual pricing/drivers for the product I am building at the last minute and I have to fight a losing battle against Founders Edition cards, then why WOULDN'T I exit? The rough margin math is sketched below.
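        To make the lopsidedness concrete, here is a rough back-of-the-envelope sketch. The 70%/25% revenue split and the ~5% graphics margin come from the points above; the PSU margin used below is a made-up placeholder purely for illustration:

```python
# Rough illustration of why a segment with 70% of revenue can still earn
# less profit than one with 25% of revenue. Only the revenue split and the
# ~5% GPU margin come from the post; the PSU margin is an assumed placeholder.
total_revenue = 100.0                  # arbitrary units

gpu_revenue = 0.70 * total_revenue     # GPUs: ~70% of revenue
psu_revenue = 0.25 * total_revenue     # PSUs: ~25% of revenue

gpu_margin = 0.05                      # reportedly ~5% on graphics cards
psu_margin = 0.20                      # ASSUMED, purely for illustration

gpu_profit = gpu_revenue * gpu_margin  # 70 * 0.05 = 3.5
psu_profit = psu_revenue * psu_margin  # 25 * 0.20 = 5.0

print(f"GPU profit: {gpu_profit:.1f}  PSU profit: {psu_profit:.1f}")

# Any PSU margin above 14% already out-earns the whole GPU business:
breakeven_psu_margin = gpu_profit / psu_revenue   # 3.5 / 25 = 0.14
print(f"PSU margin needed just to match GPUs: {breakeven_psu_margin:.0%}")
```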

        Your entire premise relies on EVGA being stupid, which makes zero sense considering they have been in this business for decades.



        • Originally posted by birdie View Post
          I never said NVIDIA had done nothing wrong. And like I've said a dozen times already, it's beyond weird that out of all NVIDIA partners it's only EVGA that is not profitable. Can you stop making stuff up, for Christ's sake? If you know nothing, it's pertinent to STFU rather than join the chorus of haters.
          Sorry, you are making a mistake here.

          Originally posted by birdie View Post
          The day when ASUS, MSI, Gigabyte, Palit or any other NVIDIA partner comes out and officially confirms NVIDIA has ripped them off on 3080/3090 chip/GDDR memory pricing is the day you can start shitting on the company. OK, amigo?
          You presume wrong; you should have gone and looked at the stock-market reporting. Some of those vendors do report their GPU division's gross profit margins: it averages out to less than 20 percent, and some years it goes negative (as in a loss). Officially confirming anywhere other than in share-market balance-sheet reporting that Nvidia is ripping you off has bad adverse effects. A company in the past happened to state that Nvidia was ripping them off; the result was that the company got no more Nvidia silicon and they moved across to making AMD cards. Please note they officially confirmed it in a shareholder meeting, under direct questioning, where it had to be made into a statement as required by the trading rules; it was not a straight-up leak to the public, and Nvidia still classed that as a breach of the NDA. So the only companies that can officially confirm Nvidia is screwing them directly are companies that have already made up their mind not to sell Nvidia products any more. Other than that, you have to look at share-market financial reporting to see who is taking what.

          Gigabyte, MSI, ASUS... hope to sell you a power supply/motherboard/something else with that GPU. The reality is that if they sell you a 500 dollar motherboard with a 2000 dollar video card, they made more money on the motherboard. On mid-range graphics cards they can make more money selling you a power supply that is made by someone else and rebranded.

          It's horrible, but a lot of Nvidia AIBs are treating Nvidia GPUs as loss leaders to sell something else, and even then, if Nvidia alters the MSRP, they can end up behind. Remember that Nvidia sets the base MSRP for all AIBs and even sets how much markup on top of that is allowed; that was documented in that one shareholder meeting, and we have no reason to think it has changed. The loss-leader arithmetic is sketched below.
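          To put rough numbers on the loss-leader point: the 500 dollar motherboard and 2000 dollar card prices come from above, but both margin figures below are assumed placeholders, not reported numbers:

```python
# Sketch of the loss-leader claim: a cheap motherboard at a healthy margin
# can out-earn an expensive GPU at a thin, MSRP-capped margin.
# Prices come from the post; BOTH margins are assumed for illustration only.
motherboard_price = 500.0
gpu_price = 2000.0

motherboard_margin = 0.30   # assumed margin on a high-end motherboard
gpu_margin = 0.05           # assumed thin margin on the graphics card

motherboard_profit = motherboard_price * motherboard_margin   # 150.0
gpu_profit = gpu_price * gpu_margin                           # 100.0

print(f"Motherboard profit: ${motherboard_profit:.0f}")
print(f"GPU profit:         ${gpu_profit:.0f}")
# Despite costing a quarter as much, the motherboard earns more profit here,
# which is why the GPU can be treated as the loss leader in the bundle.
```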



          • Originally posted by Quackdoc View Post

            You have absolutely no idea what you are talking about...

            First of all, I don't care what you think about HEVC. I care about what I need. We don't all live in a world where what someone thinks outweighs reality.

            Second of all, Nvidia is supporting their 4-year-old cards perfectly fine, and so is Intel; AMD is the outlier here.

            Third of all, when RHEL comes out and says they will officially support Polaris in a sufficient manner, then it will be fine. As I said, an official statement is absolutely necessary; I live in the real world, not one where a commit here and there matters.

            Fourth, Polaris STILL doesn't support VK_EXT_image_drm_format_modifier on either RADV or AMDVLK. There were a few other extensions too, if I am remembering right, that other cards of the era support.

            Fifth, it's not.

            Sixth, there are plenty of other gripes with AMD: things like how long it took to get ray tracing on Linux, the overall lack of a compute ecosystem, a lower performance ceiling, and I still have power-efficiency issues on all of my AMD cards on Linux.

            Seventh, yes, a 4-year-old card won't make them more money, but neither will kicking the loyal customers who have supported them through garbage like the Radeon 200 and 300 series, the same customers who buy their crap like the HD GPUs they put out for the ultra low end.
            So, should I be replacing all my RAM with newer versions every 4 years, say? New CPU, new MB, new RAM, new graphics card, new SSD... all on a 4-year lifespan?



            • Originally posted by theriddick View Post
              Yes, some games use very little power for RT, but it depends on the game. For example, Control used reflections well because it was mostly indoors, where the player could easily see the benefits. The benefits shown in the latest Metro remaster are also good, because it uses RT for global illumination.
              I am talking about this video: https://youtu.be/2VGwHoSrIEU

              Most of the games that claim to use ray tracing show no benefit from activating it at all, meaning you get lower FPS without better visuals.



              • Originally posted by mdedetrich View Post
                In any case, since it's clear you are a bit of an NVidia fanboy: NVidia losing their golden boy (EVGA was NVidia's best AIB, with the best warranty and the highest market share in western markets) means that, to be frank, NVidia royally fucked up. NVidia losing some no-name Chinese AIB is not a big issue. Losing your best AIB? I am sorry, those kinds of things only happen when, at minimum, both parties fuck up. I am not saying EVGA is totally innocent, however claiming that NVidia did nothing wrong is extremely naive and is the kind of thinking that comes from people who will constantly defend something even when it has become quite clear it's not defensible.
                In my view, EVGA built up a brand name by always producing the best GPU cards around...

                And I am 100% sure RDNA3 is so fast and cheap that they will plainly and simply switch to AMD, and before that can happen they need to quit Nvidia first.



                • Originally posted by drakonas777 View Post
                  You don't have to be successful in business or software development to make a good argument. Regular illogical nonsense from you.
                  The joke is that we also have many successful business people in this forum... I, for example, ran a computer company for 8 years.

                  He claims stuff like: if you make a GPU for 500 and you sell it for 1000, then you have 100% profit?...

                  He does not understand the costs of doing business: advertisement costs you an arm and a leg,
                  you have to hold back money for the case that people have broken hardware and want a repair or replacement years later,
                  and you have to hold back money in case people sue you for whatever reason...

                  I know companies that always sold their stuff at a minimum of 50% above what they bought it for, and they still went bankrupt,
                  because the costs of doing business are much higher than that. A rough sketch of that arithmetic is below.
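                  Why a 100% markup is not a 100% profit margin, and why even a 50% markup can end in a loss: the 500/1000 example comes from above, but every cost figure below is a made-up placeholder, just to show the arithmetic:

```python
# Markup vs. actual profit. The 500 cost / 1000 price example comes from the
# post; every operating-cost line below is an ASSUMED placeholder used only
# to illustrate why a high markup does not guarantee profit.
unit_cost = 500.0
sale_price = 1000.0

gross_profit = sale_price - unit_cost        # 500
markup_on_cost = gross_profit / unit_cost    # 1.00 -> "100% markup"
gross_margin = gross_profit / sale_price     # 0.50 -> only 50% of the price

# Hypothetical per-unit costs of doing business:
advertising = 150.0
warranty_reserve = 120.0   # repairs/replacements years later
legal_reserve = 80.0       # lawsuits, compliance
overhead = 200.0           # staff, rent, logistics

net_profit = gross_profit - (advertising + warranty_reserve +
                             legal_reserve + overhead)

print(f"Markup on cost: {markup_on_cost:.0%}")   # 100%
print(f"Gross margin:   {gross_margin:.0%}")     # 50%
print(f"Net per unit:   {net_profit:+.0f}")      # -50: a loss despite the markup
```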




                  • Originally posted by birdie View Post
                    Look where it's taken Linux in the past 30 years. No-fucking-where. A set of semi-broken packages which people love to call an "operating system". It is not.
                    Red Hat, over the last 30 years, went from a hobby, to doing millions, to a billions-of-dollars business...

                    The statistic depicts the global annual revenue of the open-source software company Red Hat Inc from fiscal year 2009 to 2019.

                    This report helps you learn the RED HAT INC Revenue History by year as well as by each quarter. You will find useful sections including yearly and quarterly revenue charts and the highest and lowest quarters.

                    As part of that major milestone we asked Red Hatters who have been using or contributing to Linux since the early days about their experiences. Today we’re talking to Richard Jones who has been using Linux since the early 1990s, joining Red Hat in 2007. Richard is now a Senior Principal Software Engineer in Red Hat’s R&D Platform team.


                    This means you are a complete liar. In the last 30 years Linux went from a hobby to a millions-of-dollars and then a billions-of-dollars industry.



                    • Originally posted by lsatenstein View Post
                      So, should I be replacing all my RAM with newer versions every 4 years, say? New CPU, new MB, new RAM, new graphics card, new SSD... all on a 4-year lifespan?
                      Most customers bought the 480 or 580, and those cards had a much longer support lifespan... it is only the pipe-cleaner card for 12nm, the RX 590, that had only 4 years of support... but compared to the 480/580, the 590 was sold only in small quantities.

                      There are also some technical reasons: the 480/580 did not use tile-based rendering, and Vega 64 was the first card with tile-based rendering.

                      Also, Vega cards had larger shader compute parts than they needed relative to their texture units; that shader power was put in back then because they planned to implement FSR 1/2 for Vega. FSR also benefits from FP16 (16-bit floating point), and the 480/580 only have FP32...

                      For AMD it is complicated to maintain an old driver for the 480/580, with only FP32 support and no FP16 support, the lack of shader compute power for FSR, and no tile-based rendering, alongside Vega and RDNA/RDNA2 with tile-based rendering...


