
AMD tops Nvidia in graphics chip shipments


  • AMD tops Nvidia in graphics chip shipments

    Interesting news: AMD shipped more graphics cards in the second quarter than Nvidia.

    http://news.cnet.com/8301-13924_3-20012025-64.html

  • #2
    Charlie's commentary, though hyperbolic, is largely correct IMHO.



    • #3
      Originally posted by krazy View Post
      Charlie's commentary, though hyperbolic, is largely correct IMHO.
      I certainly agree that Nvidia's problems are their own creation -- Fermi is a lousy design for a GPU -- but I can't really agree with his conclusions. Integrated graphics killed the low end of the consumer market long ago and the margins are at the high end of the market... Nvidia's real problem is that their margins aren't high because their design is extremely inefficient compared to ATI's; in the benchmarks I've seen ATI seems to get about 2x the performance per transistor.

      DX11 is pretty much irrelevant to most of the market; I'm not aware of any game that won't run on a DX9 card, let alone one that requires DX11.



      • #4
        The thing with integrated graphics is that it is moving from low-end into middle-ground, leaving discrete to enthusiasts ONLY.

        The reason why Nvidia is the only one being affected by all of this is that they are strongly tied to discrete, whereas AMD and Intel together have the integrated segment of the market cornered. Yes, I realize that Nvidia has some IGP products; the problem is that there is no incentive to buy them over the "platform" solutions.

        In other words, why would you want to pay an extra $100-$500 for a discrete nvidia card when your needs are met 100% by an integrated radeon?



        • #5
          Originally posted by droidhacker View Post
          The thing with integrated graphics is that it is moving from low-end into middle-ground, leaving discrete to enthusiasts ONLY.
          Nvidia's chipsets have never had a good reputation though the Ion seems to be OK; I think they'd need to do a lot of work to convince manufacturers to use them. I'm not aware of Intel producing anything but the lowest of the low-end of performance in their integrated graphics chips, which are in the vast majority of PCs... ATI chipsets are largely irrelevant so long as most PCs use Intel CPUs.

          Plus the profit margins are probably minute on the integrated chips; so while it's nice to be shipping 100,000,000 integrated graphics chips you'll probably make the same profit by shipping 1,000,000 'enthusiast' boards. Unless you're forced to push margins down to get sales because your architecture is less efficient than the competition...



          • #6
            I think Nvidia wants to create a competitor to Atom. They bought the LongRun technology from Transmeta and hired former Transmeta devs. Maybe they'll have more luck next year with Tegra-powered smartbooks running Android/Google OS. Of course, for office PCs the market for dedicated graphics solutions is basically over. Only gamers need them, and there Nvidia had only very expensive models with DX11 support. This year ATI had to win, but next year is open.



            • #7
              Originally posted by Kano View Post
              I think Nvidia wants to create a competitor to Atom. They bought the LongRun technology from Transmeta and hired former Transmeta devs. Maybe they'll have more luck next year with Tegra-powered smartbooks running Android/Google OS. Of course, for office PCs the market for dedicated graphics solutions is basically over. Only gamers need them, and there Nvidia had only very expensive models with DX11 support. This year ATI had to win, but next year is open.
              Open for what? Next year Nvidia goes bankrupt ;-)

              A 'Transmeta'-style CPU is nice, but it had no chance (in the past).

              A CPU from a closed-source company like Nvidia can't be successful.



              • #8
                Tegra (2) has a licensed ARM core. It also supports video acceleration (even Flash 10.1 support for Android!). From the specs it's a very nice CPU. Of course an x86 CPU would be even more interesting.



                • #9
                  Originally posted by Kano View Post
                  Tegra (2) has a licensed ARM core. It also supports video acceleration (even Flash 10.1 support for Android!). From the specs it's a very nice CPU. Of course an x86 CPU would be even more interesting.
                  Nvidia does have a native x86 CPU ;-) but it's an Intel-compatible 386 CPU, with no FPU, no SSE, and no out-of-order architecture.

                  No, I'm not making fun of you... it's an embedded CPU Nvidia got by buying the company 'ALi': the M6117C, a 386SX.

                  I'm waiting for a system with an Nvidia 386SX + GTX 480, LOL...



                  • #10
                    Originally posted by movieman View Post
                    Nvidia's chipsets have never had a good reputation though the Ion seems to be OK; I think they'd need to do a lot of work to convince manufacturers to use them. I'm not aware of Intel producing anything but the lowest of the low-end of performance in their integrated graphics chips, which are in the vast majority of PCs... ATI chipsets are largely irrelevant so long as most PCs use Intel CPUs.
                    Is 30% of the market irrelevant?
                    http://www.cpubenchmark.net/market_share.html

                    They may be smaller than Intel, but that doesn't mean they're small.

                    Plus the profit margins are probably minute on the integrated chips; so while it's nice to be shipping 100,000,000 integrated graphics chips you'll probably make the same profit by shipping 1,000,000 'enthusiast' boards. Unless you're forced to push margins down to get sales because your architecture is less efficient than the competition...
                    You've got it backwards. There's hardly any margin on high-end boards. It all goes into reclaiming the cost of development. The per-unit cost of an IGP is negligible -- the margin is huge there because there is no R&D required to keep punching out old stuff!

                    Works like this:
                    You spend (for argument's sake) $10 million in R&D for a new chip. The actual production cost is $10 each. You sell them for $510 each, so each chip pays $500 off the DEBT created in development. (Yeah, yeah, there's interest on the debt too; let's just ignore that for now.) When you've sold 20,000 units, you've paid off the entire debt, and you drop the price from $510 down to... $50. Suddenly you go from netting $0 per unit to $40, and you're selling 100x as many units!

                    Now obviously I simplified certain parts of the picture for convenience's sake. In the real world, you don't clear the R&D debt suddenly like that and drop the price overnight; it's more of a gradual price reduction as the R&D gets paid off. Nor is the R&D just a fixed number like this -- it's more like a constant R&D expense which you're paying off with a declining pricing structure.

                    The key to remember is that they ARE getting a good net on the low-end parts!
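                    As a quick sketch of that break-even arithmetic (all numbers are the hypothetical ones from this post, not real figures):

```python
# Toy model of R&D amortization from the post above: the one-time
# R&D spend is treated as a debt paid down by the per-unit margin.
rnd_cost = 10_000_000    # hypothetical one-time R&D spend
unit_cost = 10           # production cost per chip
launch_price = 510       # price while the R&D debt is outstanding
mature_price = 50        # price once the R&D is paid off

# Each launch-priced unit retires (launch_price - unit_cost) of debt.
units_to_break_even = rnd_cost // (launch_price - unit_cost)

# After break-even the whole price-over-cost spread is net profit.
net_per_unit_after = mature_price - unit_cost

print(units_to_break_even)  # 20000 units pay off the whole $10M
print(net_per_unit_after)   # $40 net per unit at the mature price
```

                    With those numbers, 20,000 launch-priced units retire the entire $10M, after which even a $50 part nets $40 over its $10 production cost.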



                    • #11
                      Originally posted by droidhacker View Post
                      Is 30% of the market irrelevant?
                      http://www.cpubenchmark.net/market_share.html

                      They may be smaller than Intel, but that doesn't mean they're small.
                      Ah.... 2005-2006. AMD really had a shot at flipping that graph in 2006 with their AMD64 line. Nostalgia.

                      On the discrete note, Fermi isn't that bad of an architecture; it just took too long to get here. I have a 470 which can chew up anything I throw at it, and contrary to what most people told me (including the sales associate who sold me the card - he tried to talk me out of it), it doesn't run supernova on air cooling.

                      It's just all too little, too late. I was already well sick of waiting for the cards to debut months before they hit shelves, and while I was enthusiastic about getting some new hardware, it kind of fell off the radar until the day I walked by it on the shelf. So while it out-benches the 5000 series in tessellation, it was just another upgrade: "w00t, I can plug two more monitors in!", not "w00t, new card!".

                      I think NVIDIA needs to break back into the chipset business. Regardless of the flaming, ION is a pretty successful platform, and integrated NVIDIA chipsets were not (always) total sh!t. I think it is going to be hard for the company to break back into AMD and Intel chipsets -- Intel + NVIDIA is a bittersweet marriage between two companies which really don't like each other but need to compete with the company that now does both (AMD), rather than a real drive to get the competitive edge.

                      NVIDIA needs one of the following:
                      -Come out with an AmazingCard™ that blows everyone away before any competitor can react, let up on the rebranding
                      -Somehow, get back onto AMD and Intel boards. Buy Lucid or something, find a way in. I guarantee AMD + Intel are holding that door shut.
                      -Get their own x86 action going on, and graduate from a graphics chipset manufacturer to a chipset manufacturer.

                      I think #3 is their long-term ticket to success, if they can release something well designed and competitive. Ultimately it would contribute to the stagnation of chip progress (x86 blows chunks), but it would probably save the company from the Voodoo fate. #2 would also work well if they can get AMD + Intel to open the floodgates, but #1 isn't happening with their current business model. Their life raft right now is the 460.

                      Maybe something else will happen, I don't know. Maybe OpenGL and DirectX will die in favor of raytracing, and then it won't matter how much anyone rebrands.



                      • #12
                        Originally posted by droidhacker View Post
                        You've got it backwards. There's hardly any margin on high-end boards. It all goes into reclaiming the cost of development. The per-unit cost of an IGP is negligible -- the margin is huge there because there is no R&D required to keep punching out old stuff!
                        Admittedly it was a few years ago, but when I was working in the graphics industry the margin on high-end boards was huge whereas the margin on low-end boards was so low that people would spend weeks figuring out how to save a few cents on component costs.



                        • #13
                          Originally posted by kazetsukai View Post
                          I think NVIDIA needs to break back into the chipset business. Regardless of the flaming, ION is a pretty successful platform and integrated NVIDIA chipsets were not (always) total sh!t.
                          Ion is great for a low-power system like my HTPC. The problem as I see it is Intel refusing to let Nvidia hook their Ion chipsets up to the new Atoms (I believe that's still the case?).

                          I think Nvidia's long-term problem is the decline of the PC gaming market; if most PC gamers are playing Farmville rather than 'Super Whizzbang Shooter 23' that requires the world's fastest graphics card, then there's no need for a gaming card. Of course if most people are playing Farmville rather than SWS23, then Microsoft are going to be in big trouble too, since Farmville runs on any OS with a Flash plugin.



                          • #14
                            I remember the nForce2 being a very good and successful chipset back in the good old days before cpu makers stopped licensing their proprietary bus interfaces. It's too bad there is no chance of more competition in the chipset market right now.
                            Ah.... 2005-2006. AMD really had a shot at flipping that graph in 2006 with their AMD64 line.
                            That could have happened if it wasn't for intel's anti-competitive actions.



                            • #15
                              Originally posted by devius View Post
                              That could have happened if it wasn't for intel's anti-competitive actions.
                              As far as I remember, AMD's fabs couldn't possibly make enough chips to supply 70% of the market at that time?

