
GeForce GTX 1070 Looks Great, At Least Under Windows


  • #31
    Originally posted by hajj_3 View Post
    Some polaris 10 benchmarks leaked yesterday; it is slightly faster than a gtx980/r390, so you wouldn't be getting much of a performance increase over your gtx970. Vega will bring a big advantage though.
    In my opinion, so far there hasn't been any Polaris "benchmark leak" that I would trust.



    • #32
      Originally posted by bug77 View Post
      It fails even at that. Because we've always had high, mid and low end. But Titan is in a league of its own (it retained compute abilities even when consumer cards did not) and is being thrown in there to make it look like the prices have risen. Yes, the most expensive card has become more expensive, but only because a new performance tier is available now.
      Once more, this is only about the high-end, which is what this thread is all about.
      No. The original Titan had more FP64 performance, which was nice, but that's it. And you still lack the software support of the Quadro/Tesla cards. The newer Titan X has no abilities at all that other consumer cards don't have - apart from more memory, of course. AMD mostly offered more memory and never charged a premium for it.

      Originally posted by erendorn View Post
      Well, inflation between 1999 and 2016 says that 400$ back then corresponds to 574$ now. So top-end graphics cards have increased a bit more than inflation, but not that much more (except for the two Titans, but they are clearly outliers). Way less than "top-end phones", for example.
      No, Titans are no outliers; they are the perfectly normal big chip sold at an incredible price, even as salvage(!), starting with Kepler. They are just labelled outliers by marketing to raise prices.
      What would make you think these are outliers from a technical view? It definitely can't be the performance: 20-30% above the 7970 GHz and then below the 290X for the original Titan, on par with the Fury X for the Titan X.
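      erendorn's inflation figure quoted above is easy to sanity-check. A minimal sketch, assuming a constant average annual US inflation rate of about 2.15% over the 17 years from 1999 to 2016 (an illustrative assumption; actual CPI varies year to year):

```python
# Sanity check of the quoted inflation figure: 400$ in 1999 ~ 574$ in 2016.
# The 2.15% average annual rate is an illustrative assumption, not official CPI data.
def adjust_for_inflation(price, years, annual_rate=0.0215):
    """Compound a price forward by a constant annual inflation rate."""
    return price * (1 + annual_rate) ** years

print(round(adjust_for_inflation(400, 2016 - 1999)))  # ~574
```

      A flat ~2.15% per year compounds to roughly the quoted 574$, so the figure is at least internally consistent.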

      The 1080/1070 are not the new high-end cards; those will be the upcoming 1080 Ti and Titan. So you can't really say that prices have not increased until those are out. And it's quite obvious that the 1080 Ti will be priced higher than the 1080, which is already priced above the 980 Ti.

      You do realise which cards you have to compare?
      1080 is the successor of the 980, 680, 560 Ti
      1070 is the successor of the 970, 670, 560
      1080 Ti will be the successor of 980 Ti, 780, 570
      Titan (Pascal) will be the successor of Titan X, Titan Black and 580

      Don't let yourself be fooled by advertising
      Last edited by juno; 30 May 2016, 08:26 AM.



      • #33
        Originally posted by juno View Post
        Once more, this is only about high-end, which this thread is all about.
        No. The original Titan had more FP64 performance, which was nice, but that's it. And you still lack the software support of the Quadro/Tesla cards. The newer Titan X has no abilities at all that other consumer cards don't have - apart from more memory.
        Once more, isn't the 690 high-end? This isn't about the high-end, it's about the high-end that suits you.


        Originally posted by juno View Post
        No, just get it. Titans are no outliers; they are the perfectly normal big chip sold at an incredible price, even as salvage(!), starting with Kepler. The 1080/1070 are not the new high-end cards; those will be the upcoming 1080 Ti and Titan. So you can't really say that prices have not increased until those are out. And it's quite obvious that the 1080 Ti will be priced higher than the 1080, which is already priced above the 980 Ti.
        Titans are outliers. Just like Intel always has a $999 chip on offer, so does Nvidia. This is not high-end; this is how you show that you have something the competition can't touch.
        By your own logic, if Titans are the real high end, making x80/x70 mid-range, why does AMD's mid-range suck so bad in comparison?

        PS If you want to fault Nvidia for something, fault them for this: http://hardocp.com/article/2016/05/2...for_20_minutes
        not for random graphs off the Internet.



        • #34
          Originally posted by bug77 View Post
          Once more, isn't the 690 high-end? This isn't about the high-end, it's about the high-end that suits you.
          The 690 would be a real outlier, as it uses two GPUs. What's your problem with understanding that?
          You do realise that you can go ahead and add the dual-GPU cards to the plot, but that will only make the picture look even worse!?

          Originally posted by bug77 View Post
          Titans are outliers. Just like Intel always has a $999 chip on offer, so does Nvidia. This is not high-end; this is how you show that you have something the competition can't touch.
          By your own logic, if Titans are the real high end, making x80/x70 mid-range, why does AMD's mid-range suck so bad in comparison?

          PS If you want to fault Nvidia for something, fault them for this: http://hardocp.com/article/2016/05/2...for_20_minutes
          not for random graphs off the Internet.
          Well, stop it already. Something "the competition can't touch" is definitely not a ridiculous 20-30% more performance for 100% more money, only to be caught up within the same year by the next card at half the price.

          What exactly sucks, in your opinion? Can you name the cards that suck in comparison to the Nvidia parts? AMD does not have the resources to release a completely new lineup every year; their transition is more incremental and fluid.
          Currently, AMD's high-end chip is Fiji, the mid-range (or, as they call it, "performance") chip is Hawaii, and below that comes Tonga. They compete quite nicely with GM200, GM204 and GM206, so I don't really get your point.

          But nice that you mention Intel. Unlike Nvidia, they currently face practically no competition, and even so they didn't raise their prices anywhere close to this. Here are the prices of the top-end desktop/consumer quad-cores for the last four generations:
          Sandy Bridge (i7-2600k/2700k): 332$
          Ivy Bridge (i7-3770k): 332$
          Haswell (i7-4770k/4790k): 339$
          Skylake: (i7-6700k) 339$

          Oh, you are talking about the biggest ones for consumers?
          Sandy Bridge-E (6C): 999$
          Ivy Bridge-E (6C): 999$
          Haswell-E (8C!): 999$
          Skylake-E: TBA

          And please don't compare Xeons with consumer products. Those compare better to FirePro W/S and Tesla/Quadro.

          I spare you the repeated listing of prices for Nvidia's high-end GPUs for the last four generations.

          Stop being ridiculous already

          Gosh, this makes me look like an AMD fanboy from hell, which I'm not (I prefer AMD, but I already stated that the new Pascal chips are nice; I'm not biased in rating the actual products, just biased by other factors that would make me buy an AMD over an identically performing, identically priced Nvidia card). But I just can't stand wrong "facts" being posted all over the internet, influencing others' opinions. The last example was the "4 GiB HBM is worth more than 6 GiB GDDR5" statements by AMD fans, btw. I just hate bs.
          Last edited by juno; 30 May 2016, 09:05 AM.



          • #35
            Nvidia is pretty good at making money. Demand for their lowest-end / OEM stuff has gone down, but demand for the higher end has gone up. Plus they introduced a whole new price level in the Titan cards that the market responded well to. And filling in those market needs is how you bring in the Benjamins.



            • #36
              Originally posted by juno View Post
              Well, stop it already. Something "the competition can't touch" is definitely not a ridiculous 20-30% more performance for 100% more money
              That's how the game is played at the top. You don't get the last 20% of performance for merely 20% extra cash. In any industry. But you don't understand this, otherwise you'd understand why the Titans and Intel's Extreme Edition CPUs are outliers.
              Originally posted by juno View Post
              only to be caught up within the same year by the next card at half the price.
              AMD never sold anything as fast as Nvidia's counterpart for half the price. And don't say Fury X vs Titan X, because the former's memory will make sure there's stuff where it lags behind. And it didn't launch at half the price of the Titan X.
              Originally posted by juno View Post
              ...
              Sandy Bridge (i7-2600k/2700k): 332$
              Ivy Bridge (i7-3770k): 332$
              Haswell: (i7-4770k/4790k) 339$
              Skylake: (i7-6700k) 339$

              Oh, you are talking about the biggest ones for consumers?
              Sandy Bridge-E (6C): 999$
              Ivy Bridge-E (6C): 999$
              Haswell-E (8C!): 999$
              Skylake-E: TBA
              ...
              That's what your graph shows, too: the Titan pegged at $999 and the next tier rising only slightly once you consider inflation.

              In the end, what's your point? That AMD is trying to sell us cheaper cards, but evil Nvidia is driving prices up?



              • #37
                Originally posted by bug77 View Post
                That's how the game is played at the top. You don't get the last 20% of performance for merely 20% extra cash. In any industry. But you don't understand this, otherwise you'd understand why the Titans and Intel's Extreme Edition CPUs are outliers.
                No, the difference is that I understand what an outlier is and you don't. Titans are not outliers. Intel Extreme Editions are, as there is no competition for them - not currently, and none in sight.
                BTW: the Intel Extreme Edition i7s are for enthusiast consumers; this has nothing to do with the top or the last 20%. That is where the Xeons reside. Besides that, Intel actually effectively lowered the price of their S2011 parts (8C pushed in, 6C pushed down). And that is what normal progression in this industry actually looks like. You do realise that smaller chips are cheaper to produce? With new foundry technology, chips get smaller. If, since the beginning of computing, chips had always become more expensive relative to their performance improvements, nobody today could even afford a PC.

                Originally posted by bug77 View Post
                AMD never sold anything as fast as Nvidia's counterpart for half the price. And don't say Fury X vs Titan X, because the former's memory will make sure there's stuff where it will lag. At it didn't launch at half the price of the Titan X.
                Thanks for proving that you did not read any of my statements. I clearly said Titan, not Titan X. Yes, the 290X's 550$ was not exactly 50% of 1000$, but - really?

                Originally posted by bug77 View Post
                That's what your graph shows, too. Titan pegged at $999 and the next tier only slightly rising if you consider inflation.
                Yeah, rising "slightly" every gen. That's the whole point I made at the beginning, before you escalated the discussion by denying the credibility of this simple chart.

                Originally posted by bug77 View Post
                At the end, what's your point? AMD is trying to sell us cheaper cards, but evil Nvidia is driving them up?
                No, that was your interpretation of the chart; you called it biased and savaged it hard for that. But the funny part is that, in the end, it's true
                Last edited by juno; 30 May 2016, 10:37 AM.



                • #38
                  Originally posted by juno View Post
                  No, Titans are no outliers; they are the perfectly normal big chip sold at an incredible price, even as salvage(!), starting with Kepler. They are just labelled outliers by marketing to raise prices.
                  What would make you think these are outliers from a technical view? It definitely can't be the performance: 20-30% above the 7970 GHz and then below the 290X for the original Titan, on par with the Fury X for the Titan X.

                  The 1080/1070 are not the new high-end cards; those will be the upcoming 1080 Ti and Titan. So you can't really say that prices have not increased until those are out. And it's quite obvious that the 1080 Ti will be priced higher than the 1080, which is already priced above the 980 Ti.
                  Err, any statistical method on this graph would classify the two Titans as outliers. Drawing a trend using these points reminds me of people saying global warming stopped in 1998.
                  I can also draw a trend from 2012 to 2014 which predicts that graphics cards will sell for negative prices soon enough. Or go the other way around (say, 2014 - 2015) and estimate that GPUs will soon be sold for two or three grand.
                  It is certainly possible (quite likely, even) that Nvidia will continue selling cards above 1000$ from time to time.
                  But on this graph, the Titans can be considered either as outliers or as a regime change (the introduction of a new price point); in no way can they be used to support a trend, statistically speaking.
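                  The "any statistical method" claim above can be illustrated with Tukey's IQR rule, a standard outlier test. A rough sketch, using made-up stand-in launch prices (not the chart's actual data) with two Titan-like $999 points:

```python
# Tukey's IQR rule: a value is an outlier if it lies outside
# [Q1 - k*IQR, Q3 + k*IQR], conventionally with k = 1.5.
def iqr_outliers(values, k=1.5):
    """Return the values flagged as outliers by Tukey's rule."""
    s = sorted(values)
    n = len(s)

    def quantile(q):
        # Linear interpolation between the two nearest order statistics.
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return s[lo] + (pos - lo) * (s[hi] - s[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    return [v for v in values if v < q1 - k * iqr or v > q3 + k * iqr]

# Hypothetical flagship launch prices; the two 999s stand in for the Titans.
prices = [400, 500, 500, 550, 650, 550, 650, 999, 999]
print(iqr_outliers(prices))  # [999, 999]
```

                  With the other prices clustered in the 400-650 band, both 999 points fall well outside Q3 + 1.5*IQR, so they would be excluded from any trend fit rather than anchoring it.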




                  • #39
                    Originally posted by juno View Post
                    Yeah, rising "slightly" for every gen. That's the whole point I made at the beginning, before you let the discussion escalate by denying the credibility of this simple chart.
                    I did deny the credibility of the chart, because if you don't read it right, it suggests that Nvidia is constantly raising prices. Whereas, when you account for inflation, the prices have barely risen at all.
                    And I also challenged whether high-end pricing is even relevant without looking at what happens in the mid-range.



                    • #40
                      Originally posted by erendorn View Post
                      Err, any statistical method on this graph would classify the two Titans as outliers. Drawing a trend using these points reminds me of people saying global warming stopped in 1998.
                      I can also draw a trend from 2012 to 2014 which predicts that graphics cards will sell for negative prices soon enough. Or go the other way around (say, 2014 - 2015) and estimate that GPUs will soon be sold for two or three grand.
                      You are viewing this from a different perspective, including only the actual price. Others also include value and state that the Titan is an outlier technically. But that's not true, and I think I made that quite clear. Read your quoted passage again:

                      Originally posted by juno View Post
                      No, Titans are no outliers; they are the perfectly normal big chip sold at an incredible price
                      What would make you think these are outliers from a technical view?
                      Of course I don't disagree that it is an outlier from the pure price perspective.
                      But technically it is a perfectly normal development; it has nothing special that would be unexpected or that other products don't have.

                      Originally posted by erendorn View Post
                      It is certainly possible (quite likely, even) that nvidia will continue selling cards above 1000$ from time to time.
                      But on this graph, the titans can be considered either as outliers or as a regime change (introduction of a new pricepoint), but in no way can they be used to support a trend statistically speaking.
                      It's not just "likely" anymore when the strategy began a few years ago and has already been continued twice, imho. Not selling a >=1k$ Titan in this generation would itself be a paradigm change.
                      It's definitely the "introduction of a new price point"; that's what I tried to make clear all along. But there is just no legitimate reason that would justify it. And this price point affects the lower cards too.

