GeForce GTX 1070 Looks Great, At Least Under Windows


  • Michael
    replied
Just placed an order for the GTX 1080, as the card I was told last week I would have from NVIDIA apparently didn't pan out...



  • juno
    replied
    Originally posted by johnc View Post
    SoC / LP 16nm FF is a completely different process than what is used for the GPUs. Yes, the GPU companies share the R&D burden and re-tooling costs. Do you really think 16nm FF+ is cheaper for nvidia than 28nm was for Maxwell?
16FF+ is just a rather slight advancement over 16FF, and you know that. 16FF+ and 14LPE were for the early birds; the GPU makers have already been late with FinFET and <20 nm.
I don't know the exact wafer costs and yields for 16FF+, nor for 28 HPP, but surely you can give me the numbers?
The last info I got (and that was quite some time ago) was that the cost per working transistor is about equal to 28 nm. That depends on multiple factors, as you know, but GP104 has fewer transistors, half the area and relatively lower density (if you virtually scale it up to 28 nm) compared to GM200, so I think we can assume the chip itself is at a similar cost level, with the new one getting cheaper as yields improve (a rough back-of-the-envelope version of this comparison is sketched below this post). Then you need a smaller power supply, only 2/3 of the RAM, and the PCB gets simpler and cheaper. OK, you need a few more wires for GDDR5X and the memory itself is a bit pricier, but that only applies to the 1080.

    Originally posted by johnc View Post
    But you don't know what you're talking about.
    Mhm then why don't you teach me? With sources, please.
    Last edited by juno; 31 May 2016, 03:06 AM.
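A minimal sketch (in Python) of the back-of-the-envelope cost comparison described in the post above. The die sizes and transistor counts are the commonly published figures for GM200 (~601 mm², ~8.0 billion transistors) and GP104 (~314 mm², ~7.2 billion transistors); the wafer prices and defect densities are hypothetical placeholders, since, as the post itself notes, the real numbers are not public.

Code:
import math

# Rough dies-per-wafer and yield model for comparing GM200 (28 nm) vs GP104 (16FF+).
# Die areas / transistor counts are published figures; wafer costs and defect
# densities are HYPOTHETICAL placeholders -- the real numbers are not public.

WAFER_DIAMETER_MM = 300.0

def dies_per_wafer(die_area_mm2):
    """Common approximation: usable wafer area minus edge loss."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

chips = {
    # name: (die area mm^2, transistors in billions, assumed wafer cost $, assumed defects/cm^2)
    "GM200 (28nm)":  (601, 8.0, 4000, 0.10),  # mature node: cheaper wafer, low defect density (assumed)
    "GP104 (16FF+)": (314, 7.2, 7000, 0.25),  # new node: pricier wafer, higher early defect density (assumed)
}

for name, (area, btr, wafer_cost, d0) in chips.items():
    dpw = dies_per_wafer(area)
    y = poisson_yield(area, d0)
    cost_per_die = wafer_cost / (dpw * y)
    print(f"{name}: {dpw} dies/wafer, yield {y:.0%}, "
          f"~${cost_per_die:.0f} per good die, ~${cost_per_die / btr:.1f} per billion transistors")

With these placeholder inputs the per-transistor cost of the two dies comes out roughly equal, which is the kind of result the post is arguing for; swap in real wafer prices and defect densities and the conclusion could easily shift either way.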



  • johnc
    replied
    Originally posted by juno View Post
Um, maybe also those that have been using the 16 nm FinFET processes at TSMC for years, producing and selling tens of millions of units and counting, like Apple or Qualcomm?
Times change: due to their volume and huge market penetration, SoCs are the real big deal for foundries today, and they are at the leading edge, not cheap crappy chips anymore. The big GPU manufacturers are quite late to the FinFET party. Others are already preparing their 10 nm products.
    SoC / LP 16nm FF is a completely different process than what is used for the GPUs. Yes, the GPU companies share the R&D burden and re-tooling costs. Do you really think 16nm FF+ is cheaper for nvidia than 28nm was for Maxwell?

Originally posted by juno View Post
I'm not complaining, I'm just invalidating bogus statements.
    But you don't know what you're talking about.



  • juno
    replied
    Originally posted by johnc View Post
Who do you think paid for 16nm FF+ R&D at TSMC? Mary Poppins?
Um, maybe also those that have been using the 16 nm FinFET processes at TSMC for years, producing and selling tens of millions of units and counting, like Apple or Qualcomm?
Times change: due to their volume and huge market penetration, SoCs are the real big deal for foundries today, and they are at the leading edge, not cheap crappy chips anymore. The big GPU manufacturers are quite late to the FinFET party. Others are already preparing their 10 nm products.

    Originally posted by johnc View Post
    Do you think GDDR5X is cheap?
    No, I don't, but you surely have the exact details?

    Originally posted by johnc View Post
    Nvidia is giving you Titan X performance for $400. What are you complaining about?
    I'm not complaining, I'm just invalidating bogus statements.



  • johnc
    replied
    Originally posted by juno View Post
    No, that's just wrong. The 1070/1080 are way cheaper to produce than a Titan X/980 Ti.
    Who do you think paid for 16nm FF+ R&D at TSMC? Mary Poppins? Do you think GDDR5X is cheap?

    Nvidia is giving you Titan X performance for $400. What are you complaining about?



  • juno
    replied
    Originally posted by johnc View Post

    Of course there's a legitimate reason: to make money. People want these products and are willing to pay.

    They're also getting more expensive to make.
Not the kind of reason I was thinking of.
Of course it is legitimate for them to raise prices and widen margins, but IMHO no value could justify that; that's just an opinion, obviously.

    No, that's just wrong. The 1070/1080 are way cheaper to produce than a Titan X/980 Ti.

    Originally posted by bug77 View Post
    This is where you're wrong.
Nvidia puts an insane (well, whatever) number of transistors on their dies, knowing very well the chip will have a slim chance of getting good yields. When they start making them, as expected, the chips have faults; they disable what has to be disabled and sell you those chips. The few fully working chips are binned and sold as Titans (with a price to match). If, down the road, the manufacturing process matures enough and yields improve significantly, you'll get fully working chips used in the x80 Ti line.
But the Titan, for Nvidia, is just a halo product. That's why it's not under the GeForce moniker.
No, you are the one who is wrong, but I see that you just can't accept that. And now we have a huge load of FUD again.
The transistor density Nvidia uses is actually quite low compared to its competitors' products, and it has been for quite some time. Besides, yield doesn't really depend that much on a few million transistors/mm² more or less, but rather on the architectural design of the chips and, of course, the foundry's technology (a quick density comparison from published die sizes is sketched below this post). Actually, AMD's GPUs have a much higher density and their salvage chips are far less cut down. I don't have exact yield numbers at hand, but considering the financial situation of both companies, I don't think AMD has more yield problems, as they simply can't afford to throw many chips away. So as long as you can't give me statistical proof for your statement, your argument is invalid.

You do know that it's not true that the best-binned chips are sold as Titans, right?
Nvidia introduced the Titan label with a salvaged GK110 chip. It was followed up by the 780 Ti and Titan Black with a full GK110, just because Hawaii was too strong. Also, for the Maxwell series there has not been any full GM200 on an x80 Ti series card, nor do I expect one for Pascal. It doesn't make sense in their strategy anymore.
Yes, tell me more about the "not under the GeForce moniker" product that is labelled GeForce GTX Titan, clearly placed in the consumer market and advertised all over as a gaming and consumer product...
    Last edited by juno; 30 May 2016, 08:44 PM.
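A quick sketch (in Python) of the transistor-density comparison made above, using the commonly published die sizes and transistor counts for a few 28 nm and 16FF+ GPUs; these are public spec-sheet figures, not measurements, and only illustrate the relative densities being discussed.

Code:
# Transistor density (millions of transistors per mm^2) from commonly published
# die sizes and transistor counts -- public figures, for illustration only.
gpus = {
    # name: (transistors in billions, die area in mm^2)
    "GM204 (GTX 980, 28nm)":   (5.2, 398),
    "GM200 (Titan X, 28nm)":   (8.0, 601),
    "Hawaii (R9 290X, 28nm)":  (6.2, 438),
    "Fiji (Fury X, 28nm)":     (8.9, 596),
    "GP104 (GTX 1080, 16FF+)": (7.2, 314),
}

for name, (btr, area) in sorted(gpus.items(), key=lambda kv: kv[1][0] * 1000 / kv[1][1]):
    print(f"{name:26} {btr * 1000 / area:5.1f} Mtransistors/mm^2")

With these figures, the AMD 28 nm chips (Hawaii, Fiji) do come out denser than Nvidia's GM204/GM200, which is the comparison the post is referring to.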



  • Kano
    replied
    I am sure this card will be fine for Linux too. I want to see vdpauinfo with Nvidia Pascal...
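For anyone wanting to run the same check once a Pascal card and driver are in hand, here is a minimal sketch in Python (assuming the vdpauinfo tool and NVIDIA's VDPAU driver are installed) that simply filters the codec-related lines out of the tool's output:

Code:
import subprocess

# Run vdpauinfo and print the lines that mention video codec profiles,
# which is where new decode capabilities of a GPU generation show up.
output = subprocess.run(["vdpauinfo"], capture_output=True, text=True).stdout

for line in output.splitlines():
    if any(codec in line for codec in ("MPEG", "H264", "HEVC", "VC1", "VP9")):
        print(line)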



  • johnc
    replied
    Originally posted by juno View Post
It's definitely the "introduction of a new price point"; that's what I've tried to make clear all along. But there is just no legitimate reason that would justify it. And this price point affects the lower cards too.
    Of course there's a legitimate reason: to make money. People want these products and are willing to pay.

    They're also getting more expensive to make.



  • bug77
    replied
    Originally posted by juno View Post
But technically it is a perfectly normal development; there is nothing special about it that would be unexpected or that other products don't have.
    This is where you're wrong.
Nvidia puts an insane (well, whatever) number of transistors on their dies, knowing very well the chip will have a slim chance of getting good yields. When they start making them, as expected, the chips have faults; they disable what has to be disabled and sell you those chips. The few fully working chips are binned and sold as Titans (with a price to match). If, down the road, the manufacturing process matures enough and yields improve significantly, you'll get fully working chips used in the x80 Ti line.
But the Titan, for Nvidia, is just a halo product. That's why it's not under the GeForce moniker.



  • juno
    replied
    Originally posted by erendorn View Post
Err, any statistical method on this graph would classify the two Titans as outliers. Drawing a trend using these points reminds me of people saying global warming stopped in 1998.
I can also draw a trend from 2012 to 2014 which predicts that graphics cards will soon sell for negative prices. Or go the other way around (say, 2014-2015) and estimate that GPUs will soon be sold for two or three k.
You are viewing that from a different perspective. You only look at the actual price. Others also include value and state that the Titan is an outlier technically. But that's not true, and I think I made that quite clear. Read your quoted passage again:

    Originally posted by juno View Post
No, Titans are not outliers; they are the perfectly normal big chip sold at an incredible price.
What would make you think these are outliers from a technical point of view?
Of course I don't disagree on it being an outlier from the pure price perspective.
But technically it is a perfectly normal development; there is nothing special about it that would be unexpected or that other products don't have.

    Originally posted by erendorn View Post
It is certainly possible (quite likely, even) that nvidia will continue selling cards above $1000 from time to time.
But on this graph, the Titans can be considered either as outliers or as a regime change (the introduction of a new price point), but in no way can they be used to support a trend, statistically speaking.
It is not just "likely" anymore when the strategy started a few years ago and has been continued twice since, IMHO. Not selling a >=$1k Titan this generation would already be a paradigm change.
It's definitely the "introduction of a new price point"; that's what I've tried to make clear all along. But there is just no legitimate reason that would justify it. And this price point affects the lower cards too.

