
Thread: AMD Releases FX-Series Bulldozer Desktop CPUs

  1. #71
    Join Date
    Nov 2007
    Posts
    1,353

    Default

    Quote Originally Posted by Kivada View Post
    Nope, "mindshare" means nothing, marketing to consumers means nothing; all that matters is selling parts to OEMs. You know: Dell, HP/Compaq, Acer, eMachines, Sony, Lenovo, Toshiba, Hitachi, Asus, MSI. The box makers that sell ridiculous amounts of machines with Windows plus crapware in big-box stores.

    Consumers don't pick a computer based on its hardware; they do so based on its price. They want sub-$500 machines that use maybe $200 worth of hardware at best, so burning money trying to advertise to consumers means very little, since they're just going to buy whatever the snot-nosed kid in a blue shirt with a yellow name tag puts them in front of, because that's what his manager tells him to do, as the store may make a higher margin off its sale.

    This is why you still see machines being sold as new to this day based on the Nvidia 6150 chipset.
    Let's just say I disagree heartily and leave it at that... The past may not be a mirror for the future, but it certainly has an uncanny resemblance.

    And you're exactly right that AMD's customers are not you and me; they are the OEMs and channel partners. But they are going to buy what they feel is best for THEIR customers... and their customers simply buy what is available. That is my point about mindshare...

    Oh, and the reason the 6150 is still being sold today is nothing more than that there is still stock. That's Nvidia's fault and no one else's.

    EDIT: If AMD wants to improve its product lineups, it needs to get its marketing guys off the engineering floor and onto the marketing floor. And instead of targeting consumers, they need to target their customers. If they want to improve consumer interest, they need to build a word-of-mouth campaign... As the situation stands right now, when some kid's mom says, "Hey son, what do you think about this computer?", he's going to look at it and say, "Oh, that has a Bulldozer processor, don't even think about it."

    And that is what mindshare does. It's far more pervasive and powerful than people think. A word-of-mouth campaign is powerful, but it needs a product worthy of supporting it.

    I'll simply point you to marketshare trends in the months following new product releases. If you look at the line graphs, you'll see a clear relationship between marketshare and whether a product was considered to perform poorly or well. That is the power of mindshare.
    Last edited by duby229; 10-13-2011 at 12:34 PM.

  2. #72
    Join Date
    Apr 2010
    Location
    France
    Posts
    16

    Default

    Quote Originally Posted by mcirsta View Post
    Thanks a lot, I think this perfectly illustrates what a bad CPU the new Bulldozer really is. It's just the best review I've seen so far. I've decided to just go out and buy a good old Phenom II X4 before it gets more expensive, once people realize what a good value it is.
    Oh well, as they say, better luck next time.
    Looking at the figures, I still can't understand why AMD even created Bulldozer. They could have just built a 32 nm Phenom II; even an 8-core one would have had fewer transistors, and those would have been 8 full cores. With some small improvements added, it would have been way better than this. If they had added hyperthreading to that... an almost perfect CPU.
    I don't think it is such a bad CPU. They added a lot of features compared to the Phenoms: SSE4.1, SSE4.2, AVX, XOP, FMA4, power gating, and all of that adds transistors. The transistor count doesn't surprise me compared to Thuban, but it does compared to Sandy Bridge, which manages equal or better performance with half the transistors. It may also be that the software used for testing didn't enable these new instructions, because no AMD processor supported them before.

    Still, in my opinion we shouldn't watch the transistor count, but the other variables that depend on it: power consumption, price, and performance. Zambezi's idle power consumption is better than Thuban's and only slightly higher under full load, but who taxes a CPU all day long anyway? Single-threaded performance is a bit lower, but when multithreading enters the room, Zambezi wins. I have to admit, though, that the price is too high for Zambezi to really be competitive.

    In the end it's definitely a more modern CPU than Thuban, even if its performance doesn't obviously reflect it. It is a good foundation for Piledriver. If Piledriver really improves IPC by 10% and they are finally able to reach higher frequencies, then we might have a very good CPU.

  3. #73
    Join Date
    Oct 2011
    Posts
    2

    Default

    There is one French review of the FX where they used the Phoronix Test Suite (PTS in their table):
    http://www.pcinpact.com/articles/amd-fx-8150/420-5.htm

  4. #74
    Join Date
    Feb 2009
    Posts
    5

    Default

    As pointed out on an Italian forum (hwupgrade, in a post from bjt2, for the Italian guys reading here), BD probably has two big problems:
    the 32 nm process (Llano is a K10 and only reaches 2.9 GHz) and the branch prediction.
    BD was designed for high clocks that the current production process cannot reach, so on this problem AMD is in the hands of GlobalFoundries.
    As for the branch prediction, I don't know if it can be solved with some revisions or if a new design is needed (BD2?).

    P.S. Sorry for my bad English

  5. #75

    Default

    Quote Originally Posted by duby229 View Post
    Let's just say I disagree heartily and leave it at that... The past may not be a mirror for the future, but it certainly has an uncanny resemblance.

    And you're exactly right that AMD's customers are not you and me; they are the OEMs and channel partners. But they are going to buy what they feel is best for THEIR customers... and their customers simply buy what is available. That is my point about mindshare...

    Oh, and the reason the 6150 is still being sold today is nothing more than that there is still stock. That's Nvidia's fault and no one else's.

    EDIT: If AMD wants to improve its product lineups, it needs to get its marketing guys off the engineering floor and onto the marketing floor. And instead of targeting consumers, they need to target their customers. If they want to improve consumer interest, they need to build a word-of-mouth campaign... As the situation stands right now, when some kid's mom says, "Hey son, what do you think about this computer?", he's going to look at it and say, "Oh, that has a Bulldozer processor, don't even think about it."

    And that is what mindshare does. It's far more pervasive and powerful than people think. A word-of-mouth campaign is powerful, but it needs a product worthy of supporting it.

    I'll simply point you to marketshare trends in the months following new product releases. If you look at the line graphs, you'll see a clear relationship between marketshare and whether a product was considered to perform poorly or well. That is the power of mindshare.

    You are giving far too much credit to massive multinational megacorp OEMs that do everything in their power to cut corners at all costs. Just look at the whole "Vista Ready" debacle: incredibly old hardware with at most 512 MB of RAM still being sold as new, causing heaps of hate mistakenly placed on MS. Not that Vista wasn't a turd sandwich, but a lot of its problems were sorted with 2 GB of RAM and a dual-core 64-bit CPU.

    As for the 6150, I don't buy that. It's a woefully outdated chipset that, if I'm not mistaken, has, along with its GeForce 7-derived 7050 sibling, been put on a life-support-only driver set, while the 8200/ION and 9200/ION2 chipsets are compatible with the same hardware and offer much better performance across the board, with updated video codec acceleration, Flash scaling acceleration, and a boost to 3D that makes mass-market games like The Sims playable. So no, I don't believe that marketing plays any role at all, especially with OEMs, as they only want to cut another penny out of the cost, be it with crappy mobos, shitty PSUs, janky RAM, or the lives of the people they hire in third-world countries, whose working conditions are so bad they feel death is their only choice.
    Last edited by Kivada; 10-13-2011 at 11:07 PM.

  6. #76
    Join Date
    Dec 2008
    Posts
    315

    Default

    Quote Originally Posted by Kivada View Post
    You are giving far too much credit to massive multinational megacorp OEMs that do everything in their power to cut corners at all costs. Just look at the whole "Vista Ready" debacle: incredibly old hardware with at most 512 MB of RAM still being sold as new, causing heaps of hate mistakenly placed on MS. Not that Vista wasn't a turd sandwich, but a lot of its problems were sorted with 2 GB of RAM and a dual-core 64-bit CPU.
    No, Vista was the first OS with a GPU scheduler. It sucked so bad because it's a latency-slow turd-sandwich nightmare, not just a regular turd sandwich. The problems weren't sorted by 2 GB of RAM and dual-core 64-bit CPUs; they were sorted by throwing eight times more processing power at the turd-sandwich scheduler.

    Windows 7 is supposed to be a whole lot better. It at least manages sub-100-microsecond latencies with modern overclocked CPUs and GPUs, something Vista probably can't manage to this day. The problem with both Vista and Windows 7 is that when there isn't a turd-sandwich driver for the turd-sandwich hardware, it runs home to momma and runs whatever function it can't do in silicon off a software rasterizer. So Windows Vista pretty much ate your CPU for the first year it was out, Windows 7 for the first nine months, and now it just eats your CPU whenever new hardware comes out or some new game engine starts using features the driver people haven't gotten around to fixing just right yet.

    Which is why I still run XP: because 5-7 microsecond latency beats the crap out of 80-120 microsecond latency, and I don't give one rat's ass about DirectX 10 or 11. DirectX 10 is pretty much nothing, 10.1 does lowball tessellation, and 11 won't work worth a crap until 14 or maybe 20 nm GPUs. And we were supposed to get 28 nm GPUs in 2010, I mean 2011, I mean 2012.
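    Those latency figures are easy enough to sanity-check yourself. A minimal sketch (GNU coreutils `date` on Linux assumed): ask for a 1 ms sleep and report how far past the deadline the process actually woke up. It's a crude proxy for scheduler wakeup latency, not a substitute for a proper tool like cyclictest.

    ```shell
    # Crude wakeup-latency probe: request a 1 ms sleep, measure the overshoot.
    start=$(date +%s%N)                              # nanoseconds before sleeping
    sleep 0.001                                      # ask the scheduler for 1 ms
    end=$(date +%s%N)                                # nanoseconds after waking
    overshoot_us=$(( (end - start) / 1000 - 1000 ))  # elapsed us minus the 1000 us requested
    echo "wakeup overshoot: ${overshoot_us} us"
    ```

    Run it a few hundred times under load if you want a distribution rather than a single sample; the worst-case numbers are what matter for the latency complaints above.
    
    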

  7. #77
    Join Date
    Sep 2011
    Posts
    101

    Default

    Quote Originally Posted by alexThunder View Post
    I'm planning to get one of those. Can anyone recommend a good AM3+ board (i.e. with fully supported onboard-audio) for Linux? Right now I have an Asus Striker Extreme (plz don't ask me why ), which actually works fine, but I always have some sound issues :/
    Unfortunately, the onboard audio is crappy on every new(er) motherboard for Linux, and even on some in Windows.
    You're much better off getting a cheap (older) used PCI sound card if you really want quality; some of these actually have really good op-amps and much better, more stable drivers.
    I have generally had better quality and results with those than with the absolute shite onboard audio that ALL motherboard manufacturers stick on their new boards.
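    If you go that route, a quick way to see which audio hardware the kernel and ALSA actually registered (a sketch assuming a typical Linux box; `/proc/asound` needs ALSA loaded and `lspci` needs pciutils installed, so both are guarded with fallbacks):

    ```shell
    # Sound cards ALSA has registered (file absent means no ALSA driver loaded):
    if [ -r /proc/asound/cards ]; then cat /proc/asound/cards; else echo "no ALSA cards found"; fi
    # PCI audio devices as the kernel enumerates them; a discrete card shows up here too:
    pci_audio=$(lspci 2>/dev/null | grep -i audio || echo "no PCI audio found")
    echo "$pci_audio"
    ```

    A usable discrete card should appear both in `/proc/asound/cards` (meaning a driver bound to it) and in the `lspci` list; PCI-only in the output usually means the kernel sees the card but no driver claimed it.
    
    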
    Last edited by scjet; 10-14-2011 at 05:19 AM.

  8. #78
    Join Date
    Sep 2011
    Posts
    101

    Default

    Now I'm just quoting almost every review out there, like AnandTech, ...
    But they all make it sound like Bulldozer is a confirmed flop.

    I don't mind if AMD is a bit behind Intel, because AMD's price (bang for buck) ALWAYS made up for it. But this time, apparently, my Phenom II X6 1090T is both better and cheaper than the newest Bulldozer. WTF!?
    Well, AMD had better be making some money with their GPUs, Llanos, and hopefully Trinitys, because this desktop-CPU decision for me is now over.

    Unless, of course, Michael can get a good sample Bulldozer from AMD.
    Last edited by scjet; 10-14-2011 at 05:15 AM.

  9. #79
    Join Date
    Oct 2007
    Posts
    912

    Default

    Quote Originally Posted by scjet View Post
    Now I'm just quoting almost every review out there, like AnandTech, ...
    But they all make it sound like Bulldozer is a confirmed flop.

    I don't mind if AMD is a bit behind Intel, because AMD's price (bang for buck) ALWAYS made up for it. But this time, apparently, my Phenom II X6 1090T is both better and cheaper than the newest Bulldozer. WTF!?
    Well, AMD had better be making some money with their GPUs, Llanos, and hopefully Trinitys, because this desktop-CPU decision for me is now over.

    Unless, of course, Michael can get a good sample Bulldozer from AMD.
    We'll see about Bulldozer on servers before calling it a flop - but it's definitely looking bad on the desktop. There's some hope of performance improvements (new architecture, so drivers/schedulers/whatever still need tweaking for it), but otherwise for a desktop wait for the next set of revisions (aka Trinity).
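    On the scheduler point: Bulldozer pairs two integer cores per module, sharing a front end and FPU, so until OS schedulers learn that topology you can experiment with pinning one thread per module yourself. A sketch using standard Linux tools (`lscpu` and `taskset`, both from util-linux); note the 0,2,4,6 mapping is what FX-8150 systems are commonly reported to expose, so verify it on your own machine before trusting it:

    ```shell
    # Inspect the topology first (on an FX-8150 the 8 "cores" are 4 modules x 2):
    lscpu 2>/dev/null | grep -E 'Thread|Core|Socket' || true
    # CPUs 0,2,4,6 typically land on distinct modules, so for a 4-thread job:
    #   taskset -c 0,2,4,6 ./benchmark        (benchmark name is a placeholder)
    # Demo with a trivial command so this sketch runs anywhere:
    if command -v taskset >/dev/null; then taskset -c 0 echo "pinned run"; else echo "pinned run"; fi
    ```

    If one-thread-per-module pinning measurably beats letting the scheduler pack two threads into one module, that's the shared-frontend contention the reviews keep running into.
    
    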

  10. #80
    Join Date
    Oct 2011
    Posts
    2

    Default

    Quote Originally Posted by Dresdenboy View Post
    There is one French review of the FX where they used the Phoronix Test Suite (PTS in their table):
    http://www.pcinpact.com/articles/amd-fx-8150/420-5.htm
    Ah, I see. This was already linked one page back.

    Anyway - recompiled code seems to show a significant advantage for BD. Same in HT4U's cray test (German).
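    That recompile advantage is plausible: GCC 4.6+ has a Bulldozer target, so flags like -march=bdver1 let the compiler emit the new AVX/XOP/FMA4 instructions that prebuilt binaries never use. A minimal sketch of rebuilding a hot loop; the saxpy.c file here is just a stand-in workload, and the bdver1 flag only runs on an actual FX chip, so the sketch compiles generically:

    ```shell
    # Write a trivial FP-heavy workload to rebuild:
    cat > saxpy.c <<'EOF'
    #include <stdio.h>
    int main(void) {
        float x[1024], y[1024];
        for (int i = 0; i < 1024; i++) { x[i] = i; y[i] = 2.0f * i; }
        for (int i = 0; i < 1024; i++) y[i] += 3.0f * x[i];   /* y = 3x + y */
        printf("%.0f\n", y[1023]);   /* 2*1023 + 3*1023 = 5115 */
        return 0;
    }
    EOF
    # On an FX chip you would use:  gcc -O2 -march=bdver1 saxpy.c -o saxpy
    # Generic build so the sketch runs on any machine:
    gcc -O2 saxpy.c -o saxpy
    ./saxpy
    ```

    Comparing the -O2 binary against the -march=bdver1 one on FX hardware is the recompile effect those tests are measuring.
    
    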

    BD has strict rules for instruction grouping, which might lower decode bandwidth to 1-2 instructions per cycle, not only limiting the performance of one thread but also indirectly reducing decode throughput for the second thread.

    And if AMD includes their "Branch Redirect Recovery Cache" (a µop buffer) in Trinity, this might even help legacy code once it gets past the decode-stage bottleneck.
