AMD FX-8350 "Vishera" Linux Benchmarks


  • #31
    Originally posted by crazycheese View Post
    Unless you produce your own power, the energy costs will nullify any advantage in less than 6 months if you live in Europe. If you have cheap energy, then it's not relevant.

    The "enthusiasts" thing is complete BS. Its just a label, labels don't mean anything, raw benchmarks do. Labeling yourself "Bruce Lee 2" won't make you one.
    Show me your numbers... Right now....


    Don't spread bullshit.

    Power here in New York is much more expensive than it is in the rest of the USA.
    The CPU uses what? 48W more power?
    Using the PC 8 hours a day...

    48 W × 8 h = 384 Wh more energy per day

    30 days a month...
    384 Wh × 30 days = 11.52 kWh per month

    Cost per kWh here in New York...
    (https://www.nationalgridus.com/niaga...our_charge.asp)
    57 dollars for 1000 kWh (1 MWh)... and that's assuming I use it during the worst time of day.

    Extra electrical cost to use the AMD CPU rather than the Intel CPU per month...
    (11.52 kWh / 1000 kWh) × $56.63 ≈ $0.65

    Whoopie, we saved 65 cents a month.


    Price difference between CPUs...
    AMD FX-8350 = $220 (NewEgg.com)
    Intel 3770K = $320 (NewEgg.com)
    Difference = $100

    How many months is it going to take to pay for the CPU price difference?
    $100 / $0.65 = 153 months

    How many years is that?
    12.75 years

    But wait, there's more... Intel motherboards are generally more expensive than AMD boards as well.
    Maybe an extra 50 dollars. So we need to add another 6 years onto that....

    Of course, I love advanced fab chips, but when companies like Intel intentionally make their chips and motherboards overpriced by so much, any kind of power savings goes flying out the window instantly.
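    For anyone who wants to check the arithmetic, here is a minimal sketch using only the figures quoted above (the 48 W delta, 8 hours/day of use, the ~$0.057/kWh National Grid rate, and the $100 CPU price gap); the rate and prices are this post's assumptions, not universal numbers.

```python
# Break-even sketch using the figures quoted above (all of them assumptions
# from this post): 48 W extra draw, 8 h/day of use, ~$0.057/kWh, $100 price gap.

extra_watts = 48                # extra draw of the FX-8350 under load (W)
hours_per_day = 8
usd_per_kwh = 56.63 / 1000      # National Grid rate quoted above

extra_kwh_per_month = extra_watts * hours_per_day * 30 / 1000   # ~11.52 kWh
extra_usd_per_month = extra_kwh_per_month * usd_per_kwh         # ~$0.65

cpu_price_gap = 320 - 220       # i7-3770K vs FX-8350 (NewEgg prices quoted above)
months_to_break_even = cpu_price_gap / extra_usd_per_month      # ~153 months

print(f"{extra_kwh_per_month:.2f} kWh/month extra -> ${extra_usd_per_month:.2f}/month "
      f"-> {months_to_break_even:.0f} months (~{months_to_break_even / 12:.1f} years) to break even")
```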
    Last edited by Sidicas; 23 October 2012, 09:05 AM.



    • #32
      Originally posted by necro-lover View Post
      heise.de: AMD's FX-8350 125 W TDP is a fake number - 168 watts measured

      http://www.heise.de/newsticker/meldu...i-1734298.html

      AMD is just trying to fool us.
      I don't think you're understanding what you're reading.
      Peak power is worthless for anything other than determining what power supply to buy...
      You cannot use peak power to make comments about how much power the PC uses over *ANY* period of time.
      TDP, by definition, is the sustained power the CPU is expected to draw over a period of time, not a momentary peak.


      Legitreviews says that the average load power of a PC with the 8350 is 219 watts, compared to 163 watts for the Intel chip, while running CPU benchmarks - which is roughly what the TDP difference between the chips predicts. Running games, the difference is even less.
      Source link here...


      You know it takes a *LOT* of power to turn over a car engine, but that doesn't mean you drive your car with the starter running 24/7. Peak power is an almost worthless measurement, especially for a CPU that can ramp up and down within time frames as small as a microsecond.
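      To make the peak-vs-average distinction concrete, here is a minimal sketch; the duty cycle and wattages are assumptions for illustration, not measurements from the article. Daily energy is set by the average draw, so brief 168 W peaks barely move the bill.

```python
# Illustrative sketch: daily energy depends on the average draw, not on brief
# peaks. The duty cycle and wattages below are assumptions, not measurements.

peak_watts = 168      # brief worst-case draw (the heise-style peak figure)
typical_watts = 125   # sustained draw near the rated TDP
peak_fraction = 0.02  # assume peaks occur ~2% of the time (hypothetical)

avg_watts = peak_fraction * peak_watts + (1 - peak_fraction) * typical_watts
kwh_per_8h_day = avg_watts * 8 / 1000

print(f"average draw ~{avg_watts:.1f} W -> {kwh_per_8h_day:.3f} kWh per 8-hour day")
```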
      Last edited by Sidicas; 23 October 2012, 09:03 AM.



      • #33
        Pay 45% more for a 3770K CPU that gives nearly the same performance (and is slower in some cases) just to save a few dozen watts? No thanks, Intel.



        • #34
          Originally posted by Sidicas View Post
          Show me your numbers... Right now....
          57 dollars for 1000kWh (1MWh).... and that's assuming I use it during the worst time of day.
          But wait, there's more... Intel motherboards are generally more expensive than AMD boards as well.
          Maybe an extra 50 dollars more. So we need to add another 6 years onto that....
          I pay $200 for 1000 kWh (excluding 20% tax).
          A $50 motherboard supports the Intel 3770K.



          • #35
            Originally posted by Sidicas View Post
            Show me your numbers... Right now....


            Don't spread bullshit.

            ...

            How many years is that?
            12.75 years

            But wait, there's more... Intel motherboards are generally more expensive than AMD boards as well.
            Maybe an extra 50 dollars more. So we need to add another 6 years onto that....

            Of course, I love advanced fab chips, but when companies like Intel intentionally make their chips and motherboards overpriced by so much, any kind of power savings goes flying out the window instantly.
            Nice attempt, but it's flawed. You assume both the Intel and the AMD chip take the same time to complete the tasks at hand. They don't; the AMD chip will burn power for longer in most cases.
            You also simplified by assuming continuous full power; if we consider a mix of idle, light load and heavy load, we get something more realistic (and quite probably beyond the arithmetic anyone on this forum is willing to do).
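            As a rough illustration of that point, here is a sketch in which the faster chip finishes the same job sooner and then idles; all wattages and runtimes are hypothetical placeholders, not benchmark results.

```python
# Hypothetical sketch of the "same task, different runtime" point above.
# None of these wattages or runtimes are measurements; they only illustrate
# that a slower chip burns its (higher) load power for longer.

def daily_energy_kwh(load_watts, load_hours, idle_watts, day_hours=8):
    """Energy for a fixed task plus the idle remainder of the day."""
    idle_hours = day_hours - load_hours
    return (load_watts * load_hours + idle_watts * idle_hours) / 1000

fast_chip = daily_energy_kwh(load_watts=165, load_hours=2.0, idle_watts=65)  # finishes sooner
slow_chip = daily_energy_kwh(load_watts=220, load_hours=2.5, idle_watts=70)  # runs longer

print(f"fast chip: {fast_chip:.2f} kWh/day, slow chip: {slow_chip:.2f} kWh/day, "
      f"delta: {slow_chip - fast_chip:.2f} kWh/day")
```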



            • #36
              Let me see if I understand correctly.

              i7 3770 is essentially Intel's flagship processor, and the FX-8350 beats it in some benchmarks? Even if it does lose overall (which it does) and gets trounced in others (which it also does), this is excellent news for AMD. Coupled with a much better GPU, it's a viable option again.



              • #37
                BTW, props for a good article. I'm often critical of poor reporting on Phoronix, but articles like this offer real value!



                • #38
                  Originally posted by pingufunkybeat View Post
                  Let me see if I understand correctly.

                  i7 3770 is essentially Intel's flagship processor, and the FX-8350 beats it in some benchmarks? Even if it does lose overall (which it does) and gets trounced in others (which it also does), this is excellent news for AMD. Coupled with a much better GPU, it's a viable option again.
                  Indeed, this does put AMD in a better position than before. But, and this is a big but, AMD's wins are based on a higher clock speed, higher TDP and a die which is twice the size of Ivy Bridge. The latter means that while AMD may be pricing these aggressively, their profits are still hurting.



                  • #39
                    Originally posted by bug77 View Post
                    Indeed, this does put AMD in a better position than before. But, and this is a big but, AMD's wins are based on a higher clock speed, higher TDP and a die which is twice the size of Ivy Bridge. The latter means that while AMD may be pricing these aggressively, their profits are still hurting.
                    This is true, but it also shows that the new architecture is not the failure it originally seemed, and that once the growing pains are taken care of, they might have a competitive platform again.

                    I'm guessing that moving to a smaller process is really what they need now. How realistic this is, I don't know. Didn't GlobalFoundries launch a 28nm fab this year?



                    • #40
                      Originally posted by Sidicas View Post
                      Show me your numbers... Right now....

                      Don't spread bullshit.

                      The CPU uses what? 48W more power?
                      Using the PC 8 hours a day...

                      ...

                      Whoopie, we saved 65 cents a month.

                      ...

                      Of course, I love advanced fab chips, but when companies like Intel intentionally make their chips and motherboards overpriced by so much, any kind of power savings goes flying out the window instantly.
                      110 W vs 223 W system consumption without a graphics card.
                      That's a 113 W difference, or 0.113 kWh more for every hour at full load.
                      Cheapest electricity in Europe = €0.25 per kWh ≈ $0.50 per kWh.

                      Using the PC 8 hours/day (assuming 2 hours at full load and 6 hours idle): ~0.226 kWh/day extra.

                      2 hours load every day for six months: 0.113 × 2 × 30 × 6 × 0.25 ≈ €10 extra ≈ $20 extra in six months
                      or
                      8 hours load every day for six months (your model): 0.113 × 8 × 30 × 6 × 0.25 ≈ €40 extra ≈ $80 extra in six months
                      or
                      24/7 full load for six months: 0.113 × 24 × 30 × 6 × 0.25 ≈ €122 extra ≈ $244 extra in six months.
                      In that last case, the Intel config pays about $244 and the AMD config about $488 for electricity over six months.

                      This assumes idle consumption is the same - and there we see very good progress, kudos to AMD! Because the AMD machine idles at only ~10% more than the Intel one (which is fine), we discard idle running costs and count only the extra full-load costs.

                      The previous generation (which I have - an Athlon II X4) had a HUGE idle consumption difference (120 W compared to 60 W).
                      Lowering idle power to roughly the competition's level matters more in the desktop market: a system that constantly draws 60 W more whenever it is on hurts far more than one that draws 60 or 100 W more only under full load.
                      For the high-performance market it's the opposite - there, full-load efficiency plays the vital role, and AMD is currently not shining...

                      This assumes a PSU with a power factor of 1 and 100% efficiency; with a realistic Bronze or Silver unit, add roughly 20% to these costs.

                      And it does not include the GPU/APU, since we assume those are the same, of course.

                      Also, a small unrelated note: higher clock speeds and higher power consumption mean more wear. AMD needs higher clocks and more power for the same performance, so an AMD system is more likely to fail first, assuming equal build quality and equally mature technology.
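                      For reference, a minimal sketch of the three scenarios above, using only this post's figures (the 113 W full-load delta, €0.25/kWh, 30-day months, and the post's rough €1 ≈ $2 conversion):

```python
# The three load scenarios above, using the post's own figures:
# a 113 W full-load delta, EUR 0.25/kWh, 30-day months, and the post's
# rough 1 EUR ~= 2 USD conversion. These are assumptions, not measurements.

delta_kw = 0.113     # extra full-load draw of the AMD system (kW)
eur_per_kwh = 0.25

def extra_cost_eur(load_hours_per_day, months=6):
    return delta_kw * load_hours_per_day * 30 * months * eur_per_kwh

for hours in (2, 8, 24):
    eur = extra_cost_eur(hours)
    print(f"{hours:>2} h/day at full load: ~{eur:.0f} EUR (~{eur * 2:.0f} USD) extra over six months")
```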

                      Originally posted by Sidicas View Post
                      57 dollars for 1000kWh
                      $0.057 per kWh ≈ €0.029 per kWh.
                      That's criminally cheap, or wrong. Or both.
                      Last edited by crazycheese; 23 October 2012, 11:01 AM.

