AMD FX-8350 "Vishera" Linux Benchmarks


  • #11
    Originally posted by necro-lover View Post
    that's just cheating in benchmarks, because you only get the result with some nasty tricks, i.e. cheat cooling solutions.

    John Doe's cooling solution at home will never get any good result.

    But hey, that's the fake world we live in.

    From my point of view this is a ~150 watt TDP CPU, and the Intel 3770K is a ~100 watt TDP CPU.

    Now you can save money on the CPU and then pay a higher power bill, or you spend more money on the CPU and save on the power bill; in the end you pay exactly the same.

    There is no competition at all!
    No, a 125 Watt TDP CPU is a 125 Watt TDP CPU. It's part of the specification: the CPU will be fine with a 125 Watt cooling solution, and both AMD and Intel guarantee stable operation at the base frequency. Turbo *always* depends on the exact setup the CPU is operating in, and yes, this uncertainty in benchmarking was already criticized back when Intel introduced Turbo a few years ago. Also note that Turbo usually works fine with the stock cooling solution provided in boxed sets, Intel or AMD. You usually cannot do worse than that.
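
    The quoted "save on the CPU, pay it back on the power bill" argument can be sketched as a simple break-even calculation. All prices, wattages, and the electricity rate below are made-up assumptions for illustration, not measured figures:

```python
# Hypothetical sketch of the "cheaper CPU vs. higher power bill" trade-off.
# Prices, wattages, and the EUR/kWh rate are assumptions, not measurements.

def total_cost(cpu_price_eur, avg_load_watts, hours_per_day, years,
               eur_per_kwh=0.25):
    """CPU purchase price plus the electricity cost of its average draw."""
    kwh = avg_load_watts / 1000 * hours_per_day * 365 * years
    return cpu_price_eur + kwh * eur_per_kwh

# Assumed figures: FX-8350 at 195 EUR drawing ~150 W under load,
# i7-3770K at 300 EUR drawing ~100 W, 4 hours of load per day, 3 years.
amd   = total_cost(195, 150, 4, 3)   # 359.25 EUR
intel = total_cost(300, 100, 4, 3)   # 409.50 EUR
print(f"AMD total:   {amd:.2f} EUR")
print(f"Intel total: {intel:.2f} EUR")
```

    Under these assumed numbers the two do not come out exactly equal; the break-even point shifts with usage hours and the electricity rate.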

    Comment


    • #12
      Originally posted by necro-lover View Post
      heise.de: AMD's FX-8350 125 watt TDP is a pure fake number, 168 watts measured

      http://www.heise.de/newsticker/meldu...i-1734298.html

      AMD is just trying to fool us.
      I guess you might not be too smart if you do not understand that the measured power refers to the total system and not just the CPU itself.

      Comment


      • #13
        Originally posted by sonnet View Post
        I guess you might not be too smart if you do not understand that the measured power refers to the total system and not just the CPU itself.
        No, this is measured on the 12 Volt CPU supply line. It includes only the CPU and the voltage regulators, so not *everything* is CPU power consumption (the voltage regulators do not operate at 100% efficiency), but certainly most of it is.
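
        To put rough numbers on that point: if the 168 W figure was taken on the 12 V CPU line, the CPU itself sees that reading minus the voltage-regulator (VRM) losses. The efficiency range below is an assumption; real VRM efficiency varies with load and board design:

```python
# Rough sketch: what a 168 W reading on the 12 V CPU supply line could mean
# for the CPU itself, after subtracting voltage-regulator (VRM) losses.
# The 80-90% efficiency range is an assumption for illustration.

measured_w = 168.0  # heise.de figure, measured on the 12 V CPU supply line

for vrm_efficiency in (0.80, 0.85, 0.90):
    cpu_w = measured_w * vrm_efficiency
    print(f"VRM at {vrm_efficiency:.0%}: ~{cpu_w:.0f} W reaching the CPU")
```

        Even at the low end of that assumed range, the implied CPU draw stays well above the 125 W TDP, which is why the measurement caused the argument in the first place.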

        Comment


        • #14
          Originally posted by SavageX View Post
          No, this is measured on the 12 Volt CPU supply line. It includes only the CPU and the voltage regulators, so not *everything* is CPU power consumption (the voltage regulators do not operate at 100% efficiency), but certainly most of it is.
          don't even try to talk smart to a human who uses the word "smart" to insult another human:

          sonnet: "I guess you might not be too smart if you do not understand that the measured power refers to the total system and not just the CPU itself."

          FAIL... hey, maybe I'm not as smart as you expect, but in fact you are stupid! (this is a statement of fact, not an insult)

          Comment


          • #15
            Originally posted by SavageX View Post
            No, a 125 Watt TDP CPU is a 125 Watt TDP CPU. It's part of the specification: the CPU will be fine with a 125 Watt cooling solution, and both AMD and Intel guarantee stable operation at the base frequency. Turbo *always* depends on the exact setup the CPU is operating in, and yes, this uncertainty in benchmarking was already criticized back when Intel introduced Turbo a few years ago. Also note that Turbo usually works fine with the stock cooling solution provided in boxed sets, Intel or AMD. You usually cannot do worse than that.
            The real point is that the Intel Core i7-3770K has a TDP of 77 W and a turbo peak load of maybe 100 watts.
            In fact the Intel CPU is the "better" CPU; it also clocks at 3.5 GHz vs 4-4.6 GHz on the AMD side.
            Imagine what happens if Intel clocks its CPU at 4 GHz and a ~125 watt TDP...

            AMD's 8350 only wins some benchmarks because this CPU burns energy away like a barbecue broiler.

            Comment


            • #16
              While I don't really want to get into the turd fight here: FTA, they claim the voltage was measured between the CPU and the 12 V line coming from the PSU.

              Assuming the PSU has only a single rail (which is quite common again, as the dual-rail thing was put to rest not too long ago), a motherboard typically has 2 connectors:

              The ATX connector, carrying various voltages for overall motherboard usage.

              The ATX12V connector (I believe it is called), carrying nothing but GND and 12 V.

              PCI-e can ... well, let me just quote Wikipedia:
              PCI Express Graphics 1.0 (PEG) cards may increase power (from slot) to 75 W after configuration (3.3 V/3 A + 12 V/5.5 A).[9] PCI Express 2.1 increased the power output from an x16 slot to 150 W
              I have serious doubts that all that power is routed through the default ATX connector. I believe it is quite safe to say that the ATX connector is used for the 10 watts and 25 watts allowed for non-graphics-card connections, e.g. sound, network etc. that tap into the PCI-e power.


              Now let's assume that there is only 1 graphics card installed in the system. Also assume that it draws only PCI-e slot power, 75 watts. Combine that with the 125 watt TDP of the CPU and you are already at 200 watts if you max it all out. 200 W @ 12 V is 16.7 amps. That's a lot of juice going through that regular ATX connector.

              So I think it is quite safe to assume that both graphics and CPU power are being routed through that ATX12V connector. The 3.3 V/3 A is then probably handled by the same power unit on the motherboard that supplies the CPU with its 3.3 V (I/O still uses 3.3 V, I believe; I could be wrong). And if it doesn't, those 3 amps can still come from ATX.

              Even using an integrated graphics card won't fix those results, as it's very likely the same power rail is being tapped internally.

              Based on that assumption, I think it is quite safe to say that the article could be using the wrong testing methodology.

              P.S. Please don't compare CPUs on a clock-per-clock basis. This isn't an accurate measurement whatsoever. It never was (though it was somewhat related in the past) and it never will be (Not Ever).
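
              The amperage reasoning above is just P = V × I rearranged; a minimal sketch, using the 125 W TDP and 75 W PCI-e slot budget cited in the post:

```python
# Sanity-check of the amperage reasoning: current = power / voltage.
# 125 W CPU TDP and 75 W PCI-e slot budget are the figures from the post.

def amps(watts, volts=12.0):
    return watts / volts

cpu_w, pcie_slot_w = 125.0, 75.0
total_w = cpu_w + pcie_slot_w          # 200 W worst case
print(f"{total_w:.0f} W @ 12 V -> {amps(total_w):.1f} A")  # ~16.7 A
```

              That is why routing both loads through the plain ATX connector looks implausible, and the dedicated ATX12V connector the more likely path.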

              Comment


              • #17
                @oliver But they said up to 168 watts, not all the time. So it's unclear whether this was Turbo Core, some strange spike, or even overclocking (Intel-sponsored test).

                It would be really great if anyone with the equipment and skills could re-test this.

                Comment


                • #18
                  Originally posted by oliver View Post
                  While I don't really want to get into the turd fight here: FTA, they claim the voltage was measured between the CPU and the 12 V line coming from the PSU.
                  Assuming the PSU has only a single rail (which is quite common again, as the dual-rail thing was put to rest not too long ago), a motherboard typically has 2 connectors:
                  The ATX connector, carrying various voltages for overall motherboard usage.
                  The ATX12V connector (I believe it is called), carrying nothing but GND and 12 V.
                  PCI-e can ... well, let me just quote Wikipedia:
                  I have serious doubts that all that power is routed through the default ATX connector. I believe it is quite safe to say that the ATX connector is used for the 10 watts and 25 watts allowed for non-graphics-card connections, e.g. sound, network etc. that tap into the PCI-e power.
                  Now let's assume that there is only 1 graphics card installed in the system. Also assume that it draws only PCI-e slot power, 75 watts. Combine that with the 125 watt TDP of the CPU and you are already at 200 watts if you max it all out. 200 W @ 12 V is 16.7 amps. That's a lot of juice going through that regular ATX connector.
                  So I think it is quite safe to assume that both graphics and CPU power are being routed through that ATX12V connector. The 3.3 V/3 A is then probably handled by the same power unit on the motherboard that supplies the CPU with its 3.3 V (I/O still uses 3.3 V, I believe; I could be wrong). And if it doesn't, those 3 amps can still come from ATX.
                  Even using an integrated graphics card won't fix those results, as it's very likely the same power rail is being tapped internally.
                  Based on that assumption, I think it is quite safe to say that the article could be using the wrong testing methodology.
                  P.S. Please don't compare CPUs on a clock-per-clock basis. This isn't an accurate measurement whatsoever. It never was (though it was somewhat related in the past) and it never will be (Not Ever).
                  Oh man... you're not dealing here with some website like Phoronix, you're dealing with the German heise.de/C't.

                  http://de.wikipedia.org/wiki/Heise_online
                  http://de.wikipedia.org/wiki/C%E2%80%99t

                  These people are among the most advanced experts in Germany.

                  Intel lost the European anti-trust lawsuit and paid 1.5 billion euros because of a statement by these heise.de/C't experts.

                  so in fact you are a FOOL if you think you can mess with these kinds of experts.

                  Comment


                  • #19
                    Well, it's a slight improvement over Bulldozer, and we knew already that wasn't competitive. Even AMD's CEO acknowledges that. So, no surprise there.

                    Comment


                    • #20
                      Originally posted by necro-lover View Post
                      so in fact you are a FOOL if you think you can mess with these kinds of experts.
                      This is freely translated from http://www.gamestar.de/hardware/proz...3006018,3.html
                      368 watts under heavy load
                      the consumption of 368 watts for the FX-8350 test system is lower than the 372 watts of the FX-8150, even though the new one runs at 4.0 GHz instead of 3.6 GHz
                      So the FX-8150 uses 172 watts? I wonder why nobody mentioned that in the past.

                      Seriously: I'm sure something went wrong at heise. It may just be that they got a faulty CPU by accident.
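
                      The sarcasm above can be made explicit with the figures already quoted: if heise's per-CPU number were right for the FX-8350, the same subtraction applied to gamestar's system-level readings would imply an equally implausible figure for the FX-8150:

```python
# Reductio from the figures quoted above: heise's 168 W claim for the
# FX-8350 alone vs. gamestar's whole-system readings for both CPUs.

fx8350_cpu_w    = 168  # heise.de claim for the FX-8350 alone
fx8350_system_w = 368  # gamestar figure, whole FX-8350 test system
fx8150_system_w = 372  # gamestar figure, whole FX-8150 test system

# If the FX-8350 really drew 168 W, the rest of the system drew 200 W...
rest_of_system_w = fx8350_system_w - fx8350_cpu_w
# ...so by the same logic the FX-8150 would have drawn about 172 W.
implied_fx8150_cpu_w = fx8150_system_w - rest_of_system_w
print(f"Implied FX-8150 CPU draw: {implied_fx8150_cpu_w} W")
```

                      Nobody ever claimed 172 W for the FX-8150, which is the point of the "I wonder why nobody mentioned that" remark.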

                      Comment
