AMD FX-8350 "Vishera" Linux Benchmarks


  • #16
    I don't really want to get into the turd fight here, but FTA, they claim the measurement was taken on the 12 V line running from the PSU to the CPU.

    Assuming the PSU has only a single rail (which is quite common again, as the dual-rail thing was put to rest not too long ago), a motherboard typically has two power connectors:

    The ATX connector, carrying various voltages for overall motherboard usage.

    The ATX12V (I believe that is what it is called) connector, carrying nothing but GND and 12 V.

    PCI-e can ... well let me just quote wikipedia:
    PCI Express Graphics 1.0 (PEG) cards may increase power (from slot) to 75 W after configuration (3.3 V/3 A + 12 V/5.5 A).[9] PCI Express 2.1 increased the power output from an x16 slot to 150 W
    I have serious doubts that all that power is routed through the default ATX connector. I believe it is quite safe to say that the ATX connector supplies the 10 Watt and 25 Watt allowed for non-graphics cards, e.g. sound, network, etc., that tap into the PCI-e power.


    Now let's assume that there is only one graphics card installed in the system. Also assume that it is slot-powered only, so 75 Watt. Combine that with the 125 Watt TDP of the CPU and you are already at 200 Watt if you max it all out. 200 W at 12 V is roughly 16.7 A. That's a lot of juice going through that regular ATX connector.
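
    To put that arithmetic where anyone can tweak it, here is the same back-of-the-envelope sum as a tiny Python sketch (the 125 W TDP and 75 W slot budget are the assumptions from the paragraph above, not measurements):

      CPU_TDP_W = 125.0    # quoted FX-8350 TDP
      PEG_SLOT_W = 75.0    # PCI Express graphics slot power budget
      RAIL_V = 12.0        # main 12 V rail
      total_w = CPU_TDP_W + PEG_SLOT_W          # 200 W worst case
      print(f"{total_w:.0f} W / {RAIL_V:.0f} V = {total_w / RAIL_V:.1f} A")  # 200 W / 12 V = 16.7 A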

    So I think it is quite safe to assume that both graphics and CPU power are being routed through that ATX12V connector. The 3.3 V/3 A is then probably handled by the same power circuitry on the motherboard that supplies the CPU with its 3.3 V (I/O still uses 3.3 V I believe, I could be wrong). And if it doesn't, those 3 amps can still come from the ATX connector.

    Even using an integrated graphics card won't fix those results, as it's very likely the same power-rail is being tapped internally.

    Based on that assumption, I think it is quite safe to say that the article could be using the wrong testing methodology.

    P.S. Please don't compare CPUs on a clock-for-clock basis. That isn't an accurate measurement whatsoever. It never was (though it was somewhat related in the past) and it never will be (Not Ever).

    Comment


    • #17
      @oliver But they said up to 168 Watt, not all the time. So it's unclear whether this was Turbo Core, some strange spike, or even overclocking (Intel-sponsored test).

      Would be really great if anyone with the equipment and skills could re-test this.

      Comment


      • #18
        Originally posted by oliver View Post
        I don't really want to get into the turd fight here, but FTA, they claim the measurement was taken on the 12 V line running from the PSU to the CPU.
        Assuming the PSU has only a single rail (which is quite common again, as the dual-rail thing was put to rest not too long ago), a motherboard typically has two power connectors:
        The ATX connector, carrying various voltages for overall motherboard usage.
        The ATX12V (I believe that is what it is called) connector, carrying nothing but GND and 12 V.
        PCI-e can ... well let me just quote wikipedia:
        I have serious doubts that all that power is routed through the default ATX connector. I believe it is quite safe to say that the ATX connector supplies the 10 Watt and 25 Watt allowed for non-graphics cards, e.g. sound, network, etc., that tap into the PCI-e power.
        Now let's assume that there is only one graphics card installed in the system. Also assume that it is slot-powered only, so 75 Watt. Combine that with the 125 Watt TDP of the CPU and you are already at 200 Watt if you max it all out. 200 W at 12 V is roughly 16.7 A. That's a lot of juice going through that regular ATX connector.
        So I think it is quite safe to assume that both graphics and CPU power are being routed through that ATX12V connector. The 3.3 V/3 A is then probably handled by the same power circuitry on the motherboard that supplies the CPU with its 3.3 V (I/O still uses 3.3 V I believe, I could be wrong). And if it doesn't, those 3 amps can still come from the ATX connector.
        Even using an integrated graphics card won't fix those results, as it's very likely the same power-rail is being tapped internally.
        Based on that assumption, I think it is quite safe to say that the article could be using the wrong testing methodology.
        P.S. Please don't compare CPUs on a clock-for-clock basis. That isn't an accurate measurement whatsoever. It never was (though it was somewhat related in the past) and it never will be (Not Ever).
        Oh man... you're not dealing here with some website like Phoronix, you're dealing with the German Heise.de/C't

        http://de.wikipedia.org/wiki/Heise_online
        http://de.wikipedia.org/wiki/C%E2%80%99t

        These people are the most advanced experts in Germany.

        Intel lost the European antitrust lawsuit and paid 1.5 billion euros because of a statement by these heise.de/C't experts.

        So in fact you are a FOOL if you think you can mess with experts of this kind.

        Comment


        • #19
          Well, it's a slight improvement over Bulldozer, and we already knew that wasn't competitive. Even AMD's CEO acknowledges that. So, no surprise there.

          Comment


          • #20
            Originally posted by necro-lover View Post
            So in fact you are a FOOL if you think you can mess with experts of this kind.
            This is freely translated from http://www.gamestar.de/hardware/proz...3006018,3.html
            368 Watt under heavy load
            the power consumption of 368 Watt for the FX-8350 test system is lower than the 372 Watt of the FX-8150, even though the new one runs at 4.0 instead of 3.6 GHz
            So the FX-8150 uses 172 Watt? I wonder why nobody mentioned that in the past.
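
            Spelling out the arithmetic behind that rhetorical question (a rough Python sketch that assumes the rest of both test systems draws the same power; the 168 W figure is heise/c't's claim, the system figures are gamestar's):

              fx8350_system_w = 368.0   # gamestar: whole FX-8350 test system under load
              fx8150_system_w = 372.0   # gamestar: whole FX-8150 test system under load
              fx8350_cpu_w = 168.0      # heise/c't: claimed draw of the FX-8350 CPU alone
              rest_of_system_w = fx8350_system_w - fx8350_cpu_w           # about 200 W for board, GPU, etc.
              implied_fx8150_cpu_w = fx8150_system_w - rest_of_system_w   # about 172 W if the rest is identical
              print(f"implied FX-8150 CPU draw: {implied_fx8150_cpu_w:.0f} W")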

            Seriously: I'm sure there's something wrong at heise, even if it's just that they got a faulty CPU by accident.

            Comment


            • #21
              Originally posted by TAXI View Post
              This is freely translated from http://www.gamestar.de/hardware/proz...3006018,3.html
              So the FX-8150 uses 172 Watt? I wonder why nobody mentioned that in the past.
              Seriously: I'm sure there's something wrong at heise, even if it's just that they got a faulty CPU by accident.
              The problem is that only a few people have the skill and the equipment to measure this.
              Also, what you have there is a test where only the complete PC was measured, so it is complete FUD.
              A "faster" CPU can burn more power, and the result can still be lower overall power consumption for the complete PC.
              This sounds like a paradox, but it is realistic, because if you finish the job faster you can jump into a power-saving mode sooner.
              So it is possible that the FX-8150 burns less power at PEAK and the result is a higher overall waste of power, while the FX-8350 burns more power at peak but its overall power consumption is lower.

              This is a possible scenario and not an "accident"
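
              To make that race-to-idle argument concrete, here is a toy comparison in Python; every number in it is invented for illustration, not a measured value for either FX chip:

                def energy_wh(load_w, load_s, idle_w, idle_s):
                    # Watt-hours for a load phase followed by an idle phase.
                    return (load_w * load_s + idle_w * idle_s) / 3600.0

                # Same 100 s wall-clock window for both hypothetical chips.
                slow_chip = energy_wh(140, 100, 0, 0)    # lower peak, busy for the whole window
                fast_chip = energy_wh(170, 75, 30, 25)   # higher peak, done after 75 s, idles at 30 W
                print(f"slower: {slow_chip:.2f} Wh, faster: {fast_chip:.2f} Wh")  # 3.89 Wh vs 3.75 Wh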

              Comment


              • #22
                Originally posted by necro-lover View Post
                Oh man... you're not dealing here with some website like Phoronix, you're dealing with the German Heise.de/C't

                http://de.wikipedia.org/wiki/Heise_online
                http://de.wikipedia.org/wiki/C%E2%80%99t

                These people are the most advanced experts in Germany.

                Intel lost the European antitrust lawsuit and paid 1.5 billion euros because of a statement by these heise.de/C't experts.

                So in fact you are a FOOL if you think you can mess with experts of this kind.
                They didn't describe their methodology. They didn't show any graphs of their results over time. Basically, they just blurted out some random statements. I don't care if they are the Gods of the German Ueber Empire; if they BS, they BS. That said, they could have used an ISA or PCI card and not run GPGPU or 3D tests. The OS would probably be dog slow, but the wattage should be extremely low. PCI-e cards with very low power usage do exist. Again, with the information from the article, none of that is known.

                So experts or not, they wrote a shitty claim without any proof.

                Comment


                • #23
                  Originally posted by oliver View Post
                  They didn't describe their methodology. They didn't show any graphs of their results over time. Basically, they just blurted out some random statements. I don't care if they are the Gods of the German Ueber Empire; if they BS, they BS. That said, they could have used an ISA or PCI card and not run GPGPU or 3D tests. The OS would probably be dog slow, but the wattage should be extremely low. PCI-e cards with very low power usage do exist. Again, with the information from the article, none of that is known.

                  So experts or not, they wrote a shitty claim without any proof.
                  Sue them in court... but beware, heise.de/C't has never lost a single lawsuit on a technical question,
                  because they REALLY do test and prove what they write.

                  "I don't care if they are the Gods of the German Ueber Empire"

                  Yes, yes, but here in Germany we only speak one language: lawsuit/court, and AMD can try to sue them.

                  They will lose anyway.

                  Comment


                  • #24
                    The nice thing about this is that you can have BOTH an unlocked multiplier AND virtualization support.

                    Comment


                    • #25
                      Originally posted by Thev00d00 View Post
                      The nice thing about this is that you can have BOTH an unlocked multiplier AND virtualization support.
                      No, it's even more than that: you can have an unlocked CPU + virtualization + non-registered ECC RAM.

                      Comment


                      • #26
                        Thanks for the review! However, exactly as many in this thread have already mentioned, total system power consumption and power-consumption-per-task measurements are missing. Those are rather critical for desktop usage...




                        Also, TDP, CPU or system draw - these details mean nothing. Only the total system consumption [CPU, memory, motherboard] plays a role for the desktop user. It should be competitive with or match the performance.

                        This is not Linux, but they do measure max and idle power consumption. Assuming the Linux power implementation is on par with OfftopicOS, these values should apply (clickable).
                        Last edited by crazycheese; 10-23-2012, 07:46 AM.

                        Comment


                        • #27
                          Originally posted by necro-lover View Post
                          A John Doe cooling solution at home will never get any good result.

                          But hey, that's the fake world we live in.

                          From my point of view this is a ~150 watt TDP CPU and the Intel 3770K is a ~100 watt TDP CPU.
                          Uhm... The Intel chip and the motherboard to run the Intel chip are *both* more expensive..

                          It's an FX CPU.. It's for enthusiasts just like the Intel Extreme edition CPUs are for enthusiasts.. If you aren't one, you don't need to buy one.. A lot of gaming PCs on Windows have water cooling, including mine. I love the fact that AMD kept socket compatibility as I haven't had to buy a new CPU waterblock since AM2 came out 6 years ago or so.

                          It's good to see that if I get an FX chip, I can get some pretty sexy performance out of it on both Windows and Linux. Wish I could say the same about my watercooled SLI'd GTX 470s...

                          I was really planning on going Intel for my next desktop, but now I really think I won't. Although I will be ditching my nvidia graphics cards ASAP.

                          Comment


                          • #28
                            Originally posted by Thev00d00 View Post
                            The nice thing about this is that you can have BOTH an unlocked multiplier AND virtualization support.
                            Because most servers run overclocked?
                            My 2500k can do both anyway. It's missing some hardware support for I/O virtualization (I think), but it can virtualize nonetheless.

                            Comment


                            • #29
                              Originally posted by Sidicas View Post
                              Uhm... The Intel chip and the motherboard to run the Intel chip are *both* more expensive..

                              It's an FX CPU.. It's for enthusiasts just like the Intel Extreme edition CPUs are for enthusiasts.. If you aren't one, you don't need to buy one.. A lot of gaming PCs on Windows have water cooling, including mine. I love the fact that AMD kept socket compatibility as I haven't had to buy a new CPU waterblock since AM2 came out 6 years ago or so.
                              Unless you produce your own power, the energy costs will nullify any price advantage in less than 6 months if you live in Europe. If you have cheap energy, then it's not relevant.

                              The "enthusiasts" thing is complete BS. Its just a label, labels don't mean anything, raw benchmarks do. Labeling yourself "Bruce Lee 2" won't make you one.

                              Comment


                              • #30
                                Originally posted by necro-lover View Post
                                The problem is that only a few people have the skill and the equipment to measure this.
                                Also, what you have there is a test where only the complete PC was measured, so it is complete FUD.
                                A "faster" CPU can burn more power, and the result can still be lower overall power consumption for the complete PC.
                                This sounds like a paradox, but it is realistic, because if you finish the job faster you can jump into a power-saving mode sooner.
                                So it is possible that the FX-8150 burns less power at PEAK and the result is a higher overall waste of power, while the FX-8350 burns more power at peak but its overall power consumption is lower.

                                This is a possible scenario and not an "accident"
                                Thanks for explaining this, but do you think accidents (like somebody at AMD not noticing the too-high watt usage) are impossible? After all, there are humans involved and humans make mistakes, otherwise nobody would ever end up buying a faulty CPU.

                                Please also note that I just wanted to list things that could have happened; I never said that one of them must have.

                                Comment
