AMD A10-6800K Richland APU On Linux


  • #11
    Originally posted by liamdawe View Post
    As I thought, no need to "upgrade" from my 5800k yet then! Maybe the next revision will be a reason! Still 5800k gives decent performance for me!

    Liam of GamingOnLinux.com
    A 6800K is a 5800K rebrand with slightly higher frequencies and much higher price, that's all.



    • #12
      Originally posted by phoronix View Post
      Phoronix: AMD A10-6800K Richland APU On Linux

      Earlier this month AMD unveiled their Richland desktop APUs as an upgraded version of Trinity. While still based upon Piledriver CPU cores, the AMD A10-6800K APU under Linux is a modest upgrade until the arrival of the Jaguar-based APUs. For starting off our Linux testing of the A10-6800K are Ubuntu Linux benchmarks of this high-end Richland APU compared against the A10-5800K Trinity APU.

      http://www.phoronix.com/vr.php?view=18837
      " until the arrival of the Jaguar-based APUs." Ahem Jaguar based apus are for mobile platform(kabini), and not upgrade for Richland. That update is called kaveri and cpu cores are called steamroller.



      • #13
        I'm not even surprised anymore. It's a classic Phoronix benchmark: if it's an nVidia benchmark, there are no ATI cards; if it's Intel integrated graphics, it isn't compared to ATI's APUs; and if it's an AMD CPU, you can be damn sure there won't be any Intel processor thrown in for good measure.

        What's the point of this? If there is any inherent value in benchmarking, it's so that customers can evaluate performance/price/wattage ratios. Why it is so hard for Phoronix to get this right is beyond me. Pshaw!



        • #14
          Originally posted by clavko View Post
          I'm not even surprised anymore. It's a classic Phoronix benchmark: if it's an nVidia benchmark, there are no ATI cards; if it's Intel integrated graphics, it isn't compared to ATI's APUs; and if it's an AMD CPU, you can be damn sure there won't be any Intel processor thrown in for good measure.

          What's the point of this? If there is any inherent value in benchmarking, it's so that customers can evaluate performance/price/wattage ratios. Why it is so hard for Phoronix to get this right is beyond me. Pshaw!
          Ad revenue.

          You will find Intel CPU benchmarks soon, and you will be able to compare them directly... you just won't find them on the same graph (until Michael does some bigger benchmark...).



          • #15
            Originally posted by Calinou View Post
            A 6800K is a 5800K rebrand with slightly higher frequencies and much higher price, that's all.
            No, it's not; there were some die optimizations done. At the same clock speed (an overclocked 5800K), the A10-6800K draws less power. Comparing stock for stock, you get (slightly) better performance at the same power usage. Do a Google search and you can confirm this via various tests.

            That said, it's probably not worth upgrading from the A10-5800K. If you don't already own a 5800K, however, the A10-6800K is a damn solid performer.



            • #16
              Wrong graph and conclusion

              As someone wrote in the wrong thread, there is a bug in the final “performance per watt” graph: the overclocked perf per watt should be lower than the other two.
              http://phoronix.com/forums/showthrea...678#post339678



              • #17
                Yep, it seems the "performance-per-watt" code is still wrong. Otherwise, how could the overclocked chip post such good performance-per-watt values, beating the stock chip by such a large margin? It consumes 45% more power than at stock speeds for a small 12% increase in performance, so it should have much worse performance-per-watt. How can an article this wrong be published? No wonder no one takes Phoronix seriously...
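
                A quick sanity check in Octave, using only the rough deltas quoted above (normalized figures derived from the 45%/12% numbers, not the article's raw data):
                Code:
                % Normalize the stock chip to 1.0 for both performance and power.
                perf_stock  = 1.00;
                power_stock = 1.00;
                perf_oc     = 1.12;   % ~12% faster when overclocked
                power_oc    = 1.45;   % ~45% more power when overclocked

                ppw_stock = perf_stock / power_stock   % = 1.00
                ppw_oc    = perf_oc / power_oc         % ~= 0.77, clearly worse
                By that reasoning the overclocked bar should sit noticeably below the stock one, not above it.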



                • #18
                  Uhh yeah, performance per watt, what even is that? Pure energy consumption (Wh) in C-Ray:
                  Code:
                  octave:5> 160.5*33.05/3600
                  ans =  1.4735
                  octave:6> 114.4*39.92/3600
                  ans =  1.2686
                  octave:7> 112*36.89/3600
                  ans =  1.1477
                  So the A10-6800K gets the job done with the least energy consumed, next is the A10-5800K, and last is the overclocked 6800K.
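
                  The same arithmetic with named variables, assuming (my reading of the numbers above and of the ranking just stated) that the first factor is average system power in watts and the second the C-Ray run time in seconds:
                  Code:
                  % Energy (Wh) = average power (W) * run time (s) / 3600
                  energy_6800k_oc = 160.5 * 33.05 / 3600   % ~1.47 Wh, overclocked A10-6800K
                  energy_5800k    = 114.4 * 39.92 / 3600   % ~1.27 Wh, A10-5800K
                  energy_6800k    = 112.0 * 36.89 / 3600   % ~1.15 Wh, stock A10-6800K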



                  • #19
                    Originally posted by chris200x9 View Post
                    what about iGPU?
                    I take it you have never seen this site without Adblock...

                    This is Phoronix; Larabel has to ad-whore himself out with about a dozen say-nothing articles a day. If you think the masturbatory linkback soup that is the standard Phoronix news piece is bad, try it without Adblock; the links are so prevalent that the entire article is nothing but links.

                    Most of the benchmark software seems to be about as useful as that crap SuperPi. The gaming benchmarks are a total joke, since he can't even be bothered to use something like the HL2: Lost Coast benchmark, which for many years was at least a decent GPU test on Windows. That the latest Unigine benchmark runs at 6 FPS no matter the GPU doesn't say much; you need something closer to real-world expectations.

                    Originally posted by clavko View Post
                    What's the point of this? If there is any inherent value in benchmarking, it's so that customers can evaluate performance/price/wattage ratios. Why it is so hard for Phoronix to get this right is beyond me. Pshaw!
                    You do realize that once he does do a comparison, he'll use the i7-4770K as the CPU to compete against, and when he compares the GPU he'll pair it against the Iris Pro 5200, because it's so fair to expect a $150 part to beat parts in the $350 range.

                    Originally posted by Calinou View Post
                    A 6800K is a 5800K rebrand with slightly higher frequencies and much higher price, that's all.
                    Wrong. That's like saying the HD 4890 was just an overclocked HD 4870. Maybe the price difference is more than $20 in the shitty shops in your country, but around here it's $130 for the A10-5800K and $150 for the A10-6800K. $20 is a decent trade-off for a solid across-the-board 10% increase in CPU and GPU performance; call it $25-30 if you are building new and make full use of the 6800K's ability to run DDR3-2133 without having to overclock, and if you do want to overclock, you will see the GPU performance scale linearly with DDR3 speed. Though your best gaming performance is going to come from manually controlling the system's ability to downclock: you want to force the CPU to stay at full clocks while gaming.

                    http://www.cpu-world.com/CPUs/Bulldo...A10-5800K.html
                    http://www.cpu-world.com/CPUs/Bulldo...A10-6800K.html



                    • #20
                      Originally posted by benmoran View Post
                      If you don't already own a 5800K, however, the A10-6800K is a damn solid performer.
                      Exactly. If you already have the 5800K, wait for the fall and upgrade to whatever the Kaveri equivalent of the top-end APU is. They will require a new CPU socket anyway, since they are the tech the PS4 and Xbox One are based on. All the info I've seen so far says the GPU will be on par with the HD 7750, and that the CPU and GPU will both be able to directly access both the DDR3 system RAM and (new with this series) dedicated GDDR5 RAM. Hope like hell that all the Gallium3D OpenCL work gets hammered out over the summer.
                      http://www.cpu-world.com/news_2013/2...ap_leaked.html
                      http://www.xbitlabs.com/news/cpu/dis...ry_Report.html
                      http://www.extremetech.com/computing...en-cpu-and-gpu
                      http://www.pcper.com/reviews/Shows-a...de-Sea-Islands
                      http://wccftech.com/amd-launching-28...er-cores-2013/



                      The possibility of GDDR5 DIMMs to pair with Kaveri? How about some info from some reputable sources?
                      http://www.jedec.org/standards-documents/results/GDDR5M
                      http://www.hynix.com/inc/pdfDownload...hicsMemory.pdf

