AMD A10-6800K Richland APU On Linux


  • #16
    Wrong graph and conclusion

    As someone wrote in the wrong thread, there is a bug in the final “performance per watt” graph: the overclocked perf per watt should be lower than the other two.
    http://phoronix.com/forums/showthrea...678#post339678



    • #17
      Yep, it seems that the "performance-per-watt" code is still wrong. Otherwise, how could the overclocked chip have such good performance-per-watt values, beating the stock chip by such a large margin? It consumes 45% more power than at stock speeds for a small 12% increase in performance, so it should have much worse performance-per-watt. How can an article this wrong be published? No wonder no one takes Phoronix seriously...



      • #18
        Uhh yeah, performance per watt, wth is that anyway. Pure energy consumption (Wh) in C-Ray:
        Code:
        octave:5> 160.5*33.05/3600
        ans =  1.4735
        octave:6> 114.4*39.92/3600
        ans =  1.2686
        octave:7> 112*36.89/3600
        ans =  1.1477
        So the A10-6800K gets the job done with the least energy consumed, next is the A10-5800K, and last is that overclocked 6800K.
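
        For what it's worth, here is a small Octave sketch of the same point. The row-to-chip mapping (overclocked 6800K, stock 5800K, stock 6800K) is just my reading of the conclusion above, so treat it as an assumption. Since the benchmark does a fixed amount of work, performance is proportional to 1/time, so performance-per-watt is proportional to 1/(power*time), i.e. the inverse of the energy to complete the run. The chip with the highest energy use therefore cannot also have the best performance-per-watt, which is the point made in #16 and #17.
        Code:
        % Assumed row order (not stated explicitly above): OC 6800K, stock 5800K, stock 6800K
        watts   = [160.5; 114.4; 112.0];      % average power draw during the run (W)
        seconds = [33.05; 39.92; 36.89];      % benchmark run time (s)
        energy_Wh = watts .* seconds / 3600   % energy to finish each run (Wh)
        rel_ppw   = 1 ./ (watts .* seconds)   % relative performance-per-watt = 1/energy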



        • #19
          Originally posted by chris200x9 View Post
          what about iGPU?
          I take it you have never seen this site without Adblock...

          This is Phoronix, Larabel has to ad-whore himself out with about a dozen say-nothing articles a day. If you think the masturbatory linkback soup that is the standard Phoronix news piece is bad, try it without Adblock; the links are so prevalent that the entire article is nothing but links.

          Most of the benchmark software seems to be about as useful as that crap SuperPI. The gaming benchmarks are a total joke, since he can't even be bothered to use something like the HL2: Lost Coast benchmark, which at least was a decent GPU test on Windows for many years. That the latest Unigine benchmark runs at 6 FPS no matter the GPU doesn't say much; you need something closer to real-world expectations.

          Originally posted by clavko View Post
          What's the point of this? If there is any inherent value in benchmarking
          it's so that customers can evaluate performance/price/wattage ratios.
          Why it is so hard for Phoronix to get this right is beyond me. Pshaw!
          You do realize that once he does do a comparison, he'll use the i7-4770K as the CPU to compete against, and when he compares the GPU he'll pair it against the Iris Pro 5200, because it's so fair to expect a $150 part to beat parts in the $350 range.

          Originally posted by Calinou View Post
          A 6800K is a 5800K rebrand with slightly higher frequencies and much higher price, that's all.
          Wrong. That's like saying the HD 4890 was just an overclocked HD 4870. Maybe the prices are more than $20 apart in the shitty shops in your country, but around here it's $130 for the A10-5800K and $150 for the A10-6800K. $20 is a decent tradeoff for a solid across-the-board 10% increase in CPU and GPU performance, $25-30 if you are building new and make full use of the 6800K's ability to run DDR3-2133 without having to overclock; and if you do want to overclock, you will see the GPU performance scale linearly with DDR3 speed. Though your best gaming performance is going to come from manually controlling the system's ability to downclock: you want to force the CPU to stay on full blast while gaming.

          http://www.cpu-world.com/CPUs/Bulldo...A10-5800K.html
          http://www.cpu-world.com/CPUs/Bulldo...A10-6800K.html



          • #20
            Originally posted by benmoran View Post
            If you don't already own a 5800k however, the a10-6800k is a damn solid performer.
            Exactly. If you already have the 5800K, wait for the fall and upgrade to whatever the Kaveri equivalent of the top-end APU is. They will require a new CPU socket anyway, since they are the tech the PS4 and Xbox One are based on. All the info I've seen so far says the GPU will be on par with the HD 7750, and that the CPU and GPU will both be able to directly access both the DDR3 system RAM and, new with this series, dedicated GDDR5 RAM. Hope like hell that all the Gallium3D OpenCL work gets hammered out over the summer.
            http://www.cpu-world.com/news_2013/2...ap_leaked.html
            http://www.xbitlabs.com/news/cpu/dis...ry_Report.html
            http://www.extremetech.com/computing...en-cpu-and-gpu
            http://www.pcper.com/reviews/Shows-a...de-Sea-Islands
            http://wccftech.com/amd-launching-28...er-cores-2013/



            The possibility of GDDR5 DIMMs to pair with Kaveri? How's some info from some reputable sources?
            http://www.jedec.org/standards-documents/results/GDDR5M
            http://www.hynix.com/inc/pdfDownload...hicsMemory.pdf



            • #21
              Let me be the cynical bastard and say that the reason the iGPU wasn't tested is that Michael is willing to give AMD time to fix their fglrx to make it work with APUs?

              I bought an A10-6800K last week and it helped cement my overall impression of AMD's products: superb hardware, shit drivers. Unfortunately, someone needs to tell the wonderful VHDL-wielding people at AMD that hardware is nothing without the software.

              Last year I bought a Trinity-based laptop and I still can't make it work even OK-ish in anything but straw-man Phoronix 3D tests with 2000-era engines. I have to disable the crossfired dGPU in the notebook to make it work OK, and even then fglrx still has a lot of problems running anything more advanced than a community-modded Quake 3 engine.

              Similarly, I borrowed an HD 7750 for the desktop A10-6800K, disabled the iGPU, and it seems to work OK in most Steam games (you know, *REAL* tests with high-performance, modern game engines). Remove the dGPU, enable the iGPU in the 6800K, clean-reinstall fglrx and... kaboom! Utter shit! Corruption in TF2, Portal, The Cave, Oil Rush... you name it. Yeah, it works with some primitive OpenGL engine from the Phoronix Test Suite, but then I could have bought a fifth-hand Pentium 4 with an AGP graphics card for those kinds of 3D workloads.

              Also, I removed fglrx, disabled the iGPU, slapped in a GeForce 450, installed the Nvidia blob and everything 'just worked'. Every single game from Steam worked flawlessly, and Unity builds for Linux worked flawlessly.

              For Unity fans: make a moderately complicated scene, with FX, and export it to Linux and Mac. Run it on Linux with AMD fglrx and an APU, and then on an iMac with a 6xxx AMD card. Linux fglrx sucks big salty balls; the Mac driver works flawlessly.
              So much for tin-foil hat conspiracies about Nvidia-optimized games... Clearly AMD makes more profit from game coupon bribes for their cards (if they are so competitive against Nvidia (or Intel??), why bother with the game coupons?) and from ferrying game 'journalists' to luxurious exotic islands for (Linux-incompatible) product launches. Intel might have robbed them of their market share with bribes and the like from the $1B fine scandal, but, man, AMD has some shitty software developers over there... Lemme guess, Linux drivers are 'managed' by the Germany team, but the development is subcontracted to the Ukrainians at Luxoft?

              This was my last AMD product purchase. I can't believe that a company this size can't get its shit together and make drivers for their own products. Night after night, after 10-12 hours at work, I waste time trying to make it work while their competitors' products work 99% out of the box.

              Rant over.



              • #22
                Originally posted by Agross View Post
                ..........Rant over.
                It might be too late as you've already got the 450, but before you give up on that APU you should REALLY try out the latest open source drivers. I'm running an A10-5800K myself and suffered with fglrx for a while, so I know what you're going through. The Free driver absolutely destroys fglrx in every way possible and will run most games at a higher frame rate. The games that technically have a lower framerate will still perform better, since there is no more stuttering or random fluctuation in fps. Valve games especially run great. A bit of warning: the Buddha engine Double Fine uses in The Cave and Brutal Legend currently freezes due to a Mesa bug, but overall more games run fine compared to fglrx.

                I'm running Arch with the 3.11 kernel and Mesa 9.2, basically just a standard install fully up to date. You will have to enable dpm with the kernel parameter and enable the SB shader optimization; that's basically it. The APUs are easy since they require no microcode, and they perform great with R600. With the next kernel release, those two steps will probably not even be necessary anymore.
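
                In case it helps, this is roughly what those two steps look like on a setup like mine; the exact names (the radeon.dpm boot parameter and the R600_DEBUG environment variable) are from my memory of the 3.11/Mesa 9.2 radeon stack, so double-check them against your distro's wiki.
                Code:
                # Kernel command line (e.g. added to the bootloader entry): enable radeon dynamic power management
                radeon.dpm=1

                # Environment for the session that launches the game: enable Mesa's r600g SB shader optimizer
                export R600_DEBUG=sb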



                • #23
                  I don't intend to give up on the APU; I only play games 2-4 hours at the end of the week, and that's if I have the time.

                  I'm just very disappointed... I can't believe they never tested fglrx releases on APUs. There is no other explanation for why a Radeon dGPU works but the iGPU doesn't with the same driver.



                  • #24
                    I wonder if it's also attributable to architectural differences? Fglrx really is awful on the APUs.

                    Either way, it doesn't matter because the Free driver is awesome now. Even VDPAU works great - much better than XVBA did on fglrx.

