
AMD Radeon R9 285 Tonga Performance On Linux


  • #11
    wait WHAT? Michael, do you have Nvidia's hand up your ass or something? I could barely read past page 5 because you make it so obvious you prefer Nvidia over AMD it's not even funny.

    In the GPUTest synthetic results, the R9 285 was BETTER THAN THE GTX 770 in two of the tests, that's better than the card ABOVE the one it's competing against, and it only "failed" against its competitor in one of the other two... yet you wrote this:
    The GpuTest synthetic benchmark results were mixed but the R9 285 was roughly holding its ground against the GTX 760 on Ubuntu Linux.

    "roughly holding its ground". WHAT? That's such bullshit. I think I'm going to re-enable adblock until you learn how NOT to choose sides when doing benchmarks, which are supposed to be objective.

    P.S. It completely destroyed its competition in temperature and power consumption in most (I think it only lost in one?) tests, and all you have to say is "it could be different for you"? Jesus Christ, man.



    • #12
      Originally posted by Daktyl198 View Post
      Yeah, that's why I look at the numbers first, and then read the commentary for amusement.
      Gotta take everything with a grain of salt ...



      • #13
        Originally posted by Daktyl198 View Post
        wait WHAT? Michael, do you have Nvidia's hand up your ass or something? I could barely read past page 5 because you make it so obvious you prefer Nvidia over AMD it's not even funny.
        I hope not, but it's not unlike Michael to tone down AMD's successes while cheering NVIDIA's. Never mind botching first-impression benchmarks on AMD's product launches and correcting them later, when the damage is already done.

        Nah, it's more likely that I'm biased and all of these are just coincidences.



        • #14
          Questionable benchmarking?

          I want to voice a concern.
          All these binary drivers are known to come with a lot of "optimizations", i.e. cheats, for common benchmarks. They detect a benchmark running and then tweak and turn the screws just to post better numbers.
          So these figures might be a fair first impression, but I wouldn't read too much into them.

          On another note: is there really a difference between 250 and 300 fps? If OpenCL calculations finish two minutes earlier, that can matter once it adds up to hours over many runs. But fps at those rates?
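          To put the 250 vs 300 fps question in concrete terms, the same gap can be expressed as frame time (a quick sketch; the numbers are just the ones from the post):

```python
def frame_time_ms(fps):
    """Milliseconds spent rendering one frame at a given frame rate."""
    return 1000.0 / fps

# 250 fps -> 4.00 ms per frame; 300 fps -> ~3.33 ms per frame
gap_ms = frame_time_ms(250) - frame_time_ms(300)  # ~0.67 ms per frame
```

          A 50 fps difference at that level is well under a millisecond per frame, far below anything a 60 Hz display can show.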
          Stop TCPA, stupid software patents and corrupt politicians!



          • #15
            Originally posted by CrystalGamma View Post
            Is that so? Then I'll need to decide whether I'll buy an all-out DDR4 system with Carrizo or just a graphics card upgrade to Tonga (and then later buy an efficient DDR4 CPU/motherboard) once AMDGPU comes out.
            Depends on the power efficiency of those Excavator Cores (I'm not getting my hopes up).

            CARRIZO iGPU and Memory:

            CARRIZO will use DDR3 at 2133 (2400) MHz, and the maximum number of shaders will still be 512, like in Kaveri.

            That doesn't mean there won't be a significant improvement in games... there will be.

            Like I said, that significant improvement will come from increasing effective bandwidth using the same scheme as the R9 285 dGPU: new lossless delta color compression that will "increase" memory bandwidth in practice by up to 40%.
            In practice this will be better than using DDR4 at 2700+ MHz.
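            Back-of-the-envelope arithmetic behind that comparison (a sketch; the dual-channel, 128-bit bus width is my assumption based on Kaveri, not something stated in the post):

```python
BUS_WIDTH_BYTES = 16  # assumption: dual-channel 128-bit bus, as on Kaveri

def bandwidth_gbs(mt_per_s, compression_factor=1.0):
    """Peak bandwidth in GB/s, optionally scaled by an effective
    compression factor (e.g. 1.4 for 'up to 40%' delta compression)."""
    return mt_per_s * BUS_WIDTH_BYTES * compression_factor / 1000.0

ddr3_2133_effective = bandwidth_gbs(2133, 1.40)  # ~47.8 GB/s with compression
ddr4_2700_raw = bandwidth_gbs(2700)              # ~43.2 GB/s uncompressed
```

            Under those assumptions, DDR3-2133 plus 40% compression does edge out raw DDR4-2700, which is the post's point; real gains from delta compression vary by workload.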

            IMHO, I consider this good news overall as far as CARRIZO iGPU performance goes.

            The iGPU will also be GCN 1.2, DX11.2-, DX12- and OpenGL 4.3-compatible, and will support 4K video playback at 30 FPS, etc.



            CARRIZO CPU:
            Excavator cores per se won't increase CPU performance, because all the gains will be absorbed by lower frequencies to reach 65W in the quad-core models.

            This is actually good news.
            For everyday use a dual core is enough, so a quad core clocked slightly lower than Kaveri will work just fine in regular applications (it will also be adequate for games, because games benefit above all from the better iGPU, which it will have, like I said above).

            There is, however, MUCH better news as far as CARRIZO CPUs go:

            The production process will move away from the poor process used in Kaveri: it will still be 28nm, but on GF28A BULK silicon (basically the same kind of silicon used in Trinity/Richland, but at 28nm, as opposed to Kaveri, which used the same silicon as dGPUs).

            The advantage of this type of silicon is that it is MUCH more friendly to overclocking (as opposed to the node type used in Kaveri/dGPUs, which is NOT very OC-friendly), among other things.

            So, even if they cap stock frequencies a bit lower, it will be easier to overclock with air cooling.

            Note that, at the same frequencies, AMD reduced *power leakage* by up to 30% (I don't know about overall power draw, but if the TDP went from 95W to 65W... do the math)
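            The "do the math" above works out as follows (a quick check, using only the TDP figures from the post):

```python
tdp_kaveri_w = 95   # quad-core Kaveri TDP, per the post
tdp_carrizo_w = 65  # quad-core Carrizo target TDP, per the post

# Relative TDP reduction, as a percentage
reduction_pct = 100 * (tdp_kaveri_w - tdp_carrizo_w) / tdp_kaveri_w  # ~31.6%
```

            Roughly a 32% TDP cut, in the same ballpark as the quoted "up to 30%" leakage reduction, though TDP and leakage are of course not the same quantity.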



            • #16
              Originally posted by przemoli View Post
              LOL, Witcher 2 has BAD graphics?

              Man, did you SEE the game?
              Anyway, Witcher 3 is just around the corner (Q1 2015)...
              I was suggesting games that had better graphics than CS: Source and Xonotic and that were available on Linux. Nowhere did I say Witcher 2 had bad graphics. In fact I'd say Witcher 2 is probably in the top 5 most beautiful games out there right now, which is impressive considering it's DX9.
              Last edited by PublicNuisance; 15 October 2014, 01:31 PM. Reason: spelling



              • #17
                The big deal about R9 285 won't be outright performance increase. The big improvement will be reduced TDP and memory bandwidth requirements, which will make the new APUs not require $300 RAM and 100+W to beat Intel's latest integrated graphics.

                Oh, what cooler was the GTX 760 using? Reference? The R9 285 tested wasn't.



                • #18
                  I'm considering this card but it seems to be a hard sell compared to the 270X and the 280X.

                  I want the 285 because it is the newest architecture, and therefore should have better compatibility with OpenGL Next and DX12, and it has a more efficient design (particularly in memory bandwidth). But the problem with the 285 is it only has 2GB of VRAM. Another concern is that Tonga is GCN 1.2 - the R9 300 series might move to GCN 1.3 or 2.0, which would leave Tonga as a one-of-a-kind GPU, and if that happens it is liable to be "forgotten" by AMD. In other words, it might end up with poor support.

                  The 270X is appealing because there are 3GB and 4GB models, it is a bit cheaper than the 285, and at least in Linux tests it is almost on par with the 285. However, IF GCN 1.2 is supported in the R9 300 series, the 285 will (in time) get enough optimizations to widen the gap over the 270X. I don't want the 270X because it uses GCN 1.0, which might have issues with OpenGL Next or DX12.

                  The 280X is appealing because it is overall better than the 285 at roughly the same price point. But it too uses GCN 1.0, and it has a pretty high failure rate, particularly the factory-overclocked models.


                  I would be interested in getting a Maxwell GPU if they weren't so expensive. Any recommendations?



                  • #19
                    750 Ti?
