AMD APU On Linux: Gallium3D Can Be 80%+ As Fast As Catalyst


  • #31
    Originally posted by Pawlerson View Post
    Is this true with nvidia?
    Sorry, I don't know. Should've said "all AMD Linux drivers".



    • #32
      Originally posted by oibaf View Post
      Nice article, thanks!

      ... and before anyone asks about LLVM 3.4 (the Mesa snapshot c4cf48 used in the test was built against LLVM 3.3), see the discussion here: http://phoronix.com/forums/showthrea...-Upgrades-Easy
      So, wouldn't it be even faster with gcc? (llvm compiles faster, but gcc produces faster code)



      • #33
        Originally posted by Pawlerson View Post
        Is this true with nvidia?
        I think it is too, but it's not like you'll be below 60 FPS with most modern graphics cards. It is more CPU than GPU dependent.



        • #34
          Originally posted by ua=42 View Post
          Ah yes, the i7-4950HQ, which costs $750, is slightly faster than the AMD A10, which costs $99. So yeah. If money is no object, you can get an Intel system that can beat AMD's APU graphically, but if you are on a budget, you can get something that is almost as fast for $650 less.
          For computational workloads (including OpenCL) the Intel was much faster, not just a little faster. Graphically it was only a little faster, IIRC.
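
The value argument here is simple arithmetic; a throwaway Python sketch of it (the prices are the ones quoted in the post, the frame rates are made-up placeholders, not benchmark results):

```python
# Perf-per-dollar sketch. The prices are the ones quoted in the post;
# the FPS figures are hypothetical placeholders, NOT benchmark results.
parts = {
    "Intel i7-4950HQ (Iris Pro)": {"price": 750, "fps": 60},  # hypothetical
    "AMD A10 APU":                {"price": 99,  "fps": 50},  # hypothetical
}

for name, p in parts.items():
    print(f"{name}: {p['fps'] / p['price']:.3f} FPS per dollar")
```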



          • #35
            Originally posted by carewolf View Post
            So, wouldn't it be even faster with gcc? (llvm compiles faster, but gcc produces faster code)
            Mesa links against the LLVM libraries because some drivers (llvmpipe and radeonsi, for example) use LLVM at runtime to compile shaders, but the Mesa code itself is compiled with GCC.
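
One quick way to see that LLVM is a runtime component of the driver rather than the compiler Mesa was built with: the driver reports its bundled LLVM version in the renderer string. A minimal sketch, assuming the glxinfo utility (from mesa-utils) is installed:

```python
# Sketch: confirm that LLVM is a runtime component of the Mesa driver
# (not the compiler Mesa was built with) by reading the renderer string.
# Assumes the glxinfo utility (mesa-utils package) is installed.
import subprocess

out = subprocess.check_output(["glxinfo"], text=True)
for line in out.splitlines():
    # llvmpipe, for example, reports something like:
    #   OpenGL renderer string: Gallium 0.4 on llvmpipe (LLVM 3.3, 256 bits)
    if line.startswith(("OpenGL renderer string", "OpenGL version string")):
        print(line)
```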



            • #36
              Originally posted by Calinou View Post
              I think it is too, but it's not like you'll be below 60 FPS with most modern graphics cards. It is more CPU than GPU dependent.
              Yeah, you're right. However, I experience strange stuttering with the Nvidia drivers in Valve games. I had a similar experience with Catalyst, but never with the open-source Radeon drivers.



              • #37
                Originally posted by liam View Post
                For computational workloads (including OpenCL) the Intel was much faster, not just a little faster. Graphically it was only a little faster, IIRC.
                Ah. I thought we were talking about graphics, not OpenCL. Two very different use cases.



                • #38
                  Originally posted by ua=42 View Post
                  I understand wanting longer battery life. But for my part, I bought an A4 laptop and spent an extra $100 on an extended battery, and when I'm browsing the net or typing I get 8 hours of run time. If I'm playing 3D games I get 4 hours.

                  I'm kind of curious how many more hours the Intel gets for the $$$$ I saved.
                  Not much; most of the energy consumed in a laptop actually goes to the display.



                  • #39
                    Originally posted by mmstick View Post
                    Not much; most of the energy consumed in a laptop actually goes to the display.
                    Actually, a lot. I have a ThinkPad T420 with Optimus. It gets 7 hours on battery in Windows and about 2 hours in Linux by default. The main difference is that the discrete GPU is not throttled down in Linux. If I force the GPU off I get 5 hours on battery in Linux. So disabling the Nvidia GPU (which wasn't even being used) takes the battery life from 2 hours to 5. It is a MAJOR drain and makes a bigger impact than turning off the screen.
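
For what it's worth, the numbers are plausible: assuming a roughly 57 Wh T420 battery (an assumption, not from the post), 7 hours works out to about 8 W average draw and 2 hours to about 28 W, so an idle dGPU can easily account for the gap. The "force the GPU off" step matches the kernel's vga_switcheroo debugfs interface; a minimal sketch, assuming debugfs is mounted and running as root:

```python
# Sketch: power off the inactive (discrete) GPU via the kernel's
# vga_switcheroo interface. Needs root, a kernel with vga_switcheroo,
# and debugfs mounted at /sys/kernel/debug.
SWITCH = "/sys/kernel/debug/vgaswitcheroo/switch"

with open(SWITCH, "w") as f:
    f.write("OFF\n")  # powers down whichever GPU is not driving the display

with open(SWITCH) as f:
    print(f.read())   # status lines, e.g. "0:IGD:+:Pwr:..." "1:DIS: :Off:..."
```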



                    • #40
                      Originally posted by liam View Post
                      For computational workloads (including OpenCL) the Intel was much faster, not just a little faster. Graphically it was only a little faster, IIRC.
                      If you want to drop $750 on a CPU, that is. Most won't.



                      • #41
                        Originally posted by mmstick View Post
                        Not much; most of the energy consumed in a laptop actually goes to the display.
                        That's not necessarily true, especially when idling (but with the screen on). CPUs have gotten so much better at reaching lower and lower energy states. Looking forward to the, hopefully not too far off, day when CPUs can reliably run at near-threshold voltages.
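
The kernel exposes per-state idle residency through sysfs, so you can actually watch how deep and how long a CPU sleeps; a minimal sketch, assuming the standard cpuidle interface is present:

```python
# Sketch: print cumulative idle-state residency for CPU 0 using the
# kernel's cpuidle sysfs interface ('time' is in microseconds).
import glob
import os

for state in sorted(glob.glob("/sys/devices/system/cpu/cpu0/cpuidle/state*")):
    with open(os.path.join(state, "name")) as f:
        name = f.read().strip()
    with open(os.path.join(state, "time")) as f:
        usec = int(f.read())
    print(f"{name:>10}: {usec / 1e6:10.2f} s total residency")
```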



                        • #42
                          Originally posted by Sonadow View Post
                          Off-topic: how the heck Intel suddenly caught up so quickly in the graphics department still remains a mystery.
                          It wasn't quick at all. They've been trying for 5 years now, and still haven't really caught up; they're just in the competitive neighborhood now.



                          • #43
                            Originally posted by smitty3268 View Post
                            It wasn't quick at all. They've been trying for 5 years now, and still haven't really caught up; they're just in the competitive neighborhood now.
                            I wouldn't really call "we brute-forced the problem so that we can finally claim to beat AMD, using extremely high-binned, limited-run chips designed solely for that purpose, while our mainstream GPU is still far behind", and basically failing even at that, being "in the competitive neighborhood".



                            • #44
                              Originally posted by Luke_Wolf View Post
                              I wouldn't really call "we brute-forced the problem so that we can finally claim to beat AMD, using extremely high-binned, limited-run chips designed solely for that purpose, while our mainstream GPU is still far behind", and basically failing even at that, being "in the competitive neighborhood".
                              Using eDRAM is probably a fairly 'balanced' thing for Intel, since they otherwise have to rely solely on DDR3 system RAM, while AMD's cards have 1-2 GB of GDDR5 to play with. And while Intel's "mainstream" GPU isn't the greatest (we'll see how things are with Broadwell vs Kaveri; Broadwell is supposed to do for the GPU what Haswell did for power efficiency), it is definitely very usable for the average user, even a light-gaming user or a movie-watcher. Hell, I'm using Sandy Bridge to run 1080p Flash video and it's perfectly fine. I would LOVE to have Broadwell or Skylake in my next laptop...
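
The eDRAM-vs-GDDR5 point is mostly about bandwidth; a back-of-the-envelope sketch (the clock and bus-width figures are ballpark illustrations, not exact specs for any particular part):

```python
# Back-of-the-envelope memory bandwidth. All figures are illustrative
# assumptions, not exact specs for any particular part.
def bandwidth_gb_s(mega_transfers_per_s: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s for a given transfer rate and bus width."""
    return mega_transfers_per_s * 1e6 * (bus_bits / 8) / 1e9

# Dual-channel DDR3-1600 that an Intel iGPU shares with the CPU:
print(f"DDR3-1600, 128-bit:   {bandwidth_gb_s(1600, 128):5.1f} GB/s")
# GDDR5 at 5 GT/s on a 128-bit bus, typical of low/mid discrete cards:
print(f"GDDR5 5GT/s, 128-bit: {bandwidth_gb_s(5000, 128):5.1f} GB/s")
```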



                              • #45
                                Originally posted by Luke_Wolf View Post
                                I wouldn't really call "we brute-forced the problem so that we can finally claim to beat AMD, using extremely high-binned, limited-run chips designed solely for that purpose, while our mainstream GPU is still far behind", and basically failing even at that, being "in the competitive neighborhood".
                                I disagree. "Brute forcing" the problem is exactly what they needed to do. It's what their competitors have done: if you buy an AMD or Nvidia part, you're buying billions of transistors. Intel always tried to go cheap, and they were never going to get better performance until they spent the necessary die space.

                                And their 4600 GPU is still in the neighborhood of being competitive with the extreme low-end parts it is up against. No gamer would ever want it, but someone who plays a bit of The Sims occasionally would be OK. We'll see where they go from here, but over the last 5 years they've gone from embarrassing, to just terrible, to very bad, to bad, to nearly not bad. Hopefully that trajectory continues, as I'd like to see 3 competitors instead of just 2.

