AMD Raven Ridge Graphics On Linux vs. Lower-End NVIDIA / AMD GPUs


  • #31
    Originally posted by andre30correia View Post
    The big issue here is why you should buy this CPU if you already have one, since AMD will launch a much faster version in less than a year on 7nm.
    ???

    If we were launching something like that so soon I think we (people working at AMD) would have heard about it by now.
    Test signature

    Comment


    • #32
      Michael, could you push TTM like this down to a 64 MB VRAM buffer size, to see what happens?

      https://www.techspot.com/article/157...ory-explainer/

      RAM sticks are that expensive nowadays; people will do everything to save something
      Last edited by dungeon; 17 February 2018, 04:29 AM.

      Comment


      • #33
        Originally posted by Michael View Post

        For my purposes of just office/coding/web tasks, Intel graphics are technically fast enough... AMD is vulnerable to Spectre, so you probably mean Meltdown, but as shown in the Raven tests this week, even Intel CPUs mitigated for both are still faster than Raven.
        I doubt that; AMD's mobile APUs performed relatively better than their desktop counterparts, since the efficiency knee seems to sit within the 25-45 W range. Beyond that, I haven't seen a 1:1 mobile comparison of similar notebooks for Intel and AMD.

        On the other hand, why did you test some GCN GPUs with amdgpu 1.4.0 and others with modesetting 1.19.5? For consistency, shouldn't they all use the same driver?

        Comment


        • #34
          Originally posted by Qaridarium

          Don't you think that a system from 8 January 2009 needs an upgrade ~9 years later?
          Why would it? The x86-64 psABI is well defined and hasn't added any new REQUIRED instructions for any CPU since it was first standardised. That means SSE2 is required, but SSSE3 is OPTIONAL. Any code which assumes SSSE3 is non-conformant and non-portable; there is no guarantee it will work on any other CPU. That's why feature detection is a thing, along with function multi-versioning.

          Unless you're running Gentoo (where binaries are typically local and non-distributed, or the CPU architecture is well defined, as with ChromeOS), there's no reason why any x86-64 code won't run on any x86-64 CPU, past, present or future.
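
          As a minimal C sketch of that feature-detection approach, using the GCC/Clang CPU built-ins (the function names here are made up for the example):

              #include <stdio.h>

              /* Fallback that sticks to the x86-64 psABI baseline (SSE2 is guaranteed). */
              static void process_generic(void)
              {
                  puts("using the SSE2 baseline path");
              }

              /* Faster path, only valid on CPUs that actually report SSSE3. */
              static void process_ssse3(void)
              {
                  puts("using the SSSE3 path");
              }

              int main(void)
              {
                  __builtin_cpu_init(); /* populate the runtime CPU feature flags */
                  if (__builtin_cpu_supports("ssse3"))
                      process_ssse3();   /* Core 2 / Bulldozer and later */
                  else
                      process_generic(); /* a Phenom II lands here and still works */
                  return 0;
              }

          GCC's function multi-versioning automates the same dispatch, e.g. __attribute__((target_clones("default,ssse3"))) on a single function definition.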

          Phenom IIs are fast enough, and with undervolting efficient enough, not to need replacing for most uses. Sure, one would probably be the bottleneck when gaming with the latest high-end GPUs, but even then I suspect an optimised Vulkan graphics engine taking advantage of all the CPU cores would work well enough.

          Even industry- or military-grade computers have a maximum lifetime of 10 years.
          That's nonsense. As an extreme example, the U.S. nuclear weapon command and control systems are only being replaced now; they've been running IBM Series/1 minicomputers since the mid-1970s. https://en.wikipedia.org/wiki/IBM_Series/1

          Comment


          • #35
            Originally posted by _ONH_ View Post
            On the other hand, why did you test some GCN GPUs with amdgpu 1.4.0 and others with modesetting 1.19.5? For consistency, shouldn't they all use the same driver?
            I was simply using what was selected by default.
            Michael Larabel
            https://www.michaellarabel.com/

            Comment


            • #36
              Originally posted by s_j_newbury View Post
              Any code which assumes SSSE3 is non-conformant and non-portable.
              Don't expect games to be portable at all.

              Devs will just optimize for the most popular hardware (i.e. Intel and NVIDIA) and be done with it, because margins are low. The patch fest to fix Windows games under Ryzen only happened because the bias was too obvious, and there was some ugliness as well.

              This is why AMD has to fight with more cores/threads: nobody cares about microarchitectural differences. Having more cores usually does the trick, but not always. Ryzen does better than the FX because it isn't as dependent on software optimizations, but that doesn't mean they aren't needed. Still, this is the message people get.

              Comment


              • #37
                Originally posted by dungeon View Post
                Michael, could you push TTM like this down to a 64 MB VRAM buffer size, to see what happens?
                We did some testing a year or so ago as part of an effort to convince our business folks to stop promoting APUs paired with weak dGPUs, and at the time we did still find that a larger carveout improved performance. My takeaway was that the main reason people thought a small dGPU was faster than APU graphics was that the default carveout on APUs was so low (32-80MB per Microsoft requirement), and that configuring a system with the APU only was preferable as long as some or all of the power and thermal budget previously given to the dGPU could be given to the APU instead.

                That said, the linked article focused on games whose VRAM requirements were larger than the largest carveout option (2-3GB) and so missed the "everything fits in emulated VRAM" scenario which is still pretty common for games.

                My current (albeit unconfirmed) understanding is that Raven should not show much performance difference between emulated VRAM and system memory (earlier APUs were more like 2:1, with Carrizo somewhere in between) and so "automatic migration to VRAM where possible" could probably be disabled completely, but I don't know how much of this is reflected in current drivers. There has been some ongoing work related to migration but my impression was that it was more related to dGPU than APU.

                So definitely would be interesting to see. I believe Raven is the first APU where emulated VRAM and system memory really could have the same performance, since all of the accesses go through the same data paths (the common data fabric) anyways.
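
                For anyone who wants to check the carveout size on their own system, a small sketch against libdrm's amdgpu query interface would look roughly like this (the render node path is an assumption for a single-GPU box; build against pkg-config libdrm_amdgpu):

                    #include <fcntl.h>
                    #include <stdint.h>
                    #include <stdio.h>
                    #include <unistd.h>
                    #include <amdgpu.h>
                    #include <amdgpu_drm.h>

                    int main(void)
                    {
                        /* First render node; may differ on multi-GPU systems. */
                        int fd = open("/dev/dri/renderD128", O_RDWR);
                        if (fd < 0) { perror("open"); return 1; }

                        uint32_t major, minor;
                        amdgpu_device_handle dev;
                        if (amdgpu_device_initialize(fd, &major, &minor, &dev)) {
                            fprintf(stderr, "amdgpu_device_initialize failed\n");
                            return 1;
                        }

                        /* On an APU, "VRAM" here is the carveout (emulated VRAM);
                         * GTT is regular system memory mapped for the GPU. */
                        struct drm_amdgpu_info_vram_gtt vram_gtt;
                        if (amdgpu_query_info(dev, AMDGPU_INFO_VRAM_GTT,
                                              sizeof(vram_gtt), &vram_gtt) == 0) {
                            printf("VRAM: %llu MB\n",
                                   (unsigned long long)(vram_gtt.vram_size >> 20));
                            printf("GTT:  %llu MB\n",
                                   (unsigned long long)(vram_gtt.gtt_size >> 20));
                        }

                        amdgpu_device_deinitialize(dev);
                        close(fd);
                        return 0;
                    }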
                Last edited by bridgman; 17 February 2018, 11:49 AM.
                Test signature

                Comment


                • #38
                  Originally posted by NateHubbard View Post

                  That your almost decade-old CPU is missing an instruction isn't surprising. That you're trying to game with it is, though.
                  But like you said, you're overdue for an upgrade; just don't expect your new CPU to still run everything in the year 2027.
                  Do you know how long AMD was stuck at Phenom II performance, or with only incremental gains?
                  Hint: look at the A12-9800 above your post.

                  Comment


                  • #39
                    Originally posted by grok View Post

                    Do you know how long AMD was stuck at Phenom II performance, or with only incremental gains?
                    Hint: look at the A12-9800 above your post.
                    Phenom II would be at least 30% slower. Stop talking out of your ass.

                    Comment


                    • #40
                      Originally posted by edwaleni View Post

                      If AMD dies, you would see a big movement to ARM among those avoiding Intel (VIA won't/can't respond). Qualcomm would probably step into the vacuum. AMD isn't going to die soon, though; I would guess they will get swallowed by a much larger fish before they cease to exist.

                      Siemens or Qualcomm would be the most likely entities for them to align with.

                      My RR arrived yesterday, but the MSI board is in FedEx/USPS limbo at the moment. The DDR4 arrives today.

                      Honestly, it appears that a few more rounds of kernel/driver updates are still needed before RR is considered optimal for Linux in general. I won't be pushing Linux onto this RR right now, but will see how it does when things settle down.
                      Well, the RR project will have to wait a few more days. Amazon Prime sent me a butterfly phone case in one day instead of the DDR4 I ordered. Talk about a foobar. Back to NewEgg now.

                      DDR4 prices have really gone up.

                      Comment
