AMD Radeon VII Linux Benchmarks - Powerful Open-Source Graphics For Compute & Gaming


  • #61
    Originally posted by TheYoshiGuy View Post
    For the emulator-specific issues, they won't fix anything (see bridgman's comment here).
    That's not actually what I said - I said we would probably need to stay away from debugging emulator code. If someone else investigates what is happening at emulator level then comes back and says "the driver is doing X when I expect it to be doing Y" (eg with a non-emulator test case) I'm sure we would fix that.

    I doubt our competitors are doing any more than that - it's just a matter of what GPU the emulator and WINE developers happened to be using during development.
    Last edited by bridgman; 08 February 2019, 11:24 AM.


    • #62
      Hi Bridgman,

      Could you possibly let me know if there are any enhancements to the Radeon VII for virtualisation over the Vega cards? I know the MI50 has MxGPU, and I would presume that it has fixed the ability to re-assign the card between VMs in PCI passthrough without a cold reboot, which a number of AMD cards haven't handled well. Is the VII the same?

      At the moment I'm running (well, starting up again) a Xen system with a couple of Nvidia cards (a GTX 480 soft-modded to a Quadro 6000, and a 780 Ti to be hard-modded to a Quadro K6000). I have mixed feelings about the VII: I'm impressed by your day-one open source drivers, by the clearly substantial improvements to power/temperature monitoring, and by the fact that it's roughly equivalent to an RTX 2080. However, it's also hot, power hungry, and noisy, and doesn't include all the RTX 2080 features. If the VII did include SR-IOV/MxGPU support and issue-free PCI passthrough VM assignment in a supported configuration (or at least not an Nvidia 'we're going to take away passthrough when running GeForce drivers in a VM' arbitrary decision), it would be the one thing that could get me to switch.

      I'm not really expecting that MxGPU would reach the VII, but if I could replace a GTX480 and a 780Ti with one card, simultaneously running Windows and FreeBSD VMs on a Linux Xen core (most resources assigned to the VM with the greatest 3D requirement), that would be exceptional.
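      For anyone curious what that setup looks like in practice, the Xen side of plain PCI passthrough is only a few lines of domain config. This is a minimal sketch; the PCI addresses and names below are placeholders (find the real addresses with `lspci`):

      ```shell
      # Fragment of an xl domain config (e.g. /etc/xen/guest.cfg) for GPU passthrough.
      # Addresses are examples only -- substitute the output of `lspci | grep VGA`.
      name   = "windows-guest"
      type   = "hvm"
      memory = 8192
      vcpus  = 4
      # Pass through the GPU together with its companion HDMI audio function;
      # some guest drivers refuse to initialise the card without it.
      pci    = [ "0000:0a:00.0", "0000:0a:00.1" ]
      ```

      The hard part isn't the config, it's whether the card can be cleanly reset between guest reboots, which is exactly the issue mentioned above.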



      • #63
        Hi ynari,

        I haven't heard a statement either way about SR-IOV and Radeon VII, so best to assume that it will not be supported for now. If I find out differently I will try to let you know.

        For what it's worth, I have passed along the message that there are a number of consumer/developer use cases where a much smaller number of virtual functions would be sufficient, such as the scenario you described. I don't think we have the hardware capability for that level of control today (it's all or nothing AFAIK) and of course we can't really enforce it in the open source drivers, but will see what I can find out.
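        (For readers unfamiliar with SR-IOV: on hardware that does expose it, the host creates virtual functions through sysfs, and each VF then enumerates as its own PCI device that can be handed to a guest. A purely illustrative sketch, assuming a hypothetical SR-IOV-capable device at 0000:0a:00.0:)

        ```shell
        # Illustrative only: the path and address are hypothetical, not Radeon VII specific.
        # Create two virtual functions on an SR-IOV capable device:
        echo 2 > /sys/bus/pci/devices/0000:0a:00.0/sriov_numvfs
        # Each VF now shows up as a separate PCI device, ready for guest assignment:
        lspci | grep -i display
        ```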


        • #64
          Wow, I will buy this graphics card, but I'll have to find a discount first.



          • #65
            Thank you very much! It's what I expected but any information would be welcome.

            I wouldn't really call myself an AMD fanboy (I do have a PC Engines firewall with an embedded AMD CPU at home), but it does seem to me that with the new features introduced by Intel and Nvidia, AMD should occasionally do the same. I know secure VM functionality has been added in recent chips, but other virtualisation or novel offload/acceleration functions seem like areas where AMD could gain a foothold and market share (I'm not counting the APU functionality for general-purpose offload, as it was never extended to higher-end processors and therefore seems more of a gimmick than a serious direction).

            Obviously on the embedded side (firewalls with more niche processors, consoles, etc), AMD has some huge strengths, but this doesn't always seem to translate into the consumer/server space.



            • #66
              Originally posted by TazKhaelyor View Post

              Most of what I've read points to Navi being lower-end GPUs, to replace Polaris (which, in most cases, still has a better perf/price ratio than Nvidia's competition almost 3 years after release). But maybe that's just confirmation bias...
              AdoredTV seems to have a couple of good sources hinting that AMD doesn't want to repeat the Polaris mistake (they cancelled the high-performance chip in favour of waiting for Vega to fill that role, but Vega turned out to rely on expensive HBM2, arrived very late, and had a couple of HW issues, whereas a high-end Polaris chip with GDDR5 or G5X could have delivered the same performance at a much lower cost). While all of this is still open for debate, I assume they want to ship Navi across the full spectrum, beginning with the low-to-mid-level chips and possibly a high-end variant later in 2020. Consumer Vega 20 cards are just a stopgap measure.



              • #67
                Originally posted by bridgman View Post
                Hi ynari,

                I haven't heard a statement either way about SR-IOV and Radeon VII, so best to assume that it will not be supported for now. If I find out differently I will try to let you know.

                For what it's worth, I have passed along the message that there are a number of consumer/developer use cases where a much smaller number of virtual functions would be sufficient, such as the scenario you described. I don't think we have the hardware capability for that level of control today (it's all or nothing AFAIK) and of course we can't really enforce it in the open source drivers, but will see what I can find out.
                SR-IOV + GPU passthrough would certainly be a great feature to make the Radeon VII even more appealing to enthusiasts, as it would provide a solution e.g. for playing native Windows games on Linux in a VM at near-native performance levels. The guys at Level1Techs are even advocating for this feature to trickle down to the consumer space, which would make the Linux desktop experience attractive to a broader audience.
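                (For context, the usual route for that today is KVM with VFIO: the host reserves the GPU for the vfio-pci stub driver at boot so a guest can claim it. A minimal sketch of the host-side config; the vendor:device IDs below are placeholders, and `lspci -nn` reports the real ones:)

                ```shell
                # /etc/modprobe.d/vfio.conf -- reserve the GPU for a KVM guest at boot.
                # The vendor:device IDs are placeholders; use the values from `lspci -nn`.
                options vfio-pci ids=1002:abcd,1002:ef01
                # Make sure vfio-pci grabs the card before amdgpu binds to it:
                softdep amdgpu pre: vfio-pci
                ```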

                And congrats to the Linux driver team for providing a better experience than your colleagues from the Windows driver team this time around.
                Last edited by ms178; 08 February 2019, 04:18 PM.



                • #68
                  Originally posted by bridgman View Post

                  That's not actually what I said - I said we would probably need to stay away from debugging emulator code. If someone else investigates what is happening at emulator level then comes back and says "the driver is doing X when I expect it to be doing Y" (eg with a non-emulator test case) I'm sure we would fix that.

                  I doubt our competitors are doing any more than that - it's just a matter of what GPU the emulator and WINE developers happened to be using during development.

                  Thanks for clarifying. I'm sure the emulator devs having Nvidia cards plays a big role as well.
                  P.S. I wasn't trying to misrepresent what you said or anything like that.



                  • #69
                    Originally posted by TheYoshiGuy View Post
                    Thanks for clarifying. I'm sure the emulator devs having Nvidia cards plays a big role as well.
                    P.S. I wasn't trying to misrepresent what you said or anything like that.
                    Understood... I didn't think you were trying to misrepresent either.


                    • #70
                      Originally posted by ynari View Post
                      or at least not an Nvidia 'we're going to take away passthrough when running Geforce drivers in a VM' arbitrary decision
                      This decision was not arbitrary: customers were buying much cheaper GeForce cards for servers instead of Quadro cards, and Nvidia was losing big money. Linux users were hit as a side effect.

