How To Setup Your Linux System For The Radeon RX Vega


  • #31
    Originally posted by Nille_kungen View Post
    I think the firmware should be in the kernel's firmware tree; we shouldn't need to go to a separate source for firmware.
    https://git.kernel.org/pub/scm/linux...it/tree/radeon
    I believe Alex will be pushing it to the kernel's tree after the embargo lifts:


    • #32
      Originally posted by LinAGKar View Post
      No, he's saying AMD should create their own proprietary compute API.
      How about an open compute API based on standard C++, along with tools that (a) automate most of the effort of porting from CUDA and (b) generate code that can run on AMD's HCC/ROC or NVIDIA's NVCC:
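
      What this describes is roughly what AMD provides with HIP (part of ROCm): ordinary C++ plus a thin runtime API, hipify tools to convert CUDA sources, and a compiler driver (hipcc) that targets either the ROCm/HCC stack or NVIDIA's nvcc. A minimal illustrative sketch (not from the thread, just to show the shape of such a portable API):

      // saxpy.cpp - the same source builds with hipcc on either a ROCm or a CUDA backend.
      #include <hip/hip_runtime.h>
      #include <cstdio>
      #include <vector>

      __global__ void saxpy(int n, float a, const float* x, float* y) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) y[i] = a * x[i] + y[i];
      }

      int main() {
          const int n = 1 << 20;
          std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

          float *dx = nullptr, *dy = nullptr;
          hipMalloc(&dx, n * sizeof(float));
          hipMalloc(&dy, n * sizeof(float));
          hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);
          hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

          // The launch syntax is identical no matter which vendor backend is used.
          hipLaunchKernelGGL(saxpy, dim3((n + 255) / 256), dim3(256), 0, 0,
                             n, 2.0f, dx, dy);
          hipDeviceSynchronize();

          hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
          std::printf("y[0] = %f\n", hy[0]);  // expect 4.0 (2*1 + 2)

          hipFree(dx);
          hipFree(dy);
          return 0;
      }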




      • #33
        Like Michael, I have been providing builds of the amd-staging kernel for quite a while (including security fixes, and one build even with Ubuntu additions):


        They also have Vega support. To get it, Michael's article can be simplified a bit (a quick sanity check is sketched after this list):
        Vega Kernel Support -> Install one of the 4.12 kernels
        The Microcode Binaries -> Install the firmware Debian package
        RadeonSI Mesa + LLVM -> Use one of the two PPAs mentioned in the Readme to "max out OpenGL performance"
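
        The first two steps are easy to verify from a small program. A minimal sketch (assumptions: the firmware package drops the Vega blobs into /lib/firmware/amdgpu with a "vega10" filename prefix, and a 4.12+ kernel is what you want running; adjust paths/prefix for your setup):

        // check_vega.cpp - report the running kernel and count Vega 10 firmware blobs.
        // Build: g++ -std=c++17 check_vega.cpp -o check_vega   (older GCC also needs -lstdc++fs)
        #include <sys/utsname.h>   // uname(): running kernel release
        #include <filesystem>      // C++17 directory iteration
        #include <cstdio>
        #include <string>

        int main() {
            utsname un{};
            uname(&un);
            std::printf("Running kernel: %s (want 4.12 or newer for Vega)\n", un.release);

            namespace fs = std::filesystem;
            const fs::path fwdir = "/lib/firmware/amdgpu";
            int vega_blobs = 0;
            if (fs::exists(fwdir))
                for (const auto& entry : fs::directory_iterator(fwdir)) {
                    const std::string name = entry.path().filename().string();
                    if (name.rfind("vega10", 0) == 0)   // filename starts with "vega10"
                        ++vega_blobs;
                }
            std::printf("Vega 10 firmware blobs in %s: %d\n", fwdir.c_str(), vega_blobs);
            return vega_blobs > 0 ? 0 : 1;
        }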

        Kano: Ensure you have the right kernel (Michael's or one of mine) and the firmware blobs installed, and boot the correct kernel. Consult the FAQ in my Readme.

        Rodrix: Recent Ubuntu releases run GPUs like the Radeon RX 560 quite decently with the open-source amdgpu driver. They only lack some fancy features like HDMI audio, for which you need the special kernel. Check out the Readme of the link at the top for more info. The kernels there should also work with Ubuntu 16.04 - you can try them without needing to upgrade Ubuntu.



        • #34
          Originally posted by eydee View Post

          Having many GPUs is not enough, you need many GPUs that can mine efficiently. That's something Michael doesn't have. It's not like he has 50 RX 480s for benchmarking stuff.
          Crypto mining is generally a net money-loser. There are basically two scenarios where you actually make money mining cryptocurrency:

          1) Fraud. Someone else pays for the electricity: run it on your company's server farm, in your electricity-included rental suite, or on the power bill you share with your roommates. Nothing rewards you like stealing from others behind their backs.

          2) Money laundering. If you run a POP, you can easily turn electricity into cryptocurrency, you have the perfect excuse for drawing all that power, and you can invoice all your cash flow and pay the taxes. Minting your own currency, in volume, right under the nose of the IRS, lol.



          • #35
            Originally posted by Michael View Post

            Unfortunately, have no hardware (and thus no PTS integration) for proper noise testing.
            It also requires a consistent low-noise test environment, so you'd need to lug the equipment to a spare room, set up a white noise generator and adjust it to a baseline dB level (i.e. 18-20 dBA so birds/street noise won't interfere), then perform the tests... a huge pain in the behind.



            • #36
              Originally posted by Michael View Post

              Unfortunately, have no hardware (and thus no PTS integration) for proper noise testing.
              At this stage I would be happy with a smartphone dB meter reading at idle and under load.



              • #37
                Originally posted by Nille_kungen View Post
                How's the 10-bit HEVC decoding?
                Hardware decoding of 10-bit HEVC (using VA-API) works fine with amdgpu in recent kernels.

                (But most likely your player will then need to map the decoded video into an 8-bit BT.709 colorspace for display, because there does not yet seem to be support for emitting a 10-bit-per-channel video signal with amdgpu. I saw some commit comments that might indicate support for more than 8 bits per color channel is going to be implemented, but I do not know of a way to activate it at this time. I also did not see any indication of a feature to signal electro-optical transfer functions like SMPTE ST 2084 on the HDMI output.)
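
                A quick way to see whether your VA-API driver even advertises the 10-bit profile is to ask libva for its supported profiles (vainfo reports the same without writing code). A minimal sketch, assuming the libva/libva-drm development headers are installed and the card's render node is /dev/dri/renderD128 (adjust for your setup):

                // check_main10.cpp - does the VA-API driver expose HEVC Main 10?
                // Build: g++ check_main10.cpp -lva -lva-drm -o check_main10
                #include <va/va.h>
                #include <va/va_drm.h>
                #include <fcntl.h>
                #include <unistd.h>
                #include <cstdio>
                #include <vector>

                int main() {
                    int fd = open("/dev/dri/renderD128", O_RDWR);
                    if (fd < 0) { std::perror("open render node"); return 1; }

                    VADisplay dpy = vaGetDisplayDRM(fd);
                    int major = 0, minor = 0;
                    if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
                        std::fprintf(stderr, "vaInitialize failed\n");
                        close(fd);
                        return 1;
                    }

                    // List every profile the driver reports and look for HEVC Main 10.
                    std::vector<VAProfile> profiles(vaMaxNumProfiles(dpy));
                    int num = 0;
                    vaQueryConfigProfiles(dpy, profiles.data(), &num);

                    bool main10 = false;
                    for (int i = 0; i < num; ++i)
                        if (profiles[i] == VAProfileHEVCMain10) main10 = true;

                    std::printf("HEVC Main 10: %s\n", main10 ? "advertised" : "not advertised");

                    vaTerminate(dpy);
                    close(fd);
                    return 0;
                }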



                • #38
                  Originally posted by Qaridarium

                  A microphone output gives you dB... I think [email protected] wants something that outputs sone


                  and that is expensive; you can buy a sone measurement set for something like ~1300€
                  Your link has a trivial formula to convert one to the other: +20 dB = ×4 sone.



                  • #39
                    Originally posted by linuxgeex View Post
                    Also requires a consistent low-noise test environment
                    Not really. If your fans are close to the background noise, that is already a very good result.



                    • #40
                      If the fan noise is ~6 dB above the environment's noise floor, you can disregard the environment. The only thing Michael has to do is measure the noise floor and make sure the sound pressure level of the GPU is ~6 dB on top of it.
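
                      A quick numeric check of that rule of thumb, using energetic subtraction of the (uncorrelated) background from the measured total level. A sketch only, with a hypothetical 30 dB noise floor:

                      #include <cmath>
                      #include <cstdio>
                      #include <initializer_list>

                      // True source level once the background's contribution is removed
                      // (levels in dB SPL, sources assumed uncorrelated).
                      double subtract_background(double measured_db, double background_db) {
                          return 10.0 * std::log10(std::pow(10.0, measured_db / 10.0) -
                                                   std::pow(10.0, background_db / 10.0));
                      }

                      int main() {
                          const double floor_db = 30.0;            // hypothetical noise floor
                          for (double delta : {3.0, 6.0, 10.0}) {  // GPU reads this far above it
                              double measured = floor_db + delta;
                              double source = subtract_background(measured, floor_db);
                              std::printf("+%.0f dB above floor -> source %.2f dB (correction %.2f dB)\n",
                                          delta, source, measured - source);
                          }
                          return 0;
                      }

                      At +6 dB the correction is only about 1.3 dB, which is why measuring ~6 dB above the floor is usually good enough.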

                      Sone is also just a metric, nothing you have to buy equipment for. I could write a simple script that translates your measured sound data into sone within 5 minutes (a sketch follows below). What you do need to do is calibrate your test setup. For that you normally need a measurement microphone and an anchor (a reference beeper). Those things are expensive, as they are special equipment. The poor man's method here is to just use relative measurements and create that anchor yourself. If done well, this is in most cases within ~1 dB of error.
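
                      The dB-to-sone part of such a script is short. A sketch, assuming you treat the calibrated dB SPL reading as a loudness level in phon (strictly correct only for a 1 kHz tone; a real script would apply equal-loudness weighting first):

                      #include <cmath>
                      #include <cstdio>
                      #include <initializer_list>

                      // Stevens' relation: loudness in sone doubles for every +10 phon above 40 phon.
                      double phon_to_sone(double phon) {
                          return std::pow(2.0, (phon - 40.0) / 10.0);
                      }

                      int main() {
                          for (double level : {40.0, 50.0, 60.0, 70.0})
                              std::printf("%.0f phon ~= %.1f sone\n", level, phon_to_sone(level));
                          // Note: +20 phon gives x4 in sone, matching the rule of thumb quoted earlier.
                          return 0;
                      }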

