How To Setup Your Linux System For The Radeon RX Vega
-
Originally posted by LinAGKar
No, he's saying AMD should create their own proprietary compute API.
-
Like Michael, I have been providing builds of the amd-staging kernel for quite a while (including security fixes, and one variant with Ubuntu additions):
Kernel binaries (amd64) of amd-staging with DAL and latest security patches - M-Bab/linux-kernel-amdgpu-binaries
They also have Vega support. With them, Michael's article can be simplified a bit:
Vega Kernel Support -> Install one of the 4.12 kernels
The Microcode Binaries -> Install the firmware Debian package
RadeonSI Mesa + LLVM -> Use one of the two PPAs mentioned in the Readme to "max out OpenGL performance"
Kano: Make sure you have the right kernel (Michael's or one of mine) and the firmware blobs installed, and that you boot the correct kernel. Consult the FAQ in my Readme.
Rodrix: Recent Ubuntu releases run GPUs like the Radeon RX 560 quite decently with the open-source amdgpu driver. They only lack a few fancy features, such as HDMI audio, for which you need the special kernel. Check out the Readme of the link at the top for more info. The kernels there should also work on Ubuntu 16.04 - you can try them without needing to upgrade Ubuntu.
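For reference, installing prebuilt kernel and firmware packages on Ubuntu typically looks like the sketch below. The .deb file names and the PPA name are placeholders, not the real ones - take the actual names from the Readme and release page of the repository linked above.

```shell
# Install the prebuilt kernel packages (file names are placeholders -
# use the actual .deb files from the release page).
sudo dpkg -i linux-headers-4.12*.deb linux-image-4.12*.deb

# Install the firmware/microcode package the same way.
sudo dpkg -i amdgpu-firmware*.deb

# Add one of the two Mesa/LLVM PPAs named in the Readme (placeholder name here),
# then pull in the newer RadeonSI stack.
sudo add-apt-repository ppa:some-mesa-ppa/updates
sudo apt-get update
sudo apt-get dist-upgrade

# Reboot and make sure the new kernel is actually the one running.
sudo reboot
# After reboot:
uname -r
```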
-
Originally posted by eydee
Having many GPUs is not enough, you need many GPUs that can mine efficiently. That's something Michael doesn't have. It's not like he has 50 RX 480s for benchmarking stuff.
1) Fraud. Someone else pays for the electricity: run it on your company's server farm, on your electricity-included suite rental, or on the power bill you share with your roommates. Nothing rewards you like stealing from others behind their backs.
2) Money laundering. If you run a POP you can easily turn electricity into cryptocurrency, and you have the perfect excuse for drawing all that electricity; you can invoice all your cashflow and pay the taxes. Minting your own currency, in volume, right under the nose of the IRS, lol.
-
Originally posted by Michael
Unfortunately, I have no hardware (and thus no PTS integration) for proper noise testing.
-
Originally posted by Nille_kungen
How's the 10-bit HEVC decoding?
(But your player will most likely need to map the decoded video into an 8-bit BT.709 colorspace for display, because amdgpu does not yet seem to support emitting a 10-bit-per-channel video signal. I saw some commit comments that might indicate support for more than 8 bits per color channel is going to be implemented, but I would not know a way to activate it at this time. I also saw no indication of a feature to signal electro-optical transfer functions like SMPTE ST 2084 on the HDMI output.)
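As a quick check of the decode side (separate from the display-output limitation above), you can query VA-API for the exposed profiles. This sketch assumes the vainfo tool from libva-utils is installed and that you play back with mpv; the exact profile names in the output can vary by driver version, and the video file name is a placeholder:

```shell
# List VA-API decode profiles; hardware 10-bit HEVC decode shows up
# as VAProfileHEVCMain10 with a VAEntrypointVLD entrypoint.
vainfo | grep -i hevc

# Try hardware-accelerated playback of a 10-bit HEVC file
# (file name is a placeholder).
mpv --hwdec=vaapi some-hevc-main10-video.mkv
```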
-
Originally posted by Qaridarium
A microphone output gives you dB... I think [email protected] wants something that outputs sone,
and that is expensive - you can buy a sone measurement set for around ~1300€.
-
If the fan noise is ~6 dB above the environment noise floor, you can disregard the environment. The only thing Michael has to do is measure the noise floor and make sure the sound pressure level of the GPU is ~6 dB on top of it.
Sone is also just a metric; it is nothing you need to buy equipment for. I could write a simple script that translates measured sound data into sone within five minutes. What you do need is to calibrate your test setup. For that you normally need a measurement microphone and an anchor (a reference beeper). Those things are expensive, as they are specialist equipment. The poor man's method is to use relative measurements and create that anchor yourself. Done well, this usually stays within ~1 dB of error.
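To illustrate that the conversion really is just arithmetic, here is a minimal Python sketch. It assumes the measured level approximates loudness level in phon, which strictly holds only around 1 kHz - a proper sone value per ISO 532 weights the whole spectrum - and uses Stevens' rule (40 phon = 1 sone, every +10 phon doubles loudness) plus the usual energetic subtraction of the noise floor:

```python
import math

def background_corrected_spl(total_db: float, floor_db: float) -> float:
    """Energetically subtract the environment noise floor from a total SPL reading."""
    return 10.0 * math.log10(10.0 ** (total_db / 10.0) - 10.0 ** (floor_db / 10.0))

def phon_to_sone(phon: float) -> float:
    """Stevens' rule: 40 phon = 1 sone, every +10 phon doubles perceived loudness."""
    return 2.0 ** ((phon - 40.0) / 10.0)

# Fan measured at 36 dB over a 30 dB environment floor (6 dB difference):
fan_only = background_corrected_spl(36.0, 30.0)
print(round(fan_only, 1))   # -> 34.7
print(phon_to_sone(40.0))   # -> 1.0 (sone)
print(phon_to_sone(50.0))   # -> 2.0 (sone)
```

Note that at 6 dB above the floor the background correction is only about 1.3 dB, which is why the environment can nearly be disregarded in that case.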