How To Setup Your Linux System For The Radeon RX Vega


  • #41
    Originally posted by pal666 View Post
    not really. if your fans are close to background noise, that is already very good result
    True, if the test environment is quieter than the fans you want to test. But you seem to be forgetting that this is a server room. Server rooms are notoriously noisy, i.e. 60-100 dB, which is competitive with anything from light conversation to an intimate jazz band performance. That's not quiet enough to give meaningful results for people who care about quiet fans, lol.

    Michael would need to use a separate room. After having gone to that effort, normalising the test environment is more than worthwhile. If you've ever sat in a "quiet" room in a home near a road or with an outside window (do you have an airport within 20 miles, or a neighbour who occasionally mows their lawn?) with a sensitive dB meter (my flautist friend has a Larson-Davis Model 831), you'll have been surprised to see that meter making transient leaps into the low 20s when a truck rolls by, or from some other source you can't identify even though you can (barely) hear it. That's because the meter measures the amplitude of sounds below 30 Hz, which typically seem 20-60 dB quieter to our ears than they measure; by the time you can hear such a sound at all, it's actually pretty loud to a sensitive instrument.

    Many people have a hard time hearing even the 60 Hz hum from the power lines in their house, their electric heater, the fridge in the other room, or the wall wart behind the couch, but you can probably hear those yourself if you listen carefully. At her apartment even the wall wart showed up around 12 dB when the meter was within 1 m of it. Setting a noise floor just helps keep you honest about the results when you have walked away, are looking at the peak and average after 5 minutes, and believe that nothing happened to affect them.
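    To put numbers on that (a back-of-the-envelope sketch of my own, not something from the article): sound power adds logarithmically, so a quiet fan simply disappears into a loud room, as below.

```python
# Rough illustration: why a ~25 dB fan cannot be measured in a ~60 dB server
# room. Incoherent sources add on a power basis.
import math

def combine_db(l1, l2):
    """Combined level of two incoherent sources, in dB."""
    return 10 * math.log10(10 ** (l1 / 10) + 10 ** (l2 / 10))

def subtract_background_db(total, background):
    """Back a source level out of a total reading and a background reading."""
    return 10 * math.log10(10 ** (total / 10) - 10 ** (background / 10))

room, fan = 60.0, 25.0
reading = combine_db(room, fan)
print(f"{room} dB room + {fan} dB fan reads {reading:.4f} dB")             # ~60.0014
print(f"recovered fan level: {subtract_background_db(reading, room):.1f} dB")
print(f"room mis-measured by just -0.01 dB: "
      f"{subtract_background_db(reading, room - 0.01):.1f} dB")            # ~34 dB
# The fan moves the meter by about 0.001 dB, so a 0.01 dB drift in the
# background already throws the subtraction off by roughly 9 dB.
```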



    • #42
      Originally posted by dwagner View Post

      Hardware-decoding of 10-bit HEVC (using vaapi) works fine with amdgpu in recent kernels.

      (But most likely your player will then need to map the decoded video into an 8-bit bt.709 colorspace for display, because there does not seem to be support for emitting a 10-bit-per-channel video signal using amdgpu, yet. I saw some commit comments that might indicate support for > 8 bit per color channel is going to be implemented, but I would not know a way to activate it at this time. Also, I did not see any indication of a feature to signal electro-optical transfer functions like SMPTE ST-2084 on the HDMI output.)
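      (A quick aside, not from the quoted post: one way to verify that the VA-API stack actually advertises a 10-bit HEVC decode profile is to look for VAProfileHEVCMain10 in the output of the vainfo tool from libva-utils, e.g. with a small sketch like this.)

```python
# Hypothetical check: does VA-API report a 10-bit HEVC (Main 10) decode profile?
# Assumes the `vainfo` utility (package libva-utils / vainfo on Ubuntu) is installed.
import subprocess

result = subprocess.run(["vainfo"], capture_output=True, text=True)
profiles = result.stdout + result.stderr   # vainfo splits output across both streams
if "VAProfileHEVCMain10" in profiles:
    print("VA-API advertises HEVC Main 10 decode")
else:
    print("No HEVC Main 10 profile reported; check the kernel/Mesa/libva versions")
```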
      The only value of 10-bit HEVC at the moment is that it gives you some extra precision to choose from before exporting 8 bits of it to your display. Many displays aren't capable of true 8 bits per channel and use FRC (temporal dithering) to approximate it, and many people are using video overlays that give 8 bits to the luma channel but fewer to the color components, which further limits what your display can show for video.

      So what do I mean about the value of the extra precision to choose from? Well, if you are watching a show that was recorded at a different gamma than you enjoy, perhaps due to your lighting scenario or the limitations of your display or both, then you probably want to push the gamma of the video away from the curve it was encoded at. That discards precision to rounding errors. Then you may want to brighten it: a good use of that headroom is to record 90% of the show somewhat dark so that the bright moments can actually appear to the viewers like they have stepped into the sun. But if your display's range is poor, you would want to brighten (stretch the contrast) so that the majority of the show uses the full range of the display and the bright scenes are overexposed.

      Anyhow, the deeper precision of the recorded material means that if you do have a good-quality 8-bit display capable of 100% of the sRGB color space, then even though you tweaked the gamma and brightness of the source to make the content more enjoyable for your display and lighting scenario, you aren't going to end up with banding/blocking in subtle gradient areas. So 10-bit sources can still be valuable even on 8-bit displays.
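      Here is a toy illustration of that last point (my own sketch, not from the article): apply the same contrast stretch to a smooth ramp stored at 8 bits and at 10 bits, quantize both to an 8-bit display, and count how many distinct display codes survive.

```python
# Toy demonstration: a 1.25x brightness/contrast stretch applied to an 8-bit
# source leaves gaps (banding) in the 8-bit output, while a 10-bit source
# still fills every output code after the same adjustment.

def stretched_output_codes(source_bits, gain=1.25, display_bits=8):
    """Count unique display codes after stretching a full ramp of source codes."""
    in_max = (1 << source_bits) - 1
    out_max = (1 << display_bits) - 1
    codes = set()
    for code in range(in_max + 1):
        x = code / in_max              # normalise the source code to 0..1
        y = min(x * gain, 1.0)         # brighten / stretch the contrast
        codes.add(round(y * out_max))  # final quantization to the display depth
    return len(codes)

print("8-bit source :", stretched_output_codes(8))    # 205 of 256 codes used
print("10-bit source:", stretched_output_codes(10))   # all 256 codes used
# The ~50 missing codes in the 8-bit case show up as visible steps in smooth
# gradients; the extra source precision lets the 10-bit ramp fill them in.
```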

      Edit: Sorry, lol... for content developers 10-bit is great because it preserves more quality of the original content. That needs to be mentioned as well.
      Last edited by linuxgeex; 16 August 2017, 05:16 AM.



      • #43
        Originally posted by LinAGKar View Post

        No, he's saying AMD should create their own proprietary compute API.
        This is more or less my point. The CUDA library is a huge binary, but it works very well for its purpose. So not necessarily closed source like CUDA, just something that would make GPU-accelerated FFmpeg/compute more realistic.



        • #44
          Hello.
          Please excuse my poor English; it is not my native language.


          Maybe I will seem stupid, but I cannot understand how to set up my Linux system for the Radeon RX Vega.

          I have [X]ubuntu 17.04 and a Vega 64 card. I want to make it work as the primary card (I mean, I want to plug the monitor into it, not into the integrated one).

          And I need help, because the article is too complicated for me (I don't know much about the Radeon driver stack).

          Here are my questions:

          1. As far as I understand, the kernel in modern Ubuntu distros is missing a feature, so the card cannot work on stock kernels. But then why is "Ubuntu 16.04" mentioned as a supported distro in the "The Easy Way: AMDGPU-PRO" chapter?
          Or am I missing something?


          2. What exactly do I have to install to use the card with each driver?
          Only a new kernel (Michael's or a self-built one)? Or also the X server and Mesa from the Padoka or Oibaf PPA?
          If I want to try the open-source driver, where can I download the driver package?

          As far as I understand:
          * For amdgpu-pro, I only need a new kernel (and the driver itself, of course)
          * For the open-source stack, I need the kernel, plus X and Mesa from the Padoka PPA (package names xserver-xorg-video-amdgpu and mesa)

          Am I right? Or did I miss something?

          3. When I tried Michael's kernel, my Wi-Fi was not working. Why could that be? Was the kernel compiled with fewer modules than the stock one?

          4. What are microcode binaries and do I need to install them?

