AMD's Marek Olšák Lands Even More OpenGL Threading Improvements Into Mesa 20.1

  • #31
    Originally posted by TemplarGR View Post

    IIRC in the latest benchmarks here at Phoronix the nvidia blob was faster in most cases.
    I saw the opposite for cards that were positioned as comparable (for example the 5700 XT and the RTX 2070). The blob offers no advantages these days.
    Last edited by shmerl; 06 April 2020, 03:49 PM.



    • #32
      Originally posted by RBilettess View Post

      Well, define "patient" and "new-ish"?
      I've been waiting 5 months now and the architecture is ~9 months old. Is it a matter of years?
      I don't know. You didn't answer, though: are you using the latest driver parts? Other people report that the latest kernel/Mesa is decent enough. What I meant about the architecture is that it is different from the older GCN versions and thus may need some more polish. This happens to all companies and all OSes when they bring a substantially changed architecture to market; early drivers aren't that good. All you can do is be patient. Oh, and if you use Linux a lot, avoid buying the latest hardware. It is better to be a generation or two behind. In terms of raw performance the difference isn't really that big (Moore's law is dead), you get the hardware dirt cheap, and it has really good open source support at that point.



      • #33
        Originally posted by TemplarGR View Post

        I don't know. You didn't answer, though: are you using the latest driver parts? Other people report that the latest kernel/Mesa is decent enough. What I meant about the architecture is that it is different from the older GCN versions and thus may need some more polish. This happens to all companies and all OSes when they bring a substantially changed architecture to market; early drivers aren't that good. All you can do is be patient. Oh, and if you use Linux a lot, avoid buying the latest hardware. It is better to be a generation or two behind. In terms of raw performance the difference isn't really that big (Moore's law is dead), you get the hardware dirt cheap, and it has really good open source support at that point.
        It does suck once you realize that being too state-of-the-art is a liability, but that is simply the truth.



        • #34
          Originally posted by RBilettess View Post
           Sigh...
           I really wish AMD or somebody else would fix those "ring vcn_enc0 timeout" errors on my Navi card :-(
           I've been plagued by these since I got that card and nothing seems to be getting done.
          How do you reproduce that problem? Could you please open an issue about it here? https://gitlab.freedesktop.org/drm/amd/-/issues
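           If it helps, here is a minimal sketch (my own, not any official AMD or Mesa tooling, and assuming dmesg is readable without elevated privileges on your setup) of pulling the relevant kernel log lines together so they can be attached to such a report:

           import subprocess

           # Hypothetical helper: grep the kernel ring buffer for amdgpu ring timeout
           # messages (e.g. "ring vcn_enc0 timeout") and save them for a bug report.
           def collect_ring_timeouts(outfile="amdgpu-ring-timeouts.log"):
               # dmesg may need root depending on the kernel.dmesg_restrict sysctl
               log = subprocess.run(["dmesg"], capture_output=True,
                                    text=True, check=True).stdout
               hits = [line for line in log.splitlines()
                       if "amdgpu" in line and "ring" in line and "timeout" in line]
               with open(outfile, "w") as f:
                   f.write("\n".join(hits) + "\n")
               return hits

           if __name__ == "__main__":
               for line in collect_ring_timeouts():
                   print(line)

           Attaching that output along with the exact kernel and Mesa versions, plus a description of what triggers the hang, usually makes these reports much easier to act on.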



          • #35
            Originally posted by RBilettess View Post

            Well, define "patient" and "new-ish"?
            I've been waiting 5 months now and the architecture is ~9 months old. Is it a matter of years?

            Oh, and I'm running the latest Git versions of Mesa and company and the latest amd-staging-drm-next kernel, so no, it definitely isn't fixed at the moment. It seems like an issue that isn't exclusive to Navi but exists on older architectures as well :-(
            Just pointing out that on Linux you have to choose hardware that is already well supported. If you had chosen a Vega nine months ago, you never would have had problems; Vega is performing damn near flawlessly right now. Of course this is true for any hardware support on Linux, not just GPUs. You just have to choose hardware that already has good support. You didn't do that. Bitching that new hardware on a second-class OS doesn't have flawless drivers at launch is pretty dumb. If you had done a fairly simple Google search before buying, you would have been informed... It's your own fault for not bothering to look first...
            Last edited by duby229; 06 April 2020, 08:41 PM.



            • #36
              Keep the improvements coming; another 100 or so and we will be in great shape. Shame NVIDIA refuses to go open source and tap into Mesa. Damn shame.

              Unfortunately it's looking more and more likely that my next GPU will be a 3080 Ti, because I have a BAD feeling about Navi 2, and NVIDIA's RTX / DLSS 2.0 improvements are going to be near impossible for AMD to match or beat... DLSS 2.0 IS FUCKEN AMAZING!



              • #37
                Originally posted by theriddick View Post
                Keep the improvements coming; another 100 or so and we will be in great shape. Shame NVIDIA refuses to go open source and tap into Mesa. Damn shame.

                Unfortunately it's looking more and more likely that my next GPU will be a 3080 Ti, because I have a BAD feeling about Navi 2, and NVIDIA's RTX / DLSS 2.0 improvements are going to be near impossible for AMD to match or beat... DLSS 2.0 IS FUCKEN AMAZING!
                You have a "bad feeling" about RDNA2, which is going to be used in both next-gen consoles? You don't say? Nice trolling attempt, though.



                • #38
                  The console GPUs are special custom versions, tailored for the consoles, NOT PC gamers.

                  I'm worried that RDNA2 PC GPUs will be too pricey and won't have very competitive RTX/DLSS-like features (ray tracing and supersampling stuff).



                  • #39
                    Originally posted by theriddick View Post
                    The console GPUs are special custom versions, tailored for the consoles, NOT PC gamers.

                    I'm worried that RDNA2 PC GPUs will be too pricey and won't have very competitive RTX/DLSS-like features (ray tracing and supersampling stuff).
                    Yeah, you are worried that RDNA2 is going to be very pricey, so your next GPU is going to be a 3080 Ti. Bad trolling; are you even trying, bruh?

                    As for "competitive RTX/DLSS features", DLSS is a joke. No one uses it. The same goes for RTX: severely limited, and it hampers performance so much that no one plays with it enabled, even in the tiny minority of titles that support it for some effects...

                    In any case, worrying about RTX on a Linux-focused site is an oxymoron. I mean, how many titles can you play on Linux with RTX?



                    • #40
                      Originally posted by TemplarGR View Post

                      You have a "bad feeling" about RDNA2 which is going to be used on both next gen consoles? You don't say? Nice trolling attempt though.
                      consoles are based on what is available now and not necessarily what is absolutely best for the job. nvidia tends to pull ahead of competition with gpu tech and amd mostly plays catch-up. it might just be that by the time another console gen hits the market, nvidia will be in upswing with some new tech by that time.

                      Yeah, you are worried that RDN2 is going to be very pricey so your next gpu is going to be a 3080ti, bad trolling, are you even trying bruh?
                      he also wrote that rdna2 may not be very competitive feature-wise, i think he's looking at general value of the product, not just the price.

                      In any case, worrying about RTX in a Linux-focused site is an oxymoron. I mean, how many titles you can play on Linux with RTX?
                      for ML cluster at place where i work at, we bought quite a few of those RTX cards. i assume rtx has some computational advantage over previous models, it was not my call to purchase them.

                      and linux is not solely about desktops and gaming.

