NVIDIA GeForce RTX 2060 Linux Performance From Gaming To TensorFlow & Compute


  • #11
    It would be really interesting to have TensorFlow benchmark results for Vega as well.
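    For what it's worth, running a comparable TensorFlow test on Vega should be possible through ROCm. A minimal sketch, assuming the tensorflow-rocm package is installed (TF 1.x-style API, as it shipped at the time) and the card shows up as /gpu:0; the matrix size and the crude timing approach are illustrative only:

        import time
        import tensorflow as tf  # assumes a tensorflow-rocm build on a Vega system

        # Time a large FP32 matrix multiply as a crude throughput check.
        n = 8192
        with tf.device("/gpu:0"):
            a = tf.random_normal([n, n])
            b = tf.random_normal([n, n])
            c = tf.matmul(a, b)

        with tf.Session() as sess:
            sess.run(c)  # warm-up: kernel compilation, memory allocation
            start = time.time()
            sess.run(c)
            elapsed = time.time() - start
            # An n x n matmul does ~2*n^3 FLOPs (one multiply + one add per pair).
            print("%.2f TFLOPS" % (2 * n**3 / elapsed / 1e12))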



  • #12
    Originally posted by pracedru View Post
      Wow... The 2060 is about 75% faster than the 1060.
      That is progress.
      And it matches the 1080 in some games.
      I'm going to buy one of those, I think.

    Can you really compare them like that, though, given that it's significantly more expensive than the 1060?

    The average price for a 1060 6GB is currently around $270, so $360 / $270 ≈ 1.33, i.e. about 33% more expensive.

    Factor the higher price out of the 2060's lead and you're at roughly 1.75 / 1.33 ≈ 1.31, i.e. about 30% more performance per dollar.
    Still a solid improvement, but on price/performance it's now only head to head with the 580/590, which in my experience have more reliable Linux drivers.
    If you compare products with the same performance, though, I guess it's slightly better than a Vega 64 due to better efficiency and value; but the proprietary driver and Nvidia's anti-competitive behavior are still flaws, so I guess it's a tough choice now instead of a clear win for AMD.
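    To make that value comparison explicit, here is the arithmetic as a throwaway Python snippet (the $360/$270 prices and the ~75% figure come from the posts above; treat them as a snapshot):

        # Relative performance per dollar: RTX 2060 vs. GTX 1060 6GB.
        perf_ratio = 1.75        # ~75% faster, per the post above
        price_ratio = 360 / 270  # ~1.33, i.e. ~33% more expensive
        value_ratio = perf_ratio / price_ratio
        print("%.2fx performance per dollar" % value_ratio)  # ~1.31x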



  • #13
    Originally posted by Kemosabe View Post
      Thank you for supporting the company with questionable policies and limited proprietary drivers, which are just a nightmare for developers who don't just develop games.

    I had actually promised myself that I would buy an AMD gfx card next time because of the driver issue. I was just amazed by the progress, and maybe I got a little carried away.

    I am actively developing a CAD program and I have absolutely no problems developing on a system with an Nvidia gfx card. Developing on Linux and targeting Nvidia has consistently been a pleasure. I cannot say the same about Radeon/AMD. Nvidia has in fact delivered a professional and consistent platform for both Windows and Linux for a very long time. Their driver is not open source, though, and that might be a reason for me to switch to AMD.
    But if Nvidia's midrange/low-end gfx card beats the best of the AMD line... well, hmm. I'll have to think about it.



  • #14
    Interesting. I was a bit worried by the growing TDP, but the consumption seems to be contained. Should I understand that the tests measured the wattage of the whole system and not specifically the card?



  • #15
    Originally posted by Dedale View Post
      Interesting. I was a bit worried by the growing TDP, but the consumption seems to be contained. Should I understand that the tests measured the wattage of the whole system and not specifically the card?

    Right; as stated in the article, it's the overall AC system power consumption.
    Michael Larabel
    https://www.michaellarabel.com/



  • #16
    Originally posted by pracedru View Post
      I had actually promised myself that I would buy an AMD gfx card next time because of the driver issue. I was just amazed by the progress, and maybe I got a little carried away.

      I am actively developing a CAD program and I have absolutely no problems developing on a system with an Nvidia gfx card. Developing on Linux and targeting Nvidia has consistently been a pleasure. I cannot say the same about Radeon/AMD. Nvidia has in fact delivered a professional and consistent platform for both Windows and Linux for a very long time. Their driver is not open source, though, and that might be a reason for me to switch to AMD.
      But if Nvidia's midrange/low-end gfx card beats the best of the AMD line... well, hmm. I'll have to think about it.

    And which CAD program for *Linux* is this supposed to be?
    I can imagine that you have as few issues with a CAD program as with a game, though.
    I'm referring more to the scientific field, which includes the development of production-ready programs.
    Nvidia's marketing strategy is highly aggressive; it goes so far that it is not even possible to purchase anything but Nvidia Quadro products for the labs as part of a university supplier agreement.
    It is also a nightmare to develop in this environment: not only proprietary drivers across a heterogeneous set of LTS Linux distributions with chronically slightly outdated software stacks, but also CUDA, which of course only runs on Nvidia. Yet CUDA has such market dominance in the field that there is no way around it.
    And before you keep telling me that Nvidia performs better: that might be relevant for a one-man freelance shop, but it is simply unrealistic to assume that the latest generation of dramatically overpriced hardware is available.
    De facto, non-Nvidia was the more efficient and cheaper solution.

    And for my private use, Nvidia is already a no-go due to EGLStreams.



  • #17
    Originally posted by pracedru View Post
      But if Nvidia's midrange/low-end gfx card beats the best of the AMD line... well, hmm. I'll have to think about it.

    You know that argument doesn't make any sense at all, right?

    A $360 card is not mid-range; it's a high-end card. The whole reason Nvidia is pushing ray tracing so hard is that they want to make games more demanding so people pay more for their products. By pushing RTX they are trying to sell chips with die sizes similar to some of their enterprise chips to consumers for incredible amounts of money. If you look at the Radeon Instinct MI60 you can see that AMD is clearly able to compete with Nvidia's Tesla V100 in terms of TFLOPS; they're just not as aggressive with their marketing to consumers.
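    For a rough sense of that TFLOPS comparison: theoretical FP32 peak is usually estimated as 2 FLOPs (one fused multiply-add) per shader per clock. A quick sketch using the publicly listed shader counts and approximate boost clocks; this is a ceiling, not measured throughput:

        # Theoretical FP32 peak: 2 FLOPs (one FMA) per shader per clock cycle.
        def peak_tflops(shaders, boost_ghz):
            return 2 * shaders * boost_ghz / 1000

        print(peak_tflops(4096, 1.80))  # Radeon Instinct MI60: ~14.7 TFLOPS
        print(peak_tflops(5120, 1.53))  # Tesla V100:           ~15.7 TFLOPS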



  • #18
    Price, performance, and power consumption look like a card whose name should end in 70. This isn't the super-cheap but competitive mainstream card for the masses, the way the 1060, the 960, and the 760 were before it. This should be called a 2060 Ti at the very least, or a 2070, with the current 2070 becoming the 2070 Ti.



  • #19
    Originally posted by pracedru View Post
      Wow... The 2060 is about 75% faster than the 1060.
      That is progress.
      And it matches the 1080 in some games.
      I'm going to buy one of those, I think.

    Surprisingly, the RTX 2060 is a bigger, higher-power card than a GTX 1070, and system power consumption tends to be noticeably higher; in some cases the 2060 draws more power than a 1080.



  • #20
    Originally posted by Michael View Post
      Right; as stated in the article, it's the overall AC system power consumption.

    Thank you for your kind answer. May I ask why? Is it by design, or due to some technical difficulty?

    A well-regarded French website used to run tests that included the wattage of the cards themselves, and sometimes thermal images taken with an IR camera. I guess IR cameras aren't cheap. They stopped when their main tester left for greener pastures.

    If you are interested, here is an example: https://www.hardware.fr/articles/957...photos-ir.html
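    For card-level numbers without extra hardware, one option is the driver's own telemetry. A minimal sketch polling nvidia-smi, which ships with the proprietary driver; note this is the board's self-reported draw, not a wall measurement, so it won't capture PSU losses or the rest of the system:

        import subprocess
        import time

        # Poll the GPU's self-reported board power once per second.
        # power.draw is reported in watts.
        while True:
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=power.draw",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True,
            )
            print(out.stdout.strip() + " W")
            time.sleep(1)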

