
Initial NVIDIA GeForce RTX 2080 Ti Linux Benchmarks


  • #21
    Originally posted by reavertm View Post

    Impressed? They are slow (especially Vega) compared to their Windows counterparts for some reason. Yes, they are open source, one redeeming quality.
    FUD, FUD everywhere!

    Comment


    • #22
Two times the performance of Vega 64 from a chip two times bigger... good!
Anyway, could AMD beat them if they doubled their die size?

      Comment


      • #23
        Originally posted by eydee View Post
        Performance per watt benchmark is the most important one. Vega 64 has been dethroned, 2080 Ti is now the most power hungry, power wasting GPU in the entire universe.

Numbers look great, but if you have to pay a fortune, and have to have your own nuclear power plant to feed the thing, it's a piece of crap, regardless of benchmark results.
I don't think you understand what performance per watt is. Bad performance per watt means poor efficiency: drawing a lot of power without delivering the performance to justify it.

The 2080 Ti takes a lot of power, but it is very efficient. It consumes about the same amount of power as the Vega 64 while providing twice the performance. I don't know how anybody can call it power wasting...

...have to have your own nuclear power plant to feed the thing, it's a piece of crap, regardless of benchmark results.
From the article:
The average AC system power consumption for the complete rig during benchmarking was 270 Watts, with a peak of 386 Watts.
A high-quality 650 Watt PSU will suffice; hardly a nuclear power plant.
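The efficiency argument above is just a ratio, and it's easy to sanity-check. A minimal sketch, using illustrative FPS and wattage numbers that are assumptions for the example, not measurements from the article (the article's claim is roughly twice the performance at similar power):

```python
def perf_per_watt(fps, watts):
    """Efficiency metric: average frames per second per watt drawn."""
    return fps / watts

# Hypothetical figures: similar power draw, ~2x the frame rate.
vega64 = perf_per_watt(fps=60, watts=295)
rtx2080ti = perf_per_watt(fps=120, watts=295)

print(f"Vega 64:     {vega64:.3f} FPS/W")
print(f"RTX 2080 Ti: {rtx2080ti:.3f} FPS/W")
```

With the power term held equal, doubling the frame rate doubles the FPS-per-watt figure, which is why a high absolute power draw alone says nothing about efficiency.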

        Comment


        • #24
          Originally posted by PCJohn View Post
          Two times more performance compared to Vega 64 and two times bigger chip... good!
          Anyway, could AMD beat them if they would double their die size?
AMD's next GPU in 2019 will be Navi, the final revision of the GCN architecture. Sadly, it is rumored not to include high-end parts, at least in 2019, but we still have some hope as AMD has not confirmed anything.
After that, it is expected (though not officially confirmed) that they will launch a brand-new, ground-up replacement for GCN. You can expect that to scale well from low end to high end; the earliest would be 2020, though, maybe even 2021.
          Last edited by humbug; 20 September 2018, 03:35 AM.

          Comment


          • #25
            Originally posted by humbug View Post
            The 2080ti takes a lot of power. But it is very efficient. It is consuming the same amount of power as the Vega64, but providing twice the performance. I don't know how anybody can call it power wasting...
Ti cards can't be dubbed efficient, let alone very efficient, as these are designed to waste power. Non-Ti is usually much better at efficiency.

So, now you know. Probably something below would be the king of efficiency, like a 2070, but again not a Ti.

Plus, these FE models always suck even more watts than retail ones.
            Last edited by dungeon; 20 September 2018, 02:21 AM.

            Comment


            • #26
Hmm, the only thing that could save the 2000 series is how well ray tracing and DLSS are implemented. DLSS is meant to give you up to double the framerate at higher resolutions by using the tensor cores to process the image (a lower-resolution rendered one) and enhance it so it looks good even though it's upscaled.

The demo images shown of this look pretty unbelievable, so we will see. If it ends up being a rare feature added to games, then it's going to really suck.
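The "up to double the framerate" claim comes down to pixel-count arithmetic: shading cost scales roughly with the number of pixels rendered, so rendering internally at a lower resolution and upscaling to the target cuts the per-frame work proportionally. A back-of-the-envelope sketch; the internal resolution used here is an assumption for illustration, not a figure from this thread:

```python
def pixel_ratio(internal, target):
    """Fraction of the target's pixels actually shaded at the internal resolution."""
    iw, ih = internal
    tw, th = target
    return (iw * ih) / (tw * th)

# Example: render at 1440p internally, upscale to 4K.
ratio = pixel_ratio(internal=(2560, 1440), target=(3840, 2160))
print(f"Internal render shades {ratio:.0%} of the target's pixels")  # 44%
```

Shading well under half the pixels is where the potential framerate headroom comes from; whether the upscaled result "looks good" is the part the demos have to prove.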

              Comment


              • #27
                Originally posted by eydee View Post
                Performance per watt benchmark is the most important one. Vega 64 has been dethroned, 2080 Ti is now the most power hungry, power wasting GPU in the entire universe.
                Originally posted by dungeon View Post

Ti cards can't be dubbed efficient, let alone very efficient, as these are designed to waste power. Non-Ti is usually much better at efficiency.

So, now you know. Probably something below would be the king of efficiency, like a 2070, but again not a Ti.
Take a look at the graphs again. The 2080 Ti was the most efficient of all the GPUs tested.

                Comment


                • #28
                  Originally posted by LinAGKar View Post
Take a look at the graphs again. The 2080 Ti was the most efficient of all the GPUs tested.
Which graphs? I am talking about the most efficient card of the same 20 series, and here I don't see any other than that 2080 Ti FE. Top cards of the same series are never the most efficient. Neither are Ti cards ever the most efficient, nor are FE models.

If someone wants maximum efficiency, they won't look for a Ti, an FE, or even an OC retail model.
                  Last edited by dungeon; 20 September 2018, 02:40 AM.

                  Comment


                  • #29
Just keep looking at these graphs; eventually it will hit you:

                    https://www.phoronix.com/scan.php?pa...ti-linux&num=6

Tip: put your hand over Metro. Let's ignore that graph, shall we? Is that better?

                    Comment


                    • #30
                      Originally posted by TemplarGR View Post
Pretty underwhelming card. This is another GeForce 3 or GeForce 5 launch, for those who remember the situation 17 years ago... Good cards with some features that will become mainstream eventually, but at the time of their launch really expensive, and not many titles will take advantage of the new features. By the time ray-tracing effects become common, better architectures at better prices will be on the market, so it makes no sense to pay through the nose for a 2xxx series, unless you are rich and don't care about money, or you are a developer and want to experiment with the new technology.
Well, this kind of card with new technology needs to sell, or you'll never get prices down. Vicious circle... But I agree the 2080 prices look way too high; even the 2070 looks too expensive to me.

Originally posted by TemplarGR View Post
In any case, I am a Linux user, so Nvidia is not an option for me. I don't support companies hostile to open source. I don't have a need for 4K AAA gaming either; if I did, I would be using Windows 10 anyway...
I support companies that provide me with working Linux drivers, and that's the case with NVIDIA. I might have been lucky, but this has been true for me for more than 15 years.

                      I'd love to see NVIDIA follow the AMD path with open source, but that comes in second for me. The priority is driver stability.

                      Comment
