NVIDIA GeForce GTX TITAN X Linux Testing Time

  • NVIDIA GeForce GTX TITAN X Linux Testing Time

    Phoronix: NVIDIA GeForce GTX TITAN X Linux Testing Time

    Last week NVIDIA released the GeForce GTX TITAN X, their latest $999+ USD graphics card. This new graphics card packs 12GB of GDDR5 video memory and the Maxwell-based GPU is capable of 7 TFLOPS of single-precision compute power. Now it's time for some Linux benchmarks of this new high-end graphics card at Phoronix...


  • #2
    Michael, you are such a tease lol!

    • #3
      Regarding potential tests, I'm wondering how bad this card is at FP64 (double-precision) compute, since NVIDIA crippled its FP64 units, unlike what they did for previous flavors of the Titan line. This is quite surprising, by the way, as one would expect high compute capability from a Titan product compared with the game-oriented x70/x80 GTX cards. So, Michael, can you tell us how bad it is?

      I'm also wondering whether there is any OpenCL 1.2 support in the recent Linux drivers. I can tell that on Mac OS X my NVIDIA card is OpenCL 1.2 capable, as reported by my Mathematica benchmarks/tests, but what about Linux?

      Thanks
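For a rough sense of scale, the theoretical FP64 number can be sketched from the article's 7 TFLOPS single-precision figure. The 1/32 FP64:FP32 rate below is an assumption about Maxwell GM200, not something stated in the thread:

```python
# Back-of-the-envelope sketch of the TITAN X's theoretical FP64 throughput.
# Assumption: Maxwell GM200 runs FP64 at 1/32 the FP32 rate (not from the thread).
sp_tflops = 7.0          # single-precision figure quoted in the article
fp64_ratio = 1.0 / 32.0  # assumed FP64:FP32 rate for GM200

dp_tflops = sp_tflops * fp64_ratio
print(f"Theoretical FP64: ~{dp_tflops:.3f} TFLOPS")  # → ~0.219 TFLOPS
```

If that ratio holds, the card would be far weaker at double precision than the original Kepler-based Titan, which is exactly the concern raised above.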

      • #4
        Originally posted by adakite View Post
        Regarding potential tests, I'm wondering how bad this card is at FP64 (double-precision) compute, since NVIDIA crippled its FP64 units, unlike what they did for previous flavors of the Titan line. This is quite surprising, by the way, as one would expect high compute capability from a Titan product compared with the game-oriented x70/x80 GTX cards. So, Michael, can you tell us how bad it is?

        I'm also wondering whether there is any OpenCL 1.2 support in the recent Linux drivers. I can tell that on Mac OS X my NVIDIA card is OpenCL 1.2 capable, as reported by my Mathematica benchmarks/tests, but what about Linux?

        Thanks
        Sure, if you find some automate-friendly tests that can answer what you want.
        Michael Larabel
        https://www.michaellarabel.com/
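One automate-friendly piece of the OpenCL question is checking the reported version string: OpenCL exposes `CL_DEVICE_VERSION` as a string of the form "OpenCL <major>.<minor> <platform-specific info>", so a small parser could drive a pass/fail capability test. The sample input below is illustrative, not captured from a real driver:

```python
import re

def parse_cl_version(version_string):
    """Extract (major, minor) from an OpenCL version string."""
    match = re.match(r"OpenCL (\d+)\.(\d+)", version_string)
    if match is None:
        raise ValueError(f"unrecognized version string: {version_string!r}")
    return int(match.group(1)), int(match.group(2))

# The actual string would come from clinfo or an OpenCL binding;
# this input is only an example of the standard format.
print(parse_cl_version("OpenCL 1.2 CUDA"))  # → (1, 2)
```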

        • #5
          Originally posted by hoohoo View Post
          Michael, you are such a tease lol!
          More or less just a call for feedback on whether there are any interesting new automated tests I haven't seen yet... And for those thinking of buying such a card, to know that Linux tests are in fact coming soon.
          Michael Larabel
          https://www.michaellarabel.com/

          • #6
            Wink wink, new beta driver:
            Release highlights since 346.47:
            - Added support for G-SYNC monitors when used together with non-G-SYNC monitors. When G-SYNC is enabled, non-G-SYNC monitors will display with tearing.
            - Fixed a bug that caused nvidia-settings to crash when assigning an attribute whose value is a display ID on a system with multiple X screens.
            - Updated the reporting of in-use video memory in the nvidia-settings control panel to use the same accounting methods used in other tools such as nvidia-...

            • #7
              Originally posted by Michael View Post
              Sure, if you find some automate-friendly tests that can answer what you want.
              The CUDA SDK examples are one good starting point I can think of.
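A minimal sketch of what an automate-friendly wrapper around such samples might look like: time an external command over several runs and report the mean. The `./matrixMul` binary named in the comment is only a hypothetical example; the demo times a no-op process instead:

```python
import statistics
import subprocess
import sys
import time

def time_command(cmd, runs=3):
    """Run an external benchmark command several times; return mean wall time in seconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL)
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples)

# Hypothetical usage with a CUDA sample binary, e.g.:
#   mean = time_command(["./matrixMul"])
# Demo with a no-op Python process instead:
mean = time_command([sys.executable, "-c", "pass"])
print(f"mean wall time: {mean:.4f}s")
```

Wall-clock timing of a whole process is crude compared to the samples' own internal timers, but it is trivially scriptable across many binaries.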

              • #8
                Michael, do you get to keep that $1000 graphics card or are you required to return it to Nvidia after you're done testing it?

                • #9
                  Originally posted by Xaero_Vincent View Post
                  Michael, do you get to keep that $1000 graphics card or are you required to return it to Nvidia after you're done testing it?
                  Pretty sure he gets to keep it. They've sent him some pretty pricey things in the past that he got to keep, and Michael sure as hell will put more benchmark use to it than most review websites. Also, Michael has stated for years that nvidia is generally the better choice for a GPU in linux, so nvidia is definitely getting their money's worth by sending him stuff to keep.

                  • #10
                    Originally posted by Xaero_Vincent View Post
                    Michael, do you get to keep that $1000 graphics card or are you required to return it to Nvidia after you're done testing it?
                    As with the many other GPUs, they are kept around and used for benchmarking articles for years to come.
                    Michael Larabel
                    https://www.michaellarabel.com/
