NVIDIA HGX-2 HPC/AI Server Platform Offers 16 x V100 GPUs, 2 PFLOPS of Tensor Cores


  • NVIDIA HGX-2 HPC/AI Server Platform Offers 16 x V100 GPUs, 2 PFLOPS of Tensor Cores

    Phoronix: NVIDIA HGX-2 HPC/AI Server Platform Offers 16 x V100 GPUs, 2 PFLOPS of Tensor Cores

    The HGX-2 is an impressive beast, but will cost an incredible amount too...

  • #2
    This is sort of off topic, but why not? I'm a software developer in my early 40s, and I enjoy my work (and really, really enjoy my paycheck). I figure in order to stay employable into my 60s I have to not only stay current with standard industry-wide trends but also develop skills in one of these new trends that look like growth areas.

    The first area that seems like a growth opportunity is the one covered by this, GPU computing.

    The second area is AI and Machine Learning, maybe in conjunction with GPU computing or maybe on its own.

    The third area is robotics.

    The fourth is quantum computing and quantum-resistant cryptography.

    The fifth might be fully homomorphic encryption. https://en.wikipedia.org/wiki/Homomo...hic_encryption

    ...I'm a reasonably bright person, but there's a good chance I lack the brainpower to grasp the central concepts for the last two.

    • #3
      How much exactly is "... an incredible amount too"?

      • #4
        Originally posted by devius View Post
        How much exactly is "... an incredible amount too"?
        I talked to a sales representative about a month ago. I can't recall if the figure was for the DGX-1 or the newer model with the Voltas, but I think it was around the 400k USD mark. So at least that amount?

        • #5
          How many rack units (U) does this take? I don't see it mentioned anywhere in the docs either.

          • #6
            Originally posted by Michael_S View Post
            This is sort of off topic, but why not? I'm a software developer in my early 40s, and I enjoy my work (and really, really enjoy my paycheck). I figure in order to stay employable into my 60s I have to not only stay current with standard industry-wide trends but also develop skills in one of these new trends that look like growth areas.

            The first area that seems like a growth opportunity is the one covered by this, GPU computing.

            The second area is AI and Machine Learning, maybe in conjunction with GPU computing or maybe on its own.

            The third area is robotics.

            The fourth is quantum computing and quantum-resistant cryptography.

            The fifth might be fully homomorphic encryption. https://en.wikipedia.org/wiki/Homomo...hic_encryption

            ...I'm a reasonably bright person, but there's a good chance I lack the brainpower to grasp the central concepts for the last two.
            AI and machine learning are quite hyped at the moment, and the pace of development there reminds me a bit of the web dev side of things, so I'm not sure if it's a good time to jump in on that or not. The foundational stuff is more or less the same as it has been for a while, and TensorFlow seems to have established itself as worth investing some time into learning. Unless you're doing the kind of research you see on arXiv, it's mostly about how you can apply machine learning with today's hardware compute power, so it's perhaps more of a complementary skill? Others use it for research/analysis, and you can find quite a few source projects where the functionality might be neat but the code itself is not very good quality or well documented, so it's worth taking note of things like that.
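
            If you want a feel for what the everyday TensorFlow side of that looks like, here's a rough sketch of a tiny Keras classifier (untested, just to show the general shape of the API; the data and layer sizes are made-up placeholders):

            # Toy example: train a small dense network on random data with tf.keras.
            # Shapes, sizes and hyperparameters are arbitrary placeholders.
            import numpy as np
            import tensorflow as tf

            x = np.random.rand(1000, 20).astype("float32")   # 1000 fake samples, 20 features
            y = np.random.randint(0, 3, size=(1000,))        # 3 fake classes

            model = tf.keras.Sequential([
                tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
                tf.keras.layers.Dense(3, activation="softmax"),
            ])
            model.compile(optimizer="adam",
                          loss="sparse_categorical_crossentropy",
                          metrics=["accuracy"])
            model.fit(x, y, epochs=5, batch_size=32)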

            Robotics is more about applying embedded skills with microcontrollers to control servos and sensors. Beyond that you're delving further into the math and engineering side, some of which you can pick up, but you aren't likely to be taken seriously as a candidate for work that demands those skills more than the software development bits, so not much changes here? Rust is an interesting language you could look into; embedded dev is still maturing there, but it's looking to be quite a nice alternative to C/C++.

            GPU compute: did you have anything in mind, or just offloading general computation that parallelizes well on the GPU? You might benefit from grokking the concepts and how to program for a GPU, then using a library like ArrayFire. That could upskill you to the point where you can utilize the GPU well; it provides an API that JIT-compiles kernels for OpenCL/CUDA/etc. You could also learn the GPU compute languages directly if you feel that's where you want to head; take a look at job openings and see if the demand for it is worthwhile.
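
            To give a rough idea of what that looks like in practice, something along these lines with the arrayfire Python bindings (assuming the package and a CUDA/OpenCL backend are installed; I haven't run this exact snippet):

            # Toy ArrayFire example: the same array code runs on CUDA, OpenCL or CPU
            # backends, and kernels are JIT-compiled and fused behind the scenes.
            import arrayfire as af

            af.info()                    # print the selected backend and device

            a = af.randu(2048, 2048)     # random matrix created on the device
            b = af.matmul(a, a)          # matrix multiply, executed on the GPU
            total = af.sum(b)            # reduction back to a Python scalar
            print(total)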

            The other two I can't comment on, and I don't think they're worth pursuing unless you have a strong passion/interest for them and feel you could compete with others in those fields; I don't think they'll be mainstream any time soon, though.

            Then again, this advice is coming from someone in their early 30s who has been having trouble getting work in software development (I'd like to really, really enjoy a paycheck too some day :P ). You're more than likely better off considering how to progress your career path from your 40s to your 60s, I think, rather than trying to stay relevant with new trends like those mentioned. Management and the like might not be your thing, but having management skills (if you don't already) as a fallback could be a better play.

            • #7
              Where are the Glxgears or Tuxracer benchmarks?

              • #8
                Originally posted by Michael_S View Post
                This is sort of off topic, but why not? I'm a software developer in my early 40s, and I enjoy my work (and really, really enjoy my paycheck). I figure in order to stay employable into my 60s I have to not only stay current with standard industry-wide trends but also develop skills in one of these new trends that look like growth areas.
                Haven't you pretty much answered your own question? These servers are aimed at datacenters and emphasize the tensor cores, so yes: AI, machine learning, "Big Data", physics simulations, medical research (e.g., protein folding), etc. are what this is likely targeted toward.
                The third area is robotics.
                I think these servers are overkill for robotics. Just one of the GPUs in this server ought to suffice for a robot.
                The fourth is quantum computing and quantum-resistant cryptography.
                If by "quantum computing" you mean calculations for quantum physics, then sure, this server would be great for that. But if you meant as a quantum computer, then no, it doesn't work that way. That's kind of like comparing legs to wheels - they're both forms of mobility but they function entirely differently. Quantum computers make calculations based on the principles of quantum mechanics. They use qubits, rather than bits.

                • #9
                  Originally posted by schmidtbag View Post
                  Haven't you pretty much answered your own question? These servers are aimed at datacenters and emphasize the tensor cores, so yes: AI, machine learning, "Big Data", physics simulations, medical research (e.g., protein folding), etc. are what this is likely targeted toward.

                  I think these servers are overkill for robotics. Just one of the GPUs in this server ought to suffice for a robot.

                  If by "quantum computing" you mean calculations for quantum physics, then sure, this server would be great for that. But if you meant as a quantum computer, then no, it doesn't work that way. That's kind of like comparing legs to wheels - they're both forms of mobility but they function entirely differently. Quantum computers make calculations based on the principles of quantum mechanics. They use qubits, rather than bits.
                  Sorry, when I listed potential areas to explore I meant them in general terms, not specific to this particular hardware. I agree this gadget fits AI and machine learning but not robotics, quantum computing, or homomorphic encryption.

                  • #10
                    Originally posted by Michael_S View Post
                    This is sort of off topic, but why not? I'm a software developer in my early 40s, and I enjoy my work (and really, really enjoy my paycheck). I figure in order to stay employable into my 60s I have to not only stay current with standard industry-wide trends but also develop skills in one of these new trends that look like growth areas.

                    The first area that seems like a growth opportunity is the one covered by this, GPU computing.

                    The second area is AI and Machine Learning, maybe in conjunction with GPU computing or maybe on its own.

                    The third area is robotics.

                    The fourth is quantum computing and quantum-resistant cryptography.

                    The fifth might be fully homomorphic encryption. https://en.wikipedia.org/wiki/Homomo...hic_encryption

                    ...I'm a reasonably bright person, but there's a good chance I lack the brainpower to grasp the central concepts for the last two.
                    I'm pretty sure you're underestimating the level of specialization needed.

                    Also, unless you can just teleport anywhere in the world at will, I strongly suggest looking at what is trending in your general area (city or nation).

                    For example, in my city the industrial automation sector is very hungry and can't find enough programmers. Is that on your list? No. Should it be? Maybe.

                    EDIT: it's not OVERestimating, it's UNDERestimating.
                    Last edited by starshipeleven; 31 May 2018, 10:53 AM.
