
NVIDIA Opens Up The Code To StyleGAN - Create Your Own AI Family Portraits


    Phoronix: NVIDIA Opens Up The Code To StyleGAN - Create Your Own AI Family Portraits

    This week NVIDIA's research engineers open-sourced StyleGAN, the project they've been working on for months: a style-based generator architecture for Generative Adversarial Networks...

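For readers unfamiliar with the "Generative Adversarial Networks" the article mentions: a GAN trains a generator against a discriminator with opposing objectives. This is a minimal sketch of those two objectives (the standard non-saturating formulation, not NVIDIA's actual StyleGAN code) using plain NumPy on raw discriminator logits:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def discriminator_loss(real_logits, fake_logits):
    # D wants real samples scored toward 1 and fakes toward 0:
    # binary cross-entropy summed over both batches.
    real_term = -np.log(sigmoid(real_logits))
    fake_term = -np.log(1.0 - sigmoid(fake_logits))
    return float(np.mean(real_term) + np.mean(fake_term))

def generator_loss(fake_logits):
    # Non-saturating G loss: G wants D to score its fakes toward 1.
    return float(np.mean(-np.log(sigmoid(fake_logits))))

# Toy logits: D is currently confident about which batch is which,
# so D's loss is low and G's loss is high.
real = np.array([2.0, 3.0, 1.5])
fake = np.array([-2.0, -1.0, -3.0])
d_loss = discriminator_loss(real, fake)
g_loss = generator_loss(fake)
```

In training, the two networks take alternating gradient steps on these losses; StyleGAN's contribution is the style-based generator architecture, not the adversarial objective itself.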

  • #2
    Wow, this technology is truly like magic! 😮



    • #3
      When will CUDA finally die? I've seen universities that teach it, and it's a great reason not to pick them, as I don't want to waste my time with Nvidia-only technologies. They even provide OpenCL compatibility, so what's their reason to keep pushing that irrelevant dinosaur of GPGPU after what, 11 years?



      • #4
        Originally posted by DoMiNeLa10 View Post
        When will CUDA finally die? I've seen universities that teach it, and it's a great reason not to pick them, as I don't want to waste my time with Nvidia-only technologies. They even provide OpenCL compatibility, so what's their reason to keep pushing that irrelevant dinosaur of GPGPU after what, 11 years?
        Many people don't seem to care about greater ideals and principles. They are purely concerned with their immediate, pragmatic needs. Those people will quite reasonably assume that Nvidia and CUDA aren't going anywhere in the next two or so years and that their short-term problems can be solved with this tech. Of course, those same people will experience negative repercussions if and when Nvidia gets overtaken by AMD, Intel or some other business in the market and they are stuck buying expensive Nvidia CUDA tech. They'll just happily make another short-term decision to fix that, though. They'll probably also find a way to blame their current issue on something else, so none of these unprincipled, short-term thinkers face any personal repercussions for their poor decisions. Of course, the company / organisation / nation as a whole will be worse off, but they won't realize they caused us all to be worse off.

        It's my observation that most people are as described above. Since coming to that conclusion my life has been more peaceful. I am no longer surprised by insanity. I no longer push so hard to help those kinds of people "get it". I know they either can't get it or they are simply not inclined to get it.

        TL;DR: you can't fix stupid (just do your best to route around it)



        • #5
          Originally posted by DoMiNeLa10 View Post
          When will CUDA finally die? I've seen universities that teach it, and it's a great reason not to pick them, as I don't want to waste my time with Nvidia-only technologies. They even provide OpenCL compatibility, so what's their reason to keep pushing that irrelevant dinosaur of GPGPU after what, 11 years?
          CUDA/NVIDIA has basically cornered the GPU computing market, and you wonder when it will die?

          perception 100 right there.



          • #6
            Originally posted by cybertraveler View Post
            Many people don't seem to care about greater ideals and principles. They are purely concerned with their immediate, pragmatic needs.
            To be fair, the whole "great ideals and principles" thing in software is a bubble that exists only for some kinds of PC software.
            Most of embedded, industrial, supercomputing and anything else is built on proprietary technologies with patents, licensing and all that jazz.

            Just like most of the physical world around us. 99.999% of the tools anyone uses are proprietary designs.

            So I don't think most people need to be especially stupid to choose what most of the world has been using forever. CUDA does have a lot of steam behind it, and NVIDIA isn't likely to die off anytime soon; that much is undeniable. Meanwhile, AMD was in deep shit only a few years ago.

            That said, I agree that most people are either stupid or don't give a fuck, or have superiors who don't give a fuck and so have to do quick-and-dirty short-term hacks to get by, but I don't think this specific case is proof of that.



            • #7
              There's something decidedly uncanny about this tech. Can't really put my finger on it.
              Oh well, I'm sure this will never be used for any nefarious purpose. People are basically good, right? Right? Right?!



              • #8
                Originally posted by dpanter View Post
                There's something decidedly uncanny about this tech. Can't really put my finger on it.
                Oh well, I'm sure this will never be used for any nefarious purpose. People are basically good, right? Right? Right?!
                I know exactly what you mean and can already think of multiple ways this tech could be used to deceive and generally mess with people.

                It must just be me that's odd. There seem to be so many people in support of X. They must be real, right? They've got family photos, Twitter accounts and everything! I guess I'll just accept that most people love X, and so I'll do my best to tolerate X too. ... and you know what... maybe they're right. Maybe I'm just looking at X from the wrong perspective. In time I think I'll probably learn to love X just like they do.



                • #9
                  With 70,000 training photos, there is always a good chance the final image is very close to one of the training photos...
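The memorization concern raised above can actually be checked empirically: take a generated image and search for its nearest neighbor in the training set. A minimal sketch (a hypothetical helper, not part of StyleGAN's released code) using flattened image vectors and L2 distance:

```python
import numpy as np

def nearest_training_distance(generated, training_set):
    """Return (index, distance) of the training sample closest to a
    generated sample.

    generated: 1-D array, a flattened image.
    training_set: 2-D array, one flattened training image per row.
    """
    # Broadcasting subtracts the generated vector from every row at once.
    dists = np.linalg.norm(training_set - generated, axis=1)
    idx = int(np.argmin(dists))
    return idx, float(dists[idx])

# Toy "images" as 4-pixel vectors standing in for the real dataset.
train = np.array([[0.0, 0.0, 0.0, 0.0],
                  [1.0, 1.0, 1.0, 1.0],
                  [0.5, 0.5, 0.5, 0.5]])
gen = np.array([0.9, 1.0, 1.1, 1.0])
idx, d = nearest_training_distance(gen, train)
```

A tiny distance to the nearest neighbor would suggest memorization; in practice such comparisons are usually done in a perceptual feature space rather than raw pixels, since pixel-wise L2 is a crude measure of facial similarity.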



                  • #10
                    Originally posted by Mathias View Post
                    At 70000 training photos, there is always a good chance the final image is very close to one of the training photos...
                    Researchers from NVIDIA, led by Guilin Liu, introduced a state-of-the-art deep learning method that can edit images or reconstruct a corrupted image, one tha...


                    I think the video is more telling.
                    Last edited by milkylainen; 10 February 2019, 02:40 PM. Reason: Sorry about stupid, didn't see it included in the article.
