AMD FidelityFX Super Resolution 3 "FSR 3" Will Be Open-Source


  • #21
    Originally posted by WannaBeOCer View Post

    AI is an umbrella term.

    Nvidia/Intel engineers explained how they trained a neural network with 16K images. Then the AI attempts to reconstruct the image in real time which is accelerated with AI dedicated hardware like Tensor cores/XMX cores since they can sort through thousands of images in a second.

    Looking at both DLSS and XeSS using AI dedicated hardware both are superior to FSR in terms of quality. When XeSS isn’t using dedicated hardware it looks worse and performance isn’t much improved.
    No. "AI" has become a marketing term. It has lost all meaning because it's being applied where it isn't appropriate (like here) to sell investors on a company's marketability. You can spout all the buzzwords you want, but the fact remains that there's nothing magically intelligent going on here, just customizations and refinements of previous enhancement techniques that may or may not have hardware acceleration applied. Intelligence implies the ability to intuitively adapt to changing conditions without massive amounts of training. Nothing on the market labeled "AI" actually meets that criterion. What's going on here is basically beating the subject with a club over and over again until the polymorphic program changes its parameters enough to be halfway usable. That's not intelligence. It's just a pattern-matching algorithm that's not quite as arthritic as in the past.



    • #22
      Nobody should be getting too excited about FSR3 until it's properly reviewed.

      From what I've seen, DLSS3 is not taken very seriously because of input lag and other problems. AMD could score an easy win with an FSR3 that doesn't have the same problems, is open source, and works on many older GPUs instead of just RTX 4000 cards. Or it could turn out just as irrelevant.

      If FSR3 is worth using, it would be interesting to see how well it works on the consoles, where 60-120 FPS targets can be hard to hit for XSX/PS5 and XSS.

      AMD still needs to improve FSR2 in parallel with FSR3, and maybe they will eventually cough up an FSR4 that works like DLSS2's image reconstruction and requires a GPU with "tensor cores".



      • #23
        AMD: we offer this as open source, come grab it!

        everyone: nah, we want closed-source proprietary crap that keeps us locked in Dear Leader Jensen's jail!

        Reading all the negative posts in here, they simply can't win.



        • #24
          Originally posted by stormcrow View Post
          the fact remains that there's nothing magically intelligent going on here, just customizations and refinements of previous enhancement techniques that may or may not have hardware acceleration applied.
          This is precisely what I think is happening with AMD's touted "AI Cores". Ultimately these are bits of hardware that can be USED for machine learning techniques -- matrix accelerators and the like -- but that do not, themselves, involve any kind of AI. There isn't any actual "AI" taking place inside them; rather, they enable techniques that CAN be used for AI workloads.

          And this is in effect very similar to what Nvidia is doing, as well. The difference is that DLSS seems to have some of the actual neural net information loaded into the firmware.

          I think one of the big reasons we don't see Nvidia open-sourcing their DLSS code is that it would make it pretty obvious there's nothing magical going on there, that the cards are just using technology basically every card can fundamentally run, and that giving out the code would make reverse engineering possible.



          • #25
            Originally posted by jeoshua View Post

            This is precisely what I think is happening with AMD's touted "AI Cores". Ultimately these are bits of hardware that can be USED for machine learning techniques -- matrix accelerators and the like -- but that do not, themselves, involve any kind of AI. There isn't any actual "AI" taking place inside them; rather, they enable techniques that CAN be used for AI workloads.

            And this is in effect very similar to what Nvidia is doing, as well. The difference is that DLSS seems to have some of the actual neural net information loaded into the firmware.

            I think one of the big reasons we don't see Nvidia open-sourcing their DLSS code is that it would make it pretty obvious there's nothing magical going on there, that the cards are just using technology basically every card can fundamentally run, and that giving out the code would make reverse engineering possible.
            I recently played Mass Effect.
            Funnily enough, the definition there is correct: what we call 'AI' should more properly be called Virtual Intelligence -- i.e. it is scripted and has no consciousness.
            'AI' as we use it is a PR stunt, purely based on mathematics (logic, statistics, calculus, geometry, algebra, etc.) -- equivalent to the Virtual Intelligence in the game.

            DLSS, Intel's counterpart, and FSR are all based on maths and accelerated in hardware using dedicated components. Tensor Cores are matrix-multiplication hardware -- nothing more; there's no intelligence in them or in the algorithms using them.
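
            To make that concrete: the whole "Tensor Core" operation boils down to a fused matrix multiply-accumulate, D = A*B + C. A toy pure-Python sketch (illustrative only, obviously not how any driver implements it):

```python
# Toy sketch of the matrix multiply-accumulate (D = A @ B + C) that
# Tensor Cores / XMX units accelerate in hardware. Pure Python, just to
# show it's ordinary arithmetic, nothing "intelligent".

def matmul_accumulate(A, B, C):
    """Compute D = A @ B + C for small row-major matrices (lists of lists)."""
    rows, inner, cols = len(A), len(B), len(B[0])
    assert all(len(row) == inner for row in A)
    return [[C[i][j] + sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
C = [[1, 0],
     [0, 1]]
print(matmul_accumulate(A, B, C))  # [[20, 22], [43, 51]]
```

            The hardware just does thousands of these small multiply-accumulates per cycle; the "AI" part is entirely in which numbers get fed in.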



            • #26
              Originally posted by Grinness View Post

              I recently played Mass Effect.
              Funnily enough, the definition there is correct: what we call 'AI' should more properly be called Virtual Intelligence -- i.e. it is scripted and has no consciousness.
              'AI' as we use it is a PR stunt, purely based on mathematics (logic, statistics, calculus, geometry, algebra, etc.) -- equivalent to the Virtual Intelligence in the game.

              DLSS, Intel's counterpart, and FSR are all based on maths and accelerated in hardware using dedicated components. Tensor Cores are matrix-multiplication hardware -- nothing more; there's no intelligence in them or in the algorithms using them.
              I think you’re missing the point that DLSS/XeSS can learn on their own when given large amounts of data during training, while other techniques require explicit programming.

              Last edited by WannaBeOCer; 25 March 2023, 12:57 PM.



              • #27
                Originally posted by WannaBeOCer View Post

                I think you’re missing the point that DLSS/XeSS can learn on their own when given large amounts of data during training, while other techniques require explicit programming.

                https://youtu.be/kopoLzvh5jY
                No, you're missing that he's saying that all of these are just applied mathematics, not some kind of magic that thinks for itself. Feeding an AI algorithm new data and the weights and biases updating is still "explicit programming", just by a different methodology.

                The problem is that when people hear "AI" they think Turing Test. Nothing in any of this is actually conscious. It doesn't "think". It's not even artificial general intelligence.

                ChatGPT, DLSS... none of this stuff is sitting there thinking things through and coming up with novel ideas on its own. It's just math.

                So you're right when you say it's something great, because the math is amazing. He's right when saying that people impart some kind of consciousness to something that is fundamentally just a complicated math equation. And even I am right when saying that there's nothing inherently special or unique among the three different brands of graphics cards, just varying levels of silicon doing that mathematics with their various algorithms.
                Last edited by jeoshua; 25 March 2023, 01:18 PM.



                • #28
                  Originally posted by jeoshua View Post

                  No, you're missing that he's saying that all of these are just applied mathematics, not some kind of magic that thinks for itself. Feeding an AI algorithm new data and the weights and biases updating is still "explicit programming", just by a different methodology.

                  The problem is that when people hear "AI" they think Turing Test. Nothing in any of this is actually conscious. It doesn't "think". It's not even artificial general intelligence.

                  ChatGPT, DLSS... none of this stuff is sitting there thinking things through and coming up with novel ideas on its own. It's just math.

                  So you're right when you say it's something great, because the math is amazing. He's right when saying that people impart some kind of consciousness to something that is fundamentally just a complicated math equation. And even I am right when saying that there's nothing inherently special or unique among the three different brands of graphics cards, just varying levels of silicon doing that mathematics with their various algorithms.
                  I’m just going to state that the definition of deep learning AI hasn’t changed since the 1980s. Seems like games/movies are the reason people assume AI is sentient.

                  Comparing the quality of a self-taught upscaler to an explicitly programmed one, I'd rather trust a machine that taught itself to mimic a 16K image than hand-written steps.
                  Last edited by WannaBeOCer; 25 March 2023, 01:48 PM.



                  • #29
                    Originally posted by WannaBeOCer View Post

                    I’m just going to state that the definition of deep learning AI hasn’t changed since the 1980s. Seems like games/movies are the reason people assume AI is sentient.

                    Comparing the quality of a self-taught upscaler to an explicitly programmed one, I'd rather trust a machine that taught itself to mimic a 16K image than hand-written steps.
                    How do you define 'self-taught' when:
                    * the algorithm is defined by a human (ANN vs. deep ANN vs. recurrent NN vs. random forest vs. HMM vs. ...)
                    * the examples are defined by a human (at the very least the class/type appropriate for the task, e.g. upscaling, denoising, object recognition, ...)
                    * the back-propagation is a standard mathematical optimization method defined and implemented by a human

                    I bet your 'explicit upscaler' uses the same class of mathematical algorithms (e.g. optimization methods) as the other option
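
                    To illustrate that last point: "training" a weight by back-propagation is just iterated gradient descent on an error function, a human-written update rule through and through. A minimal sketch fitting y = w*x on made-up toy data:

```python
# Minimal sketch of "learning" a single weight: back-propagation here is
# just gradient descent on a squared-error loss, a textbook optimization
# method. The human picked the model, the data, and the update rule.

def train_weight(data, lr=0.1, steps=100):
    """Fit y = w * x by gradient descent; returns the learned w."""
    w = 0.0
    for _ in range(steps):
        # dL/dw for L = mean((w*x - y)^2) over the data
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # y = 2x, so w should approach 2
print(round(train_weight(data), 3))  # 2.0
```

                    A deep network is this same loop repeated over millions of weights; the scale changes, the mathematics doesn't.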



                    • #30
                      Originally posted by Grinness View Post

                      How do you define 'self-taught' when:
                      * the algorithm is defined by a human (ANN vs. deep ANN vs. recurrent NN vs. random forest vs. HMM vs. ...)
                      * the examples are defined by a human (at the very least the class/type appropriate for the task, e.g. upscaling, denoising, object recognition, ...)
                      * the back-propagation is a standard mathematical optimization method defined and implemented by a human

                      I bet your 'explicit upscaler' uses the same class of mathematical algorithms (e.g. optimization methods) as the other option
                      The AI can modify or create new behavior in response to learned inputs and data, as opposed to relying solely on explicit programming. I linked a video earlier about reinforcement learning.
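
                      For what it's worth, here is what a reinforcement-learning update actually looks like under the hood: a fixed, human-written arithmetic rule applied over and over to a table of numbers. A toy tabular Q-learning sketch (the two-state environment below is made up purely for illustration):

```python
# Toy tabular Q-learning: "learning" is repeatedly applying a fixed,
# human-written update rule to a table of numbers. Two states; state 1
# is terminal, and action 1 reaches it with reward 1.

import random

def q_learning(episodes=2000, alpha=0.5, gamma=0.9, epsilon=0.1):
    Q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}
    random.seed(0)
    for _ in range(episodes):
        s = 0
        while s != 1:                      # state 1 is terminal
            if random.random() < epsilon:  # explore occasionally
                a = random.choice((0, 1))
            else:                          # otherwise exploit the table
                a = max((0, 1), key=lambda act: Q[(s, act)])
            # action 1 reaches the goal with reward 1; action 0 stays put
            s2, r = (1, 1.0) if a == 1 else (0, 0.0)
            best_next = max(Q[(s2, 0)], Q[(s2, 1)])
            # the entire "learning" step: one line of arithmetic
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s2
    return Q

Q = q_learning()
print(Q[(0, 1)] > Q[(0, 0)])  # True: the table now favors the rewarding action
```

                      Nothing in the loop modifies its own code: the policy improves only because the numbers in the table change under a rule a human wrote down in advance.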

