AMD FidelityFX Super Resolution 3 "FSR 3" Will Be Open-Source
Nobody should be getting too excited about FSR3 until it's properly reviewed.
From what I've seen, DLSS3 is not taken very seriously because of input lag and other problems. AMD could score an easy win with an FSR3 that doesn't have the same problems, is open source, and works on many older GPUs instead of just RTX 4000 cards. Or it could end up just as irrelevant.
If FSR3 is worth using, it would be interesting to see how well it works on the consoles, where 60-120 FPS targets can be hard to hit for the XSX/PS5, and especially the XSS.
AMD still needs to improve FSR2 in parallel with FSR3, and maybe they will eventually cough up an FSR4 that works like DLSS2's image reconstruction and requires a GPU with "tensor cores".
Originally posted by stormcrow View Post
...the fact remains that there's nothing magically intelligent going on here, just customizations and refinements of previous enhancement techniques that may or may not have hardware acceleration applied.
This is precisely what I think is happening with AMD's touted "AI Cores". Ultimately these are bits of hardware that can be USED for machine learning techniques, but that are not, themselves, any kind of AI. Matrix accelerators and the like. There isn't any actual "AI" taking place inside them; rather, they enable techniques that CAN be used for AI workloads.
And this is in effect very similar to what Nvidia is doing as well. The difference is that DLSS seems to have some of the actual neural-net information loaded into the firmware.
I think one of the big reasons we don't see Nvidia open-sourcing their DLSS code is that it would make it pretty obvious there's nothing magical going on there: the cards are just using technologies that basically every card can fundamentally run, and giving out that code would make reverse-engineering it possible.
Originally posted by jeoshua View Post
This is precisely what I think is happening with AMD's touted "AI Cores". Ultimately these are bits of hardware that can be USED for machine learning techniques, but that ultimately are not, themselves, involving any kind of AI. Matrix Accelerators and the like. There isn't any actual "AI" taking place inside them, rather they are enabling techniques that CAN be used for AI workloads.
I recently played Mass Effect.
Funnily enough, the definition there is correct: what we call "AI" should more correctly be called Virtual Intelligence, i.e. it is scripted and has no consciousness.
"AI" as we use it is a PR stunt, purely based on mathematics (including logic, statistics, calculus, geometry, algebra, etc.), equivalent to the Virtual Intelligence in the game.
DLSS, the Intel counterpart, and FSR are all based on maths and accelerated in hardware using dedicated components. Tensor cores are matrix-multiplication support hardware, nothing more; there is no intelligence in them or in the algorithms using them.
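To make that concrete, here is the kind of operation a tensor core accelerates, written out in plain Python. This is my own illustrative sketch (not any vendor's code); real hardware does the same thing on whole tiles of numbers at once, but the arithmetic is identical: multiply and add.

```python
def matmul(A, B):
    """Multiply matrix A (m x n) by matrix B (n x p), entry by entry."""
    m, n, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

# A tiny 2x2 example: every output entry is just multiply-and-accumulate.
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

That loop is the whole "magic": a tensor core is silicon that performs many of those multiply-accumulate steps per clock.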
Originally posted by Grinness View Post
I recently played Mass Effect.
I think you're missing the point: DLSS/XeSS can learn on their own when given large amounts of training data, while other techniques require explicit programming.
https://youtu.be/kopoLzvh5jY
Last edited by WannaBeOCer; 25 March 2023, 12:57 PM.
Originally posted by WannaBeOCer View Post
I think you're missing the point: DLSS/XeSS can learn on their own when given large amounts of training data, while other techniques require explicit programming.
https://youtu.be/kopoLzvh5jY
No, you're missing that he's saying that all of these are just applied mathematics, not some kind of magic that thinks for itself. Feeding an AI algorithm new data so that the weights and biases update is still "explicit programming", just by a different methodology.
The problem is when people hear AI and think Turing Test. Nothing in any of this is actually conscious. It doesn't "think". It's not even "Artificial General Intelligence".
ChatGPT, DLSS... none of this stuff is sitting there thinking things through and coming up with novel ideas on its own. It's just math.
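To show what "just math" means here: one layer of a neural network is literally the equation y = sigmoid(W·x + b). The weights below are numbers I made up for illustration; a real DLSS-style network is just many more layers of exactly this arithmetic, and the same input always produces the same output.

```python
import math

def sigmoid(z):
    # Standard squashing function: maps any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def layer(x, W, b):
    # One neural-network layer: weighted sum plus bias, then sigmoid.
    return [sigmoid(sum(w * xi for w, xi in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

x = [0.5, -1.0]                  # input (say, two pixel features)
W = [[0.2, 0.8], [-0.4, 0.1]]    # made-up weights
b = [0.0, 0.3]                   # made-up biases
y = layer(x, W, b)
# Deterministic arithmetic: running it again gives the identical result.
print(layer(x, W, b) == y)  # True
```

No thinking, no novelty: just multiplications, additions, and a fixed squashing function applied over and over.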
So you're right when you say it's something great, because the math is amazing. He's right when saying that people impart some kind of consciousness to something that is fundamentally just a complicated math equation. And even I am right when saying that there's nothing inherently special or unique among the three brands of graphics cards, just varying levels of silicon doing that mathematics with their various algorithms.
Last edited by jeoshua; 25 March 2023, 01:18 PM.
Originally posted by jeoshua View Post
No, you're missing that he's saying that all of these are just applied mathematics, not some kind of magic that thinks for itself. Feeding an AI algorithm new data and the weights and biases updating is still "explicit programming", just by a different methodology.
I'm just going to state that the definition of deep learning AI hasn't changed since the 1980s. Seems like games/movies are the reason people assume AI is sentient.
When looking at the quality of a self-taught upscaler compared to an explicitly programmed one, I'd rather trust a machine that taught itself to mimic a 16K image than the manual steps taken.
Last edited by WannaBeOCer; 25 March 2023, 01:48 PM.
Originally posted by WannaBeOCer View Post
I’m just going to state that the definition of deep learning AI hasn’t changed since the 1980s. Seems like games/movies are the reason people assume AI is sentient.
How do you define "self taught" when:
* the algorithm is defined by a human (ANN vs. deep ANN vs. recurrent NN vs. random forest vs. HMM vs. ...)
* the examples are defined by a human (at the very least the class/type appropriate for the task, e.g. upscaling, denoising, object recognition, ...)
* the back-propagation is a standard mathematical optimization method, defined and implemented by a human?
I bet that your "explicit upscaler" uses the same class of mathematical algorithms (e.g. optimization methods) as the other option.
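As an illustration of that last point, here is "learning" stripped to its essence for a one-parameter model: a plain gradient-descent loop on a squared error. The model, the loss, the learning rate, and the update rule are all chosen by a human; the toy numbers are my own.

```python
# Fit y = w * x to a single example (x = 2, target = 6).
# Calculus, done by a human in advance, gives the gradient of the
# squared error (w*x - target)**2 with respect to w: 2*x*(w*x - target).

x, target = 2.0, 6.0
w = 0.0        # initial weight
lr = 0.1       # learning rate, picked by a human

for _ in range(100):
    grad = 2 * x * (w * x - target)  # human-derived derivative
    w -= lr * grad                   # the entire "learning" step

print(round(w, 6))  # converges to 3.0, since 3 * 2 = 6
```

Back-propagation in a deep network is this same update applied to millions of weights at once, with the chain rule doing the bookkeeping. Nothing in the loop decided anything; it just followed the optimization recipe it was given.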
Originally posted by Grinness View Post
How do you define 'self taught' when :