Valve's Dota 2 Adds AMD FidelityFX Super Resolution


  • Teggs
    replied
    Originally posted by Valve
    FidelityFX Super Resolution works on any GPU compatible with DirectX 11 or Vulkan.
    Thanks to Valve for answering that question.

    AMD should have just said so from the beginning, instead of implying that it only worked on Polaris/Pascal and newer.

  • Ropid
    replied
    Originally posted by unic0rn View Post
    [...] it [...] look like shit [...]
    If you don't get aggressive with the amount of scaling you ask it to do, it works well.

    I played a bit using the 75% scaling slider value and that looked good to me; you can be fooled into thinking it's native resolution if you just play and don't look too closely.

    The default, naive upscaling that was previously in the game is far worse in comparison. I guess it's just bicubic scaling? With that upscaling, you can immediately see that it's not the native resolution. Meanwhile, with the FSR checkbox ticked, it gets very close to how native resolution looks. It looks like native resolution with different (and good) AA filtering.

    The 75% scaling value is interesting to think about because it turns 1440p into 1080p: 1080 / 1440 = 0.75. This is good enough for me if it can get into more games than just Dota 2. My RX480 is too weak for 1440p, but for 1080p it's fine.
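    The scale-slider arithmetic in this post can be sketched in a few lines of Python (illustrative only; resolutions are examples, not an actual game API):

    ```python
    def render_resolution(target_w, target_h, scale):
        """Internal render resolution for a given FSR-style scale slider value."""
        return round(target_w * scale), round(target_h * scale)

    # A 75% slider on a 1440p target renders internally at 1080p:
    print(render_resolution(2560, 1440, 0.75))  # -> (1920, 1080)
    ```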

  • oiaohm
    replied
    Originally posted by unic0rn View Post
    Sadly, lack of temporal component means it has no chance of competing with DLSS and will always look like shit quality-wise.

    Not sure what they were thinking. Many games feature custom temporal antialiasing solutions that look better than this.
    The reality is that temporal antialiasing is patented by Nvidia until 2030-01-11, so if your game uses temporal antialiasing you really do have to use the Nvidia option or risk a patent problem that will cost you more than the game can earn.

    Temporal antialiasing has its fair share of problems, as DLSS 1.0 showed in a big way. FSR beats DLSS 1.0 in quality by a huge margin because FSR does not generate temporal graphical artefacts. You still see this with DLSS 2.2 in various games, where the visible ghosting is a temporal graphics artefact.

    Games with custom scaling that looks better than DLSS are normally, exactly like FSR, non-temporal.

    Temporal antialiasing has really been oversold. It will be good to have a non-temporal competitor to DLSS. Because it lacks the artefacts caused by temporal methods, FSR should be the better solution for quite a few games. For the games where DLSS truly gives an advantage, the advantage at this stage appears to be minor.

    The reality here is that in nine years' time, when the patent on temporal antialiasing runs out, it may be so superseded by non-temporal methods that nobody will bother with the technology at that point.

  • skeevy420
    replied
    Originally posted by billyswong View Post

    I don't think this will hurt their sales... people probably buy GPUs based on fixed budgets. IMO both DLSS and FSR are about encouraging people to enjoy games on a higher-resolution / higher-dpi monitor. HUD and in-game text will look clearer without sacrificing overall game FPS. So overall it is a hidpi monitor sales promotion move.
    They buy monitors and displays the same way. I know I do. My PC is hooked up to my TV, so when 4K TVs were dirt cheap last Black Friday I upgraded my monitor, only my 580 wasn't ready for that kind of upgrade in regard to gaming (it runs a 4K desktop all day long, no sweat). What I see with this is a better way to upsample my current 1080p games to 4K than integer scaling. If you're already at 4K60 or whatever, then this isn't really useful... neither is DLSS in that scenario.

    I wonder what this'll do for 4K games on 6K and 8K monitors? Seems like that'll be a necessary feature in the very near future.

  • DrYak
    replied
    Originally posted by david-nk View Post
    It is a solution that works without the ridiculous processing power of the RTX tensor cores. I don't know if it still uses neural networks or if they somehow found a way to make it look this good with just classical algorithms, but it is impressive either way.
    Hey, Nvidia needs to justify all the AI/ML cores they keep cramming into their cards for datacenter customers, and that are still kept in the silicon when they package and sell the exact same core to their "g@m3rZ" consumers!

    Originally posted by bple2137 View Post
    Besides, it's way more accessible than DLSS, because it supports a lot more hardware and is open by design. We'll see, but it has a chance to really compete with NV solution.
    Technically, not even DLSS really requires dedicated tensor cores. People were running neural nets on GPGPU even before tensor cores were a thing.
    The only differences are:
    - tensor cores are heavily optimized (e.g. completely different precision) for the type of maths that happens when running deep neural nets, so AI/ML algorithms run way faster there.
    - tensor cores are distinct from your usual cores. Running the AI/ML on tensor cores doesn't eat into your cycle budget for the graphical tasks, whereas running AI/ML on your usual compute cores competes with the rest of the graphical pipeline.

    So in short, performance is higher on cards with dedicated tensor cores, but it's not impossible without them.
    (See ray-tracing for another example of exactly the same situation.)

    But again, Nvidia's marketing department needs a nice story to tell about those (useless for the "G@m3rZ" crowd) tensor cores.
    So let's make DLSS a tensor-core exclusive! (It's an arbitrary limitation, though one that can be justified in the name of performance.)
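    The cycle-budget point above can be illustrated with a toy frame-time model (all numbers hypothetical; real GPU scheduling is far more nuanced):

    ```python
    def frame_time_ms(graphics_ms, upscale_ms, dedicated_cores):
        """Toy model: with dedicated tensor cores the upscale pass can run
        alongside the graphics work; on shared compute cores it competes
        for the same cycles, so it adds on top of the graphics work."""
        if dedicated_cores:
            return max(graphics_ms, upscale_ms)  # overlapped execution
        return graphics_ms + upscale_ms          # serialized on the same cores

    # Hypothetical 14 ms of graphics work plus a 2 ms upscale pass:
    print(frame_time_ms(14.0, 2.0, dedicated_cores=True))   # -> 14.0
    print(frame_time_ms(14.0, 2.0, dedicated_cores=False))  # -> 16.0
    ```

    Even in this crude sketch the conclusion matches the post: the work is possible either way, it just costs frame time on shared cores.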

  • billyswong
    replied
    Originally posted by cytomax55 View Post

    I think you are missing an important point... even if it looks like poo to some people, most people can't afford or even find a new video card. If there's a tech that slightly decreases image quality but boosts your fps by 50%, that's a huge help to people waiting for that next video card upgrade. Imagine how many video cards will be delayed from going into landfills now that your video card just got a new breath of life. Honestly, I don't see why AMD would even offer this, since it may translate to fewer sales, but since they can't keep these cards in stock I don't think it's a big issue right now.
    I don't think this will hurt their sales... people probably buy GPUs based on fixed budgets. IMO both DLSS and FSR are about encouraging people to enjoy games on a higher-resolution / higher-dpi monitor. HUD and in-game text will look clearer without sacrificing overall game FPS. So overall it is a hidpi monitor sales promotion move.
    Last edited by billyswong; 24 June 2021, 08:41 AM.

  • skeevy420
    replied
    Originally posted by novhack View Post

    That is exactly what I wanted. I have a 5700XT and a 4K monitor, mainly for work but sometimes for games. I finished the whole of Cyberpunk 2077 at 70% CAS scale (2688x1512 -> 3840x2160), and if FSR is even a bit better (which it is), it's a major win in my book.
    Same here. I have an RX 580 4GB; I play most games at 1080p60 Ultra and integer scale up to 4K, because my card shines at 1080p60. In my case, if I can take what I'm already playing, switch from integer scale to FSR, and get (supposedly) better-looking graphics, then cool beans. To be frank, I've only liked the highest-quality FSR setting in the previews. Compared to 4K native, they all look bad aside from the best.

    Originally posted by tildearrow View Post
    Competition begins...
    Let's hope this FSR thing is adopted by more game studios (especially the ones who have been baited by the green side...)
    The Hulk, the Jolly Green Giant... I doubt a person's survival after being baited by the green side unless, maybe, it's the Green Lantern.

  • novhack
    replied
    Originally posted by aufkrawall View Post
    I tried it and it mostly just looks like CAS on steroids, which is bad. Lesson: Don't wait for AMD.
    That is exactly what I wanted. I have a 5700XT and a 4K monitor, mainly for work but sometimes for games. I finished the whole of Cyberpunk 2077 at 70% CAS scale (2688x1512 -> 3840x2160), and if FSR is even a bit better (which it is), it's a major win in my book.
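    The resolution pair quoted here follows directly from the 70% scale factor; a quick check in Python (illustrative only):

    ```python
    # 70% CAS scale on a 4K (3840x2160) output target:
    scale = 0.70
    target = (3840, 2160)
    internal = tuple(round(d * scale) for d in target)
    print(internal)  # -> (2688, 1512), matching the post
    ```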

  • aufkrawall
    replied
    I tried it and it mostly just looks like CAS on steroids, which is bad. Lesson: Don't wait for AMD.

  • cytomax55
    replied
    Originally posted by unic0rn View Post
    Sadly, lack of temporal component means it has no chance of competing with DLSS and will always look like shit quality-wise.

    Not sure what they were thinking. Many games feature custom temporal antialiasing solutions that look better than this.
    I think you are missing an important point... even if it looks like poo to some people, most people can't afford or even find a new video card. If there's a tech that slightly decreases image quality but boosts your fps by 50%, that's a huge help to people waiting for that next video card upgrade. Imagine how many video cards will be delayed from going into landfills now that your video card just got a new breath of life. Honestly, I don't see why AMD would even offer this, since it may translate to fewer sales, but since they can't keep these cards in stock I don't think it's a big issue right now.
