
RadeonSI Quietly Landed A Shader Cache As A Last Feature For Mesa 11.2


  • Azpegath
    replied
    I'm really looking forward to seeing if this solves my horrible slowdowns in L4D2! It generally runs great, but when I get lots of zombies it stutters horribly, then flows nicely again.



  • marek
    replied
    Originally posted by theriddick View Post
    XCOM 2 is a good example of shader stutter; it has it in spades.
    All Unreal Engine games have this stuttering. It's a flaw in the engine's OpenGL support, and it's generally unfixable. An on-disk shader cache should help with subsequent runs of the game, but not the first run.
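    To make the trade-off concrete, here's a minimal conceptual sketch of an on-disk shader cache keyed by a hash of the shader source (not Mesa's actual code; the cache path and function names are made up for illustration). On the first run every lookup misses and the full compile cost is still paid, which is where the stutter comes from; subsequent runs load the stored binaries instead of recompiling.

    ```python
    import hashlib
    import os

    # Hypothetical cache location; a real driver's cache layout and paths differ.
    CACHE_DIR = os.path.expanduser("~/.cache/example-shader-cache")

    def get_compiled_shader(source, compile_fn):
        """Return a compiled shader binary, using an on-disk cache keyed by source hash."""
        os.makedirs(CACHE_DIR, exist_ok=True)
        key = hashlib.sha1(source.encode("utf-8")).hexdigest()
        path = os.path.join(CACHE_DIR, key)

        if os.path.exists(path):        # hit: reuse the binary stored by an earlier run
            with open(path, "rb") as f:
                return f.read()

        binary = compile_fn(source)     # miss: compile now (the expensive, stuttering part)
        with open(path, "wb") as f:     # persist so the next run can skip compilation
            f.write(binary)
        return binary
    ```

    A real driver cache would also have to key on things like the driver version and the GPU, so that a stale binary is never reused after an update.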



  • duby229
    replied
    Originally posted by geearf View Post
    12GB starts to get quite steep (I have 16GB, but I feel newer games will probably ask for even more).
    As the following poster said, it'd be great to be able to configure the path of the cache (once it gets pushed to a partition), maybe on a per-process basis.
    I'm pretty sure the disk cache that eventually gets developed for the OSS drivers will be a different type of beast from the disk cache that The Old Republic uses. I just gave that as an example of an approach that hurts performance; if the OSS devs are aware of at least one disk cache that doesn't work well at all, they can keep it in mind as they implement the disk cache for the OSS drivers.



  • plonoma
    replied
    Originally posted by duby229 View Post

    Cool, that means benchmarking options. That is one of the best things the OSS devs offer imo. It means we can all see how the performance turns out.
    Environment variables are a bad way to expose such options.
    Instead of an environment variable, a per-application setting and a per-graphics-driver setting would be better for controlling this kind of cache behaviour (plus a way to determine and set which GPUs are used in benchmarks).
    That would be easier for benchmarking applications to work with.



  • geearf
    replied
    Originally posted by duby229 View Post

    That's pretty much the only way to get playable framerates in Wine with The Old Republic online. It uses a disk cache that needs to be on a tmp drive (which raises the minimum RAM requirement from 6GB to 12GB).
    12GB starts to get quite steep (I have 16GB, but I feel newer games will probably ask for even more).
    As the following poster said, it'd be great to be able to configure the path of the cache (once it gets pushed to a partition), maybe on a per-process basis.



  • liam
    replied
    Originally posted by atomsymbol

    Do you mean latency on a rotational disk, not SSD? I have my home directory (and consequently the shader caches, such as ~/.AMD/GLCache) on an SSD.

    Since the moment I bought an SSD, latency isn't an issue in any application I am running - the CPU speed is now the limiting factor.

    Or can latency of the shader cache be an issue even on an SSD?
    Seek times are STILL an issue with SSDs. If they weren't, you wouldn't see bandwidth plummet when performing random access. Even in the best cases, SSDs still have latency roughly 1000x higher than memory; in fact, they are closer to the access times of a hard disk than to memory.
    Hopefully XPoint will remedy this somewhat.
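    For context, ballpark access latencies (order-of-magnitude figures, not measurements of any particular hardware) bear that out:

    ```python
    # Rough order-of-magnitude latencies in seconds; real hardware varies widely.
    DRAM_REF = 100e-9   # ~100 ns main-memory reference
    SSD_READ = 100e-6   # ~100 us random read on a typical SSD
    HDD_SEEK = 10e-3    # ~10 ms seek plus rotational latency on a hard disk

    print(SSD_READ / DRAM_REF)  # ~1000: an SSD is roughly three orders slower than RAM
    print(HDD_SEEK / SSD_READ)  # ~100: and only about two orders faster than a disk seek
    ```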



  • duby229
    replied
    Originally posted by smitty3268 View Post

    The OSS drivers will almost certainly keep around an environment variable letting you disable the cache whenever you want, for debugging purposes if nothing else.

    There's one already for this radeonsi caching.
    Cool, that means benchmarking options. That is one of the best things the OSS devs offer imo. It means we can all see how the performance turns out.



  • smitty3268
    replied
    Originally posted by duby229 View Post

    That's pretty much the only way to get playable framerates in Wine with The Old Republic online. It uses a disk cache that needs to be on a tmp drive (which raises the minimum RAM requirement from 6GB to 12GB).
    The OSS drivers will almost certainly keep around an environment variable letting you disable the cache whenever you want, for debugging purposes if nothing else.

    There's one already for this radeonsi caching.
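    Drivers usually read that kind of kill switch from the environment at startup. A rough sketch of the pattern, with a made-up variable name (the actual radeonsi/Mesa switch isn't named here and may be spelled differently):

    ```python
    import os

    def shader_cache_enabled():
        # EXAMPLE_DISABLE_SHADER_CACHE is a hypothetical name used only to show the
        # pattern; check the driver documentation for the real variable.
        # Default is "cache on"; the user opts out per run by exporting the variable.
        return os.environ.get("EXAMPLE_DISABLE_SHADER_CACHE", "0") not in ("1", "true", "yes")
    ```

    That's what makes the A/B benchmarking mentioned above cheap: launch the game once with the variable exported and once without, and compare frame times.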



  • duby229
    replied
    Originally posted by geearf View Post
    You could use a tmp drive and sync it back to the HDD on shutdown/restart. (Of course that demands enough RAM, even with a compressed FS...)
    That's pretty much the only way to get playable framerates in Wine with The Old Republic online. It uses a disk cache that needs to be on a tmp drive (which raises the minimum RAM requirement from 6GB to 12GB).
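    A rough sketch of that tmpfs-plus-sync approach in Python, with made-up paths (in practice it would more likely be a tmpfs mount plus an rsync call in a shutdown script or systemd unit):

    ```python
    import os
    import shutil

    # Hypothetical paths: a RAM-backed directory (e.g. on a tmpfs mount such as
    # /dev/shm) holding the hot cache, and a persistent copy that survives reboots.
    RAM_CACHE = "/dev/shm/game-cache"
    DISK_CACHE = os.path.expanduser("~/.cache/game-cache-persistent")

    def restore_cache():
        """At login/boot: copy the persisted cache into the RAM-backed location."""
        if os.path.isdir(DISK_CACHE):
            shutil.copytree(DISK_CACHE, RAM_CACHE, dirs_exist_ok=True)

    def save_cache():
        """At shutdown/logout: copy the RAM-backed cache back to disk."""
        if os.path.isdir(RAM_CACHE):
            shutil.copytree(RAM_CACHE, DISK_CACHE, dirs_exist_ok=True)
    ```

    A plain copy never deletes stale entries, so a real setup would more likely use something like rsync --delete for the sync step.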



  • pal666
    replied
    Originally posted by plonoma View Post
    Not everybody has an SSD. Many people still use rotational disk(s).
    Then they will preload the shaders into the page cache.

