Catalyst 15.7 Brings Multi-Device Support For OpenCL 2.0, Carrizo Improvements


  • #41
    Originally posted by Kano View Post
    It usually does not work with muxless setups. I would be happy if it would work the same as nvidia optimus in that case.
    Hasn't Optimus on Linux been totally broken for years? I don't know firsthand, but I've read a lot of complaints about it.

    Comment


    • #42
      @xeekei

      All I meant is that in the general use case you don't need to OC your card, but did you try setting CoolBits to 8 (or even 28 for the maximum settings)? You should know the settings you used from your other OS; if you need overvoltage, you should be really careful. 144 Hz monitors are usually used without vsync (otherwise the rate would most likely be locked to 72 Hz when the fps drops). But show me a benchmark where you really need the extra fps you gain; it would matter more if you use a 60 Hz monitor with vsync and the game you play occasionally drops below 60 fps without an OC.

      ftp://download.nvidia.com/XFree86/Li...igoptions.html
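      For reference, the CoolBits values mentioned above are set in the Device section of xorg.conf. A minimal sketch (the Identifier is a placeholder; the option name and values are from the NVIDIA driver documentation, where 8 enables clock offsets and 28 = 4+8+16 also adds manual fan and overvoltage control):

      ```
      Section "Device"
          Identifier "nvidia-gpu"        # placeholder name
          Driver     "nvidia"
          Option     "Coolbits" "28"     # 8 = clock offsets only; 28 = fan + clocks + overvoltage
      EndSection
      ```

      After restarting X, the extra clock and fan controls appear in nvidia-settings.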

      Comment


      • #43
        @duby229

        If you use Bumblebee you never get full speed. You need the "xrandr --setprovideroutputsource" approach, but that means the discrete GPU is always active. I wrote a script with an option to disable it via bbswitch when not in use (needs a reboot); it may not work on all systems, but it is much simpler than the Ubuntu switching tool, which is integrated into nvidia-settings/lightdm (needs a logout). On Ubuntu you just need to install the nvidia-prime package to use Optimus.
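        For anyone wanting to try that approach by hand rather than via a script, the basic command sequence looks roughly like this. The provider names below are examples; check the output of `xrandr --listproviders` for the real names on your system, and note the bbswitch step only works while the NVIDIA driver is not loaded:

        ```shell
        # See which providers X knows about; names vary per system
        xrandr --listproviders

        # Have the NVIDIA GPU render for outputs driven by the integrated GPU
        # (first argument = sink provider, second = source; substitute your names)
        xrandr --setprovideroutputsource modesetting NVIDIA-0
        xrandr --auto

        # Power the discrete GPU off when unused (requires the bbswitch module)
        echo OFF | sudo tee /proc/acpi/bbswitch
        ```

        These commands are hardware- and driver-dependent, so treat them as a sketch of the mechanism rather than a drop-in recipe.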

        Comment


        • #44
          Originally posted by djdoo View Post
          Big performance improvements here guys!
          I finally installed X Server 1.17; sadly I had to install kernel 3.19.8 in order not to get messed up with patches and crap (I used to use the stock kernel 3.16.7 from my openSUSE 13.2 distro).
          OpenGL version 4.5.13397
          Radeon R9 285 with 2GB VRAM here and an old Phenom II 955 X4. It is the first time I have seen 76 FPS in the Unigine Valley bench with Ultra settings, 1920x1080 resolution, 4xAA, 16x anisotropy. Overall score 1463 with these settings, whereas with 15.5 I got less than 1000 and with 14.12 less than 700. Thanks AMD! Even if it came late...
          If only we could use kernels 4+...
          Wow, that is approaching NVIDIA-class performance. Here, with the same settings, I get a score of 1369 on a GTX 660. My GTX 970 and GTX 980 Ti will eat that for breakfast, of course, but it's good to see that AMD is finally getting into the same league. Competition is a good thing!
          FPS: 32.7
          Score: 1369
          Min FPS: 21.2
          Max FPS: 57.5
          Platform: Linux 3.19.0-22-generic x86_64
          CPU model: Intel(R) Core(TM) i7-4790 CPU @ 3.60GHz (3591MHz) x8
          GPU model: GeForce GTX 660 PCI Express 346.59 (2048MB) x1
          Render: OpenGL
          Mode: 1920x1080 4xAA fullscreen
          Preset: Custom
          Quality: Ultra

          Comment


          • #45
            Originally posted by Ardje View Post
            Do you mean you can actually create the Debian packages again? That stopped somewhere around squeeze.
            As far as I can see, the Debian packages are now in the repo:

            https://packages.qa.debian.org/f/fgl...9T191946Z.html

            Comment
