(23.07.) AMD Catalyst™ OpenGL 4.3 Beta Driver


  • #16
    I need to file a bug report with AMD.
    I'm just wondering if anyone else is noticing the same thing here with L4D2 (beta):
    13.6 beta: brightness adjustment does not work
    13.15* beta: brightness adjustment does not work

    However, the default brightness is higher on the 13.6 beta, while the newest beta driver is essentially unplayable (too dark).



    • #17
      Is this "OpenGL" driver supposed to be an addition to or a replacement for the regular Catalyst driver? If it's a replacement, what does it lack compared to the regular Catalyst driver?



      • #18
        Originally posted by Forage View Post
        Is this "OpenGL" driver supposed to be an addition to or a replacement for the regular Catalyst driver? If it's a replacement, what does it lack compared to the regular Catalyst driver?
        No, it's a replacement for it; blame AMD's silly driver naming. :P



        • #19
          Originally posted by vyrgozunqk View Post
          Yes, it's perfect now!
          I am so freaking happy with this beta driver.

          All issues are gone, and the performance boost with Serious Sam 3 is awesome.

          Before this beta I used the latest stable driver.

          Now the intro video is about 99.9% perfect.
          I had to lower the mouse sensitivity.

          I changed the in-game resolution back to my desktop default
          (from 1280×800 to 1680×1050).

          Screen tearing in Firefox is gone.

          AMD HD 5750 with 1 GB of VRAM, AMD Phenom II X4 at stock 3.2 GHz, 4 GB of RAM, openSUSE 12.3 64-bit.

          No more stutter with explosions.

          I now need a benchmark to tell how this compares to DirectX on Windows.
          Before this beta, nobody needed a benchmark to tell it was much worse.
          Last edited by Gps4l; 07-26-2013, 12:29 PM.



          • #20
            Originally posted by Calinou View Post
            No, it's a replacement for it; blame AMD's silly driver naming. :P
            They confused me with this too.

            I installed:

            amd-catalyst-13.15.100.1-x86.x86_64.run



            • #21
              Anyone else having issues with this and KWin? On Arch I have 4.11 RC2, and after installing the beta drivers the corners of my windows sometimes disappear (basically the min/max/close buttons on the right, and the app icon/sticky button on the left). I also noticed that sometimes the shadow/glowing border around a window renders only on the top and bottom (and around the corners), while the sides are missing it. I have it pretty much at the defaults: OpenGL 2.0, Qt Raster, 'Accurate' for the scale method, and VSync set to automatic (this seems new in 4.11).



              • #22
                Patch for kernel 3.10

                Here is a patch for kernel 3.10:

                https://github.com/kolasa/fglrx-13.15.100.1

                I use it on Debian Testing 64-bit with kernel 3.10.2 (self-compiled).
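
                For anyone curious what such patches typically have to change: kernel 3.10 removed create_proc_entry(), which fglrx still used, so procfs files now have to be created with proc_create() and a file_operations table. The module below is a hypothetical, self-contained illustration of that API migration, not the actual fglrx patch (see the GitHub link above for that).

                Code:
                /* Illustrative only: shows the 3.10 procfs API change that
                 * patches for out-of-tree modules like fglrx deal with. */
                #include <linux/module.h>
                #include <linux/proc_fs.h>
                #include <linux/seq_file.h>

                static int demo_show(struct seq_file *m, void *v)
                {
                        seq_puts(m, "hello from 3.10\n");
                        return 0;
                }

                static int demo_open(struct inode *inode, struct file *file)
                {
                        return single_open(file, demo_show, NULL);
                }

                static const struct file_operations demo_fops = {
                        .owner   = THIS_MODULE,
                        .open    = demo_open,
                        .read    = seq_read,
                        .llseek  = seq_lseek,
                        .release = single_release,
                };

                static int __init demo_init(void)
                {
                        /* pre-3.10 code called create_proc_entry() here;
                         * that symbol no longer exists in 3.10 */
                        return proc_create("demo", 0444, NULL, &demo_fops)
                                ? 0 : -ENOMEM;
                }

                static void __exit demo_exit(void)
                {
                        remove_proc_entry("demo", NULL);
                }

                module_init(demo_init);
                module_exit(demo_exit);
                MODULE_LICENSE("GPL");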



                • #23
                  I installed this driver, and any attempt to run OpenGL locks up X.

                  I think it is related to my multi-monitor setup.

                  Reverting to the older driver works.



                  • #24
                    Yes, there is something wrong with this driver: it is the first driver that actually locks up X and then the whole machine when using advanced OpenGL apps (such as TF2) or XvBA hardware decoding with XBMC.
                    The symptoms were increasingly heavy HDD activity and a laggy cursor that eventually froze; I had to force-reboot the computer. It became completely unresponsive, and not even the SysRq key combinations worked.
                    TF2 actually worked the second time, but the issues appeared when I wanted to quit. Once I did manage to kill it, but then I got a kernel oops.
                    None of this happens with the 13.6 beta driver.
                    Simpler games such as Urban Terror and Half-Life 1 engine games (via Steam) have no issues.

                    I have an A8-5500 and use Debian Testing 64-bit with kernel 3.10.4.



                    • #25
                      This is not an attack on anybody, but a serious question.

                      I see a lot of people without a dedicated graphics card, and I can't understand why.

                      It might change one day, but integrated graphics and gaming have never been a good combination.



                      • #26
                        Originally posted by Gps4l View Post
                        This is not an attack on anybody, but a serious question.

                        I see a lot of people without a dedicated graphics card, and I can't understand why.

                        It might change one day, but integrated graphics and gaming have never been a good combination.
                        Why is it a bad idea, if I may ask?
                        I want to play HL1/HL2-based Steam games and open-source games, which run just fine on my A8-5500. I don't need another card sucking more power when I already have what I need.

                        PS: The 7560D has been very stable with every version of fglrx since 13.1 (the first driver I installed on this rig); only this driver managed the feat of locking up the computer.
                        Also, I previously had an Nvidia 8200 (which is much slower than the 7560D), and that was fine too for lighter games like Urban Terror (I even played LOTRO on it via Wine).



                        • #27
                          Of course you may ask.

                          Modern games like Crysis 3 bring even high-end PCs with a dedicated video card to their knees.
                          And those don't have to share system resources.

                          In every benchmark I have seen so far, integrated graphics perform worse than almost every dedicated video card,
                          even the cheaper ones.

                          Maybe we are thinking of different games.

                          But if we look at Serious Sam 3, even my HD 5750 with 1 GB of VRAM can't handle it with every setting maxed out.



                          • #28
                            Ah, but the sharing is actually becoming an advantage. With one memory space, a texture upload turns from a slow copy to an instant operation.
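
                            To make that concrete, here is a rough sketch of a texture upload through a pixel buffer object, assuming a context with GL 3.0 / ARB_map_buffer_range (GLEW and GLFW appear only for setup, and error handling is omitted). On a discrete card the unmapped data still gets copied across PCIe; on an APU with one memory space the driver can hand back a pointer into the very memory the GPU will sample from, so the final glTexImage2D is little more than bookkeeping.

                            Code:
                            #include <GL/glew.h>
                            #include <GLFW/glfw3.h>
                            #include <string.h>

                            #define W 512
                            #define H 512

                            int main(void)
                            {
                                glfwInit();
                                GLFWwindow *win = glfwCreateWindow(W, H, "pbo", NULL, NULL);
                                glfwMakeContextCurrent(win);
                                glewInit();

                                /* Allocate a driver-owned pixel unpack buffer. */
                                GLuint pbo, tex;
                                glGenBuffers(1, &pbo);
                                glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
                                glBufferData(GL_PIXEL_UNPACK_BUFFER, W * H * 4,
                                             NULL, GL_STREAM_DRAW);

                                /* Write pixels straight into the mapped buffer. */
                                void *p = glMapBufferRange(
                                    GL_PIXEL_UNPACK_BUFFER, 0, W * H * 4,
                                    GL_MAP_WRITE_BIT | GL_MAP_INVALIDATE_BUFFER_BIT);
                                memset(p, 0x80, W * H * 4); /* stand-in pixel data */
                                glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);

                                /* Texture sources from the bound PBO (offset 0),
                                 * not from client memory. */
                                glGenTextures(1, &tex);
                                glBindTexture(GL_TEXTURE_2D, tex);
                                glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, W, H, 0,
                                             GL_RGBA, GL_UNSIGNED_BYTE, (void *)0);

                                glfwDestroyWindow(win);
                                glfwTerminate();
                                return 0;
                            }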

                            The current high-end APUs already exceed low-end discrete cards of the same generation, bottlenecked only by RAM bandwidth. Sony sidestepped this in the PS4, and it will be addressed in mainstream APUs with the introduction of DDR4.
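
                            A back-of-envelope calculation shows the size of that bandwidth gap; the figures below are the commonly published theoretical peaks (peak = bus width in bytes × transfer rate), not my own measurements.

                            Code:
                            #include <stdio.h>

                            int main(void)
                            {
                                /* Dual-channel DDR3-1600: 2 ch x 8 bytes x 1600 MT/s */
                                double ddr3 = 2 * 8 * 1600e6 / 1e9;
                                /* PS4 GDDR5: 256-bit bus (32 bytes) x 5500 MT/s */
                                double gddr5 = 32 * 5500e6 / 1e9;
                                printf("DDR3-1600 dual channel: %6.1f GB/s\n", ddr3);
                                printf("PS4 GDDR5:              %6.1f GB/s\n", gddr5);
                                return 0;
                            }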


                            By then it's only the power and cooling that give discrete cards an advantage - an APU will easily match a high-end card, leaving only the power-guzzling Titans of 300+ W TDP to surpass them.



                            • #29
                              I have no idea how good this stress test is for benchmarking.
                              I used the video stress test of Half-Life 2: Lost Coast.

                              Both at 1680×1050.

                              openSUSE 12.3 64-bit vs Windows 7 Pro 32-bit
                              OpenGL vs (presumably) DirectX

                              188 fps vs 236 fps.

                              On openSUSE I used the latest OpenGL 4.3 beta driver.

                              On Windows it looks a bit more fluid, but it does not stutter on openSUSE.
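
                              That difference in fluidity is exactly what an average fps number hides, so a benchmark should also log per-frame times. A minimal sketch; render_frame() here is just a placeholder for the game's real draw-and-swap call:

                              Code:
                              /* Report min/avg/max frame time; max spikes = stutter. */
                              #include <stdio.h>
                              #include <time.h>

                              #define FRAMES 1000

                              static void render_frame(void)
                              {
                                  /* placeholder workload, roughly 5 ms per frame */
                                  struct timespec ts = {0, 5 * 1000 * 1000};
                                  nanosleep(&ts, NULL);
                              }

                              static double now_ms(void)
                              {
                                  struct timespec ts;
                                  clock_gettime(CLOCK_MONOTONIC, &ts);
                                  return ts.tv_sec * 1000.0 + ts.tv_nsec / 1e6;
                              }

                              int main(void)
                              {
                                  double min = 1e9, max = 0, sum = 0;
                                  for (int i = 0; i < FRAMES; i++) {
                                      double t0 = now_ms();
                                      render_frame();
                                      double dt = now_ms() - t0;
                                      if (dt < min) min = dt;
                                      if (dt > max) max = dt;
                                      sum += dt;
                                  }
                                  printf("avg %.2f ms (%.0f fps), min %.2f ms, max %.2f ms\n",
                                         sum / FRAMES, 1000.0 * FRAMES / sum, min, max);
                                  return 0;
                              }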

                              AMD Phenom II X4 at 3.2 GHz, 4 GB of RAM, HD 5750 with 1 GB of VRAM.
                              Last edited by Gps4l; 07-30-2013, 07:09 PM.



                              • #30
                                Originally posted by curaga View Post
                                Ah, but the sharing is actually becoming an advantage. With one memory space, a texture upload turns from a slow copy to an instant operation.

                                The current high-end APUs already exceed low-end discrete cards of the same generation, bottlenecked only by RAM bandwidth. Sony sidestepped this in the PS4, and it will be addressed in mainstream APUs with the introduction of DDR4.


                                By then it's only the power and cooling that give discrete cards an advantage - an APU will easily match a high-end card, leaving only the power-guzzling Titans of 300+ W TDP to surpass them.
                                I am skeptical about this, but of course I wouldn't mind.

                                http://nl.hardware.info/productinfo/...:specificaties

                                GDDR5 memory.

                                As far as I understand it, shouldn't they be using GDDR5 rather than DDR4?

