AMD Crimson Linux Release Notes: Glxgears Stuttering Fixed


  • #11
    Originally posted by AnonymousCoward View Post
    AMD chose a bad time to drop support for older cards on Linux. They could have at least waited for Ubuntu 16.04. It would be an LTS release, so you would get a five-year-supported distro where you could install the last fglrx driver.
    LTS and five years of support are on Ubuntu's side; they will manage to make your card work.
    AMD does not care about it; basically they "give you the opportunity of an upgrade".



    • #12
      Originally posted by Passso View Post

      LTS and five years of support are on Ubuntu's side; they will manage to make your card work.
      AMD does not care about it; basically they "give you the opportunity of an upgrade".
      You mean a downgrade, right? GCN is still just a cheap CUDA architecture copy. Except without CUDA...

      You may as well just buy the original.
      Last edited by eydee; 24 November 2015, 01:15 PM.



      • #13
        Originally posted by AnonymousCoward View Post
        AMD chose a bad time to drop support for older cards on Linux. They could have at least waited for Ubuntu 16.04. It would be an LTS release, so you would get a five-year-supported distro where you could install the last fglrx driver.
        They waited until near the end of the year; many of us already know about this ATi-then-AMD three-year driver-drop cadence. To me this is expected; anything other than that would be unusual.

        Whatever happens on Earth, ATi/AMD has dropped support for its oldest-gen chips every three years.

        The next comet is expected around 2018. It is always unofficial, of course, but history shows us exactly that.
        Last edited by dungeon; 24 November 2015, 01:27 PM.



        • #14
          Originally posted by eydee View Post

          You mean a downgrade, right? GCN is still just a cheap CUDA architecture copy. Except without CUDA...

          You may as well just buy the original.
          Bad joke. Tests at our physics department show that Nvidia's current cards are very slow when doing double-precision calculations and non-local memory access.



          • #15
            OK, those are John Walker's live changelogs. Now Ubuntu 14.04.3 and Ubuntu 15.10 are advertised as supported.

            Maybe we will have the complete changes tomorrow.
            Last edited by dungeon; 24 November 2015, 01:39 PM.



            • #16
              Originally posted by dcrdev View Post
              Can anyone confirm whether this works when compiled with gcc5 under kernel 4.2.x?
              Tried on Ubuntu 14.04 / GCC 4.9.3 / Linux 4.2: doesn't work; when the system boots, the monitor switches to power-save mode (off).
              Ubuntu 14.04 / GCC 4.8.4 / Linux 3.19: works as expected.
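              For anyone comparing results, a quick way to record the toolchain and kernel before attempting the build (a generic shell sketch, not an AMD-documented procedure; the header path shown is the usual DKMS build-tree location):

              ```shell
              # Record the compiler and kernel versions the fglrx DKMS module would build against.
              gcc --version | head -n 1
              uname -r

              # fglrx builds against the running kernel's headers; confirm the build tree is installed.
              if [ -d "/lib/modules/$(uname -r)/build" ]; then
                  echo "kernel headers present"
              else
                  echo "kernel headers missing"
              fi
              ```

              Posting this output alongside "works"/"doesn't work" reports makes the GCC 4.x vs 5.x and kernel 3.19 vs 4.2 comparisons unambiguous.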



              • #17
                Originally posted by Scellow View Post
                • Xorg/Xserver 7.4 and above (up to 1.17)
                • Linux kernel 2.6 or above (up to 3.19)


                Hahahahahah, you can go fuck yourself, AMD. Never, never, I'll never buy your card again, and I'll make sure all my family will boycott all of your products.
                Use the open driver that comes with the kernel?
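                A quick way to see whether the in-kernel open driver is actually in use (a sketch assuming `lsmod` is available; `radeon` covers the cards being dropped here, `amdgpu` is included for completeness):

                ```shell
                # List loaded kernel modules and look for the open AMD display drivers.
                if lsmod 2>/dev/null | grep -qE '^(radeon|amdgpu) '; then
                    echo "open driver loaded"
                else
                    echo "open driver not loaded"
                fi
                ```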



                • #18
                  I just hope they fixed the horrible bottlenecks on Linux that always keep them behind the competition. They can't seem to push out frames as fast as Nvidia.



                  • #19
                    Originally posted by Scellow View Post
                    • Xorg/Xserver 7.4 and above (up to 1.17)
                    • Linux kernel 2.6 or above (up to 3.19)


                    Hahahahahah, you can go fuck yourself, AMD. Never, never, I'll never buy your card again, and I'll make sure all my family will boycott all of your products.
                    The day AMD dies is the day I will switch to ARM.

                    Also, Nvidia told me to fuck myself for wanting virtualization and OpenGL games, so I don't buy their crap.



                    • #20
                      Originally posted by oleid View Post

                      Bad joke. Tests at our physics department show that Nvidia's current cards are very slow when doing double-precision calculations and non-local memory access.
                      Maxwell is limited.

                      Nvidia did not release any serious HPC/pro GPUs on it.

                      It's brilliant at 32-bit computations, DX11 style, because all that ripped-out stuff freed space that can now be devoted to something else...

                      But otherwise Nvidia is betting on the upcoming Pascal to do the trick for the compute/HPC/pro markets.
