Woah, AMD Releases OpenGL 4.0 Linux Support!


  • #31
    *** Brain Error Log ***
    Tessellation support still missing, WTF? Unigine Heaven 2.0 with OpenGL and tessellation not available... me confused!
    *** End Of Brain Error Log ***

    OK, good to hear that AMD has OpenGL 3.3/4.0 support before nVidia, but I would still like to know what is happening with AMD tessellation for OpenGL 3.2. I would be really glad to see the Unigine Heaven 2.0 benchmark running with tessellation on my Linux box...



    • #32
      Tessellation is supported by the driver, but it performs poorly in 10.1 and the 10.4 beta (no idea how this new driver performs, though). The Heaven benchmark simply doesn't use tessellation by default; you can enable it in the config file in the data folder. In fact, even 3xxx and 4xxx cards can use tessellation in OpenGL.



      • #33
        With OpenGL 4 it would be possible to use tessellation levels up to 64 instead of 15, but for that Unigine would have to update the benchmark.
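
        With a GL 4.0 context you can also just ask the driver what it exposes; a rough C sketch (assuming a current context and a header/loader, e.g. GLEW, that defines the enum):

        Code:
        #include <stdio.h>
        #include <GL/glew.h>   /* any loader that defines GL_MAX_TESS_GEN_LEVEL */

        /* GL 4.0 requires GL_MAX_TESS_GEN_LEVEL to be at least 64;
           the old fixed-function ATI tessellator topped out at a factor of 15. */
        GLint max_tess_level = 0;
        glGetIntegerv(GL_MAX_TESS_GEN_LEVEL, &max_tess_level);
        printf("GL_MAX_TESS_GEN_LEVEL = %d\n", max_tess_level);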



        • #34
          Oh, good to hear that! Thanks for the info, guys! Oh, by the way, do nVidia's drivers have support for tessellation too?



          • #35
            Originally posted by kUrb1a View Post
            I started the post with hate and despair. It took me a while to understand that this is actually good news. :P
            ME TOO!!!

            THIS IS REALLY GREAT NEWS!!! The news I have been waiting for for such a long time!!
            TODAY AMD REALLY MADE MY DAY!! GOD BLESS!!!

            I'M SOOOO HAPPY!!!



            • #36
              Originally posted by dopehouse View Post
              I noticed a small performance boost in Heroes of Newerth. With the 10-2 driver and earlier I got small lags, but with this driver everything runs very smoothly.
              I tried it with Nexuiz, and my frame rate went from 18 to 30 FPS at max settings. Everything seems quite a bit speedier.



              • #37
                Originally posted by Setlec View Post
                *** Brain Error Log ***
                Tessellation support still missing, WTF? Unigine Heaven 2.0 with OpenGL and tessellation not available... me confused!
                *** End Of Brain Error Log ***

                OK, good to hear that AMD has OpenGL 3.3/4.0 support before nVidia, but I would still like to know what is happening with AMD tessellation for OpenGL 3.2. I would be really glad to see the Unigine Heaven 2.0 benchmark running with tessellation on my Linux box...
                Tessellation in OpenGL is currently only supported by ATI. The hardware has existed since the Radeon HD 2xxx series, but it has been largely ignored. From my tests on my Radeon HD 4650, performance is pretty poor; I am not sure whether that is due to the hardware being slow, unoptimized drivers, or both. The point being, if you have an ATI card from the HD 2xxx series or newer and plan on using OpenGL, you can use tessellation. It would be interesting to see how fast OpenGL tessellation is in Windows...
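
                If you want to see what your own driver actually exposes, a quick C sketch (assuming a GL context is already current; the classic extension-string query is fine on these older drivers):

                Code:
                #include <stdio.h>
                #include <string.h>
                #include <GL/gl.h>   /* plus a context created via GLX/SDL/etc. beforehand */

                /* Check for the pre-GL4 AMD tessellator extension and for the ARB
                   extension that OpenGL 4.0 made core. */
                const char *ext = (const char *)glGetString(GL_EXTENSIONS);
                int has_amd = ext && strstr(ext, "GL_AMD_vertex_shader_tessellator") != NULL;
                int has_arb = ext && strstr(ext, "GL_ARB_tessellation_shader") != NULL;
                printf("AMD tessellator: %s, ARB tessellation: %s\n",
                       has_amd ? "yes" : "no",
                       has_arb ? "yes" : "no");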



                • #38
                  Originally posted by kUrb1a View Post
                  First!!!
                  Can someone check whether this version is xserver 1.7 compatible? And I know that it is based on 10.3..., soooo... still... tell me.
                  Very sorry for any rude language to follow, but I'm ever so slightly inebriated.
                  ATI for the first time ever delivered OpenGL support before nVidia, and you're b****ing about xserver support?
                  While I understand that it is of course annoying not being able to use the driver, I think if you have any business using fglrx, the very first thing in your comment shouldn't have been about xserver. It would be like, in an armistice, first asking yourself what you're going to do with all your weapons.
                  </rant>
                  Sorry, but this is too momentous a piece of news to let someone poop on it. Now I'm cool, bro.
                  Maybe Valve's move to OS X and OpenGL is putting some fire under ATI's OpenGL team. Valve has always been buddies with ATI.



                  • #39
                    Well, ATI may be first this time, but tessellation has a huge performance impact, up to 75%! Let's see how Heaven with OpenGL 4 support will perform.



                    • #40
                      I hate to be Mr. Negative around here, and I know this is a beta driver, but...

                      Since the driver doesn't work on Lucid, I popped in an old Ubuntu 8.10 Live CD I had lying around and installed the driver in there. All the Unigine benchmarks give a black screen and spam the console with shader compiler errors.

                      Code:
                      Fragment shader failed to compile with the following errors:
                      ERROR: 0:313: error(#132) Syntax error: 'sample' parse error
                      ERROR: error(#273) 1 compilation errors.  No code generated
                      Has anyone else tested this driver with any of the Unigine benchmarks on an Evergreen card?
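
                      No Evergreen here, but my guess at the cause (could be completely wrong): 'sample' became a reserved interpolation qualifier in GLSL 4.00, so a shader that uses it as an ordinary variable name stops parsing on the new compiler. A made-up fragment just to show the shape of the clash, not Unigine's actual source:

                      Code:
                      /* Hypothetical GLSL, embedded as a C string. 'sample' was a legal
                         identifier in older GLSL, but it is a reserved interpolation
                         qualifier under 4.00 rules, so the line below fails to parse. */
                      const char *frag_src =
                          "#version 400\n"
                          "uniform sampler2D tex;\n"
                          "in  vec2 uv;\n"
                          "out vec4 color;\n"
                          "void main() {\n"
                          "    float sample = texture(tex, uv).r; /* parse error on 'sample' */\n"
                          "    color = vec4(sample);\n"
                          "}\n";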



                      • #41
                        It has yet to be determined whether ATI's 57xx and lower series cards can actually be considered OpenGL 4.0 compliant, because the spec requires 64-bit floats, which only the double-precision-capable 58xx and higher cards can handle.



                        • #42
                          Originally posted by R3MF View Post
                          It has yet to be determined whether ATI's 57xx and lower series cards can actually be considered OpenGL 4.0 compliant, because the spec requires 64-bit floats, which only the double-precision-capable 58xx and higher cards can handle.
                          Allow me to allay your fears.
                          http://www.geeks3d.com/20100317/rade...ations-on-gpu/

                          Plus, to anyone wondering, the tessellation unit in the r6xx derivatives will not be good enough for GL_ARB_tessellation_shader. It uses a completely different tessellation algorithm (odd, rather than the even one DX11 uses) and sits in the wrong part of the pipeline. GL_AMD_vertex_shader_tessellator also didn't expose all of the functionality exposed in their DX driver; it lacks the rather crucial ability to set a per-edge tessellation factor.
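
                          To make the "per edge" part concrete, this is roughly what the ARB path lets you write in a tessellation control shader (illustrative GLSL as a C string, not from any real engine):

                          Code:
                          /* Minimal tessellation control shader: one outer tessellation factor
                             per triangle edge (the per-edge control the old AMD extension
                             lacked) plus an inner factor. */
                          const char *tcs_src =
                              "#version 400\n"
                              "layout(vertices = 3) out;\n"
                              "void main() {\n"
                              "    gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;\n"
                              "    if (gl_InvocationID == 0) {\n"
                              "        gl_TessLevelOuter[0] = 4.0;   /* edge 0 */\n"
                              "        gl_TessLevelOuter[1] = 8.0;   /* edge 1 */\n"
                              "        gl_TessLevelOuter[2] = 16.0;  /* edge 2 */\n"
                              "        gl_TessLevelInner[0] = 8.0;\n"
                              "    }\n"
                              "}\n";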



                          • #43
                            Originally posted by koenvdd View Post
                            Maybe Valve's move to OS X and OpenGL is putting some fire under ATI's OpenGL team. Valve has always been buddies with ATI.
                            ATI's Mac OS X drivers are "firegl-class" and their professional OpenGL support is excellent. I doubt that the ATI OpenGL team is smoking the weed too much.

                            As I've said in another post, the problem revolves around Linux's video/3D model and the 'diversity' of the system. DRI will take care of a lot of these problems, along with the X-Server rework.

                            If you ask me, with all the diversity and forks in the Linux/OSS world, one wonders why there is only "one" X-Server... The problem is that the video drivers, the graphics server, and GPU offloading (an umbrella term for delegating executable code, whether for 3D or parallel computing, to another execution unit) are tightly coupled in a very unfortunate way. From my experience, in the long term this is the real problem. The X-Server & co. were designed to work with a 'framebuffer'. Now we are in a position where video/3D/specialized computation is literally offloaded into another address space, on another execution unit, with an architecture alien to the one the kernel and user land run on.

                            I can't go into details because of NDAs and the like, but we have a "video card" here (we nicknamed them Voodoo X) that is a quad-socket, quad-core Opteron machine with 2 video cards (FireGL or Quadro), connected to the rendering controller by a quad 10GBASE-SR interface (2 direct, 2 indirect through a controller/distributor). We have our own Linux kernel stacks on both ends and even a modified X (for internal use only), and it shows badly that the whole stack was designed in a time when graphics cards sat in ISA slots and the top dog was an ATi made in Canada with 512k (half a meg!), which I still have at home, bolted to a wall...

                            If it were up to me, I'd make OpenGL mandatory for the X-Server, make the whole "widget" drawing SVG-based and offload it to the GPU, build the graphics libraries that GTK & co. depend on on top of that, and put the "compositor" for video and 3D (also delegated to the GPU) in the kernel, maybe separated into kernel-mode "drivers" for 2D (SVG/raster + compositing), 3D and video, with one or more user-space compositors in the X-Server. Let the "master" compositors in user space delegate requests to a multitude of kernel-space compositors. That would make it easy, at a higher level of abstraction, to turn off physical cards, move execution from a dedicated card to an on-chip one for power saving, or shuffle execution from one unit to another; for example, when the more powerful unit is idling away rendering SVG icons while an HD video stream is waiting to be decoded, you could move the light load to the IGP and let the main GPU decode the video without "context switching" (so to speak), which is problematic.

                            Oh crap, I've already given up too many details...



                            • #44
                              Originally posted by LinuxID10T View Post
                              I tried it with Nexuiz, and my frame rate went from 18 to 30 FPS at max settings. Everything seems quite a bit speedier.
                              I see the same thing here. Tested on HoN and UrT.



                              • #45
                                Originally posted by brouhaha View Post
                                As I understand it, AMD's plan is to release 1.7-compatible drivers as soon as all of the major distributions have upgraded to 1.8.
                                Made my day.

                                On the other hand, that leaves the people who keep up with the xorg-server version as testers for the open source drivers...

