R600 Open-Source Driver With GLSL, OpenGL 2.0


  • Originally posted by Qaridarium View Post
That's wrong! Wine wins tons of benchmarks!
Wine wins on 3DMark2000 and 3DMark2001!!!
These do not use HLSL. They are entirely different beasts.

You have a wrong understanding of the HLSL-to-GLSL bridge:
there is no need to translate it all the time!

Only the game starts up more slowly!

After that, the completely translated GLSL code is loaded onto the card and runs nonstop.
In theory there is no speed loss, and you can even do optimizations...
you can handle DX8 code in DX10/DX11 style...
You translate the code once, when the shader is compiled, but the translated code carries runtime overhead. In the worst case, the extra instructions can exceed the capabilities of your card, meaning the resulting code will not run, or will fall back to software emulation.

Meaning you might need a newer card to run old code through Wine, where an older card would have sufficed with native D3D.
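
To illustrate what that overhead looks like: D3D and OpenGL disagree on window- and clip-space conventions, so a translator typically has to append fixup arithmetic that the original HLSL never contained. A minimal sketch in C (the posFixup uniform and the exact arithmetic are illustrative assumptions, not Wine's verbatim output):

Code:
/* Illustrative sketch only -- not Wine's actual output. D3D maps
 * depth to [0,1] while GL uses [-1,1], and D3D9 samples pixels at
 * a half-pixel offset, so the translated GLSL vertex shader ends
 * up with extra instructions after the original shader body: */
static const char *translated_vs =
    "uniform vec4 posFixup;  /* hypothetical fixup constants */\n"
    "void main()\n"
    "{\n"
    "    /* ...translated body computes gl_Position... */\n"
    "    /* instructions the original HLSL never had: */\n"
    "    gl_Position.y   = gl_Position.y * posFixup.y;\n"
    "    gl_Position.xy += posFixup.zw * gl_Position.ww;\n"
    "    gl_Position.z   = gl_Position.z * 2.0 - gl_Position.w;\n"
    "}\n";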

A DX9-based game runs well on an X1950... but the same game loses on this card under Wine...
yet a much slower card like the 4350 or 54xx can "win".
That's because Wine translates the old code into new OpenGL 3.2-style code...

Much better texture compression saves memory bandwidth and brings more FPS!
    Win in support yes (see above). Win in speed not really, at least not with these specific cards you quoted.

what da fu.k?????

"EXT_geometry_shader" is an Nvidia-only extension, but OpenGL 3.2 does not need it for that, because OpenGL 3.2 has geometry shaders in core!
    Oh, please.

    Code:
    $ glxinfo
    [...]
    OpenGL renderer string: ATI Radeon HD 4800 Series
    OpenGL version string: 3.2.9232
    [...]
    , GL_EXT_geometry_shader4,
    This is on my Ati 4850 with 9.12 drivers.
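
If anyone wants to check this programmatically instead of grepping glxinfo, here is a minimal sketch using the GL 3.x extension query (it assumes a context is already current, and ignores the function-pointer loading most platforms need for glGetStringi):

Code:
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Returns 1 if the named extension is advertised by the driver. */
static int has_extension(const char *name)
{
    GLint i, n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (i = 0; i < n; i++)
        if (strcmp((const char *)glGetStringi(GL_EXTENSIONS, i), name) == 0)
            return 1;
    return 0;
}

/* has_extension("GL_EXT_geometry_shader4") returns 1 here, too. */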

You can also emulate a "tessellation shader", thanks to the AMD OpenGL extensions! ...
DX11-level tessellation works differently than ATI's DX10-level tessellation hardware. It's close but not identical, and all discussions I've read on this indicate that these extensions can't be used to emulate DX11-level tessellation. Feel free to prove me wrong, though.

You don't get the point of Wine...... Wine isn't an emulator...

There is no emulator!......

Wine also does not emulate HLSL shader code... Wine is a compiler!
Wine is a shader compiler: it compiles old shaders into new-style shaders,
it compiles HLSL shaders into GLSL shaders....

There is no emulator! Native hardware speed! NO emulator!
Yeah, right: Wine is not an emulator because it recompiles HLSL code to GLSL. I guess pcsx2 is not an emulator either, then? Hey, it recompiles MIPS code into x86!



    • Originally posted by BlackStar View Post
These do not use HLSL. They are entirely different beasts.
Why does Wine win, then??? By over 40%???




      Originally posted by BlackStar View Post
      You translate the code once, when the shader is compiled, but this translation has runtime overhead. In the worst case, the overhead can overflow the capabilities of your card, meaning the resulting code will not run or will fall back to software emulation.
If your card can handle the output of the compiler, the result can also be faster!
Wine is up to 50% faster than Vista in WoW (DirectX 9) on the same hardware!



      Originally posted by BlackStar View Post
      Meaning you might need a newer card to run old code through wine, when an older card would have sufficed in native D3D.
You only need the new card for the new extensions...

You do not need a faster card...

DX11 hardware, for example, can get more FPS with less memory bandwidth, purely because of the extremely good texture compression!
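
For what it's worth, the bandwidth argument is about compressed texture formats staying compressed in video memory. A minimal sketch of uploading pre-compressed DXT5/S3TC data (assumes GL_EXT_texture_compression_s3tc and data already in block-compressed form; upload_dxt5 is a sketch name):

Code:
#include <GL/gl.h>
#include <GL/glext.h>

/* DXT5 stores a 4x4 RGBA block in 16 bytes (4:1 vs. RGBA8), and the
 * GPU samples it compressed -- that is where the bandwidth saving
 * comes from. */
static void upload_dxt5(GLuint tex, GLsizei w, GLsizei h,
                        const void *blocks, GLsizei nbytes)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                           w, h, 0, nbytes, blocks);
}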




      Originally posted by BlackStar View Post
      Win in support yes (see above). Win in speed not really, at least not with these specific cards you quoted.
A long time ago I tested this... X850 vs HD4350...

Theoretically the X850 is much faster: more shader power, more memory bandwidth...

But under Wine the HD4350 is over 30% faster in 3DMark03!

And yes, 3DMark03 uses shaders!



      Originally posted by BlackStar View Post
      DX11-level tesselation works differently than Ati's DX10-level tesselation hardware. It's close but not identical and all discussions I've read on this indicate that these extensions can't be used to emulate DX11-level tesselation. Feel free to prove me wrong, though.
You are wrong, if only because you are the only person talking about DX10 hardware...
You can handle DX11 tessellation on a 5870 by using OpenGL!
Yes, you can't use old hardware for new extensions, but the same hardware can do the same thing....


      Originally posted by BlackStar View Post
      Yeah right, Wine is not an emulator because it recompiles HLSL code to GLSL. I guess pcsx2 is not an emulator either then? Hey, it recompiles mips code into x86!
PCSX2 emulates the hardware! Wine does not emulate any hardware!

MIPS to x86 is not the same as HLSL to GLSL...

Low-level assembly code vs. a high-level programming language.

PCSX2 also emulates in real time...

Wine does not translate the HLSL code in real time... Wine does most of the work before the game starts.

The FPS does not drop, because once the code is running there is no need to recompile it.
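
That part at least matches how GL shader objects work: compilation is paid once at load time, and every frame afterwards just binds the already-built program. A minimal sketch (plain GL 2.0 API; error handling and function loading omitted; build_program/draw_with are sketch names):

Code:
#include <stddef.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Load time, once: this is where the HLSL->GLSL output gets compiled. */
static GLuint build_program(const char *vs_src, const char *fs_src)
{
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    GLuint prog = glCreateProgram();
    glShaderSource(vs, 1, &vs_src, NULL);
    glShaderSource(fs, 1, &fs_src, NULL);
    glCompileShader(vs);
    glCompileShader(fs);
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    return prog;
}

/* Per frame: no recompilation, just bind and draw. */
static void draw_with(GLuint prog)
{
    glUseProgram(prog);
    /* ...issue draw calls... */
}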



      • Q, BlackStar is saying that when Wine translates shaders it often has to insert additional instructions into the shader code, and it's those additional instructions that could slow down execution relative to running natively on Windows.

        If you reply with "but 3DMarkxxx is faster so that's not true" I'm going to vote for a ban



        • Originally posted by bridgman View Post
          Q, BlackStar is saying that when Wine translates shaders it often has to insert additional instructions into the shader code, and it's those additional instructions that could slow down execution relative to running natively on Windows.
          But isn't the point of all the OpenGL 3.2 "Wine extensions" to obviate the need to do this?



          • Originally posted by Qaridarium View Post
A long time ago I tested this... X850 vs HD4350...

Theoretically the X850 is much faster: more shader power, more memory bandwidth...

But under Wine the HD4350 is over 30% faster in 3DMark03!
You originally said X1950 vs HD4350, and I really doubt the latter would outperform the former in any meaningful test. The X850 is very different in capabilities from the X1950 (SM2.0b vs SM3.0), so the result of this comparison does not transfer to the X1950.

            Not to mention that this 30% number is meaningless on its own. Did you use the same system? CPU? OS? Driver version? Wine version?

You can handle DX11 tessellation on a 5870 by using OpenGL!
            No, you cannot. Not yet. AMD_vertex_shader_tessellator is a very different beast than DX11 tessellator shaders, and we'll have to wait for OpenGL 3.3/4.0 before the necessary functionality is exposed. My guess is that this won't happen before Nvidia releases its own DX11 hardware.

Yes, you can't use old hardware for new extensions, but the same hardware can do the same thing....
            Yes, iff the drivers expose this functionality.

I won't argue the point on Wine/emulation, other than to say that HLSL-to-GLSL recompilation had not even been conceived when the "Wine is not an emulator" motto was coined. The "not an emulator" part refers to x86 instructions, not shader code.



            • Originally posted by Alex W. Jackson View Post
              But isn't the point of all the OpenGL 3.2 "Wine extensions" to obviate the need to do this?
              Nope. The new interop extensions improve compatibility in a few parts of the pipeline (e.g. VBO loading, polygon rendering) but they don't affect shaders directly.



              • Originally posted by BlackStar View Post
                Nope. The new interop extensions improve compatibility in a few parts of the pipeline (e.g. VBO loading, polygon rendering) but they don't affect shaders directly.
                From the definition of ARB_fragment_coord_conventions on opengl.org (emphasis added):

What is the primary goal of this extension?

                RESOLVED: The goal is to increase the cross-API portability
                of fragment shaders. Most fragment shader inputs (texture
                coordinate sets, colors) are treated identically among OpenGL
                and other 3D APIs such as the various versions of Direct3D.
                The chief exception is the fragment coordinate XY values which
                depend on the 3D API's particular window space conventions.

We seek to avoid situations where shader source code must be
*non-trivially modified* to support differing window-space
conventions. We also want to minimize the performance effect on
fragment shader execution. Rather than an application modifying
the shader source to add extra operations and parameters/uniforms
to adjust the native window coordinate origin, we want to control
the hardware's underlying convention for how the window origin
is provided to the shader.
                ?
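
Concretely, the extension lets a translated fragment shader redeclare gl_FragCoord with D3D's window-space convention instead of patching every use of it with flip-the-y math. A minimal sketch of what such a shader could look like (GLSL embedded in C; fs_src and the 1024.0 divisor are arbitrary placeholders):

Code:
/* With ARB_fragment_coord_conventions, the D3D origin/center rules
 * are requested declaratively via a layout qualifier: */
static const char *fs_src =
    "#version 150\n"
    "#extension GL_ARB_fragment_coord_conventions : require\n"
    "layout(origin_upper_left, pixel_center_integer) in vec4 gl_FragCoord;\n"
    "out vec4 color;\n"
    "void main() { color = vec4(gl_FragCoord.xy / 1024.0, 0.0, 1.0); }\n";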



                • Originally posted by Alex W. Jackson View Post
                  From the definition of ARB_fragment_coord_conventions on opengl.org (emphasis added):
Bah, forgot about coordinate conventions. This could have some positive impact, but wasn't this available as an NV-specific extension prior to GL 3.2?



The article states that I don't get to play "Unigine Heaven on Linux", but... I do get to play chromium-bsu AND glchess! Playing 1080p movies also works just fine, so I'm happy enough for now.



                    • Originally posted by bridgman View Post
                      If you reply with "but 3DMarkxxx is faster so that's not true" I'm going to vote for a ban
That means... I don't care about theoretical explanations of why Wine is slower, or must be slower, when in reality Wine is faster.....

In the past I did a lot of Windows vs. Linux benchmarks....

Wine and zlib... Wine is 244% faster than Windows XP on my system!...
Wine is faster in 3DMark2001/2000 too...
Wine is faster in WoW...

I did some more Wine vs. Windows benchmarks:

Everest AES: Windows XP 13893
Everest AES: Linux 14074

Everest Queen: Windows XP 15851
Everest Queen: Linux 16031

Everest zlib: Windows XP 26219
Everest zlib: Linux 64104

Everest PhotoWorxx: Windows XP 8814
Everest PhotoWorxx: Linux 10038

7-Zip: Windows XP 7544
7-Zip: Linux 7825

So much bullshit talk about why Wine should be slower than native DirectX...

What I'm talking about is: why is Wine so fast?


"I'm going to vote for a ban"

Admit it, you get paid for this by AMD.



I'm not sure why you think zlib performance has anything to do with 3D performance...

                        As for wine graphics performance, games like Diablo 2, Counterstrike, World of Goo, Plants vs Zombies are almost unplayable on my Atom/945 netbook and noticeably slower on my Core2/Quadro laptop compared to running them on Windows directly. Maybe your system is powerful enough that you don't notice the slowdown or even get a speed increase through wine but that's far from universal.

Finally, it's obvious that bridgman is employed by AMD to ban dissenters from Phoronix (and other forums). Tread carefully, AMD is evil like that.



I think Q might have some kind of mental disorder here. He just keeps repeating the same things over and over again, as if saying them enough might make them true, while ignoring everything that everyone else says.

                          I think he might be the first person I've ever seen say that DirectX through WINE is better and less hacky than native OpenGL support.

                          And he still hasn't explained how he thinks OpenGL's impending demise is going to destroy the market for workstation cards, after it was explained that those are marketed to DirectX users as well.

And then he keeps bringing up 10-year-old applications to test performance, while ignoring that BlackStar was talking about HLSL and other newer technologies being slow.

                          As far as OpenGL dying itself, it's been dying in certain markets like PC gaming for years. It's never going to disappear completely though, as long as Sony and Nintendo don't feel like paying massive amounts of money to their main competitor in the console market. And new markets like phones and web browsers seem likely to make OpenGL more used than ever before, even if they are in lower end devices.



                          • Hold on, let me check my job description...

                            <snip>

23. Where possible, encourage the suspension of, or take other actions to discourage, enthusiastic forum members from posting

                            24. Fill out timesheets on a regular basis

                            Yep, guess I do get paid for it.

Seriously, I am trying to discourage you from overgeneralizing and drifting off-topic. Your posts would be taken a lot more seriously if you would just slow down a bit. Be sensitive to that inner voice which occasionally says "you're babbling", and listen to it.

The issue under discussion was whether current implementations of Wine + driver exact a performance penalty as a result of Wine inserting additional shader instructions. The fact that one program runs faster does not disprove that, and the other benchmarks you posted (which probably don't use shaders) suggest that some other factor is affecting performance, possibly the filesystem or caching.
                            Last edited by bridgman; 12-30-2009, 12:56 AM.



                            • Originally posted by smitty3268 View Post
                              I think Q might have some kind of mental disorder here.
                              That is incredibly insulting and insensitive. Please don't say things like that.

                              Qaridarium is just an everyday idiot, nothing more.



I have said it before and I will say it again: people are stupid. All of them.

