Results 1 to 10 of 53

Thread: Gaming On Wine: The Good & Bad Graphics Drivers

  1. #1
    Join Date
    Jan 2007
    Posts
    15,137

    Default Gaming On Wine: The Good & Bad Graphics Drivers

    Phoronix: Gaming On Wine: The Good & Bad Graphics Drivers

    If you're wondering what graphics hardware / driver is best if you intend to do much gaming within Wine or CrossOver on Linux, a CodeWeavers developer responsible for much of the Direct3D layer spoke about his experiences with gaming on Wine for the best results...

    http://www.phoronix.com/vr.php?view=MTI5NjU

  2. #2

    Default

    That is a pretty damning assessment of AMD. Between this and their lack of support for the Southern Islands/HD 7000 series cards, I do not plan on making any future AMD GPU (or CPU) purchases. I say that sitting between three PCs, two Linux and one Windows, each with an AMD card in it.

    The video itself was pretty good. Sometimes those talk videos are dry and boring, but this one was well-paced. I also appreciate the honesty from the presenter in acknowledging where Wine and Linux gaming in general fall short.

    I really appreciate your assessment notes, Michael.

  3. #3
    Join Date
    Aug 2009
    Posts
    97

    Default

    This coming from an entirely Nvidia-friendly piece of software does not surprise me.

    The latest case is Path of Exile. In the Wine discussion thread on the game's forum, we found out that disabling GLSL for Wine (via a registry key) makes all AMD cards no longer DirectX 9 capable, while Nvidia cards remain DX9 capable. Since the two vendors differ only in their custom extensions, I bet Wine uses NV extensions for shaders when it sees GLSL disabled, while AMD has to fall back to the ARB shader extensions, which means only GL 2.1-era shading is available.

    I do admit AMD has their own shit to clean up (Catalyst reporting different capabilities when it detects Wine running, in some cases), but the special treatment Nvidia gets from the Wine devs is in some cases too much.
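    For anyone who wants to try the toggle being discussed: the switch is Wine's UseGLSL value under the Direct3D registry key (a documented Wine setting; the key may need to be created first). A minimal .reg file to disable it might look like this:

    ```ini
    Windows Registry Editor Version 5.00

    ; "UseGLSL" accepts "enabled" (the default) or "disabled".
    ; Disabling it forces the ARB/NV shader backends discussed in this thread.
    [HKEY_CURRENT_USER\Software\Wine\Direct3D]
    "UseGLSL"="disabled"
    ```

    Import it with `wine regedit <file>.reg`; deleting the value (or setting it back to "enabled") restores the default GLSL backend.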

  4. #4
    Join Date
    Oct 2007
    Location
    Dresden
    Posts
    53

    Default

    Quote Originally Posted by haplo602 View Post
    This coming from an entirely Nvidia-friendly piece of software does not surprise me.

    The latest case is Path of Exile. In the Wine discussion thread on the game's forum, we found out that disabling GLSL for Wine (via a registry key) makes all AMD cards no longer DirectX 9 capable, while Nvidia cards remain DX9 capable. Since the two vendors differ only in their custom extensions, I bet Wine uses NV extensions for shaders when it sees GLSL disabled, while AMD has to fall back to the ARB shader extensions, which means only GL 2.1-era shading is available.

    I do admit AMD has their own shit to clean up (Catalyst reporting different capabilities when it detects Wine running, in some cases), but the special treatment Nvidia gets from the Wine devs is in some cases too much.
    Code:
            if ( !get_config_key( hkey, appkey, "UseGLSL", buffer, size) )
            {
                if (!strcmp(buffer,"disabled"))
                {
                    ERR_(winediag)("The GLSL shader backend has been disabled. You get to keep all the pieces if it breaks.\n");
                    /* ... */
                }
            }

  5. #5
    Join Date
    Jun 2010
    Location
    ฿ 16LDJ6Hrd1oN3nCoFL7BypHSEYL84ca1JR
    Posts
    1,052

    Default

    This should probably be considered too: http://phoronix.com/forums/showthread.php?72506

    If you don't buy AMD cards and get an Nvidia card instead, you stop supporting the company that at least tries to release documentation and even open-source code, and instead support an open-source-unfriendly company. Good job.

  6. #6

    Default

    Both AMD and NVidia are failing to support Linux developers and users. The entire reason I bought those HD6000 series cards was, in fact, because of AMD's promises to support Linux.

    But, here we are years later and the AMD end-user experience is actually worse than the NVidia end-user experience. That is my hands-on experience with their proprietary drivers. The implementation matters. As an end-user, I don't give a fk why one experience is better than the other; I just recognize what is.

    Ultimately, fk em both. Intel may end up with the best Linux graphics experience here soon.

    I won't go into my work experiences with HPC (bio drug sim) GPUs, but AMD's failure there colors my opinion quite a bit too.
    Last edited by TurbulentToothpick; 02-08-2013 at 04:03 AM.

  7. #7
    Join Date
    Nov 2010
    Posts
    41

    Default r600g is getting good even for Wine

    Personally I'm impressed with r600g performance in Wine. I was able to play two Tomb Raider series games (Legend and Anniversary) on an HD 5650 in my HP notebook with mesa-git, and they ran really smoothly. (I don't know if this is great in absolute terms, but I was expecting a lot less.)

    Additionally, I tested the Windows version of Heroes of Newerth in Wine and got nearly the same fps (around 30) for Direct3D and OpenGL rendering (and in the native version too).

  8. #8
    Join Date
    Jan 2010
    Posts
    159

    Default

    Why did I already know this would come down to AMD bashing when I just read the title?

    Of all the problems stated here with AMD graphics, I'm having surprisingly few.

    I think in comparison people should start bashing Nvidia, because they don't even make an effort to support open source on the desktop. Why would you support them when they don't support us?

  9. #9
    Join Date
    Aug 2009
    Posts
    97

    Default

    Quote Originally Posted by .CME. View Post
    Code:
            if ( !get_config_key( hkey, appkey, "UseGLSL", buffer, size) )
            {
                if (!strcmp(buffer,"disabled"))
                {
                    ERR_(winediag)("The GLSL shader backend has been disabled. You get to keep all the pieces if it breaks.\n");
                    /* ... */
                }
            }
    You can find lots of Nvidia preferential treatment in arb_program_shader.c. Examples:

    Code:
        /* Always enable the NV extension if available. Unlike fragment shaders, there is no
         * measurable performance penalty, and we can always make use of it for clipplanes.
         */
        if (gl_info->supported[NV_VERTEX_PROGRAM3])
        {
            shader_addline(buffer, "OPTION NV_vertex_program3;\n");
            priv_ctx.target_version = NV3;
            shader_addline(buffer, "ADDRESS aL;\n");
            /* ... */
        }
    Code:
        enum
        {
            /* plain GL_ARB_vertex_program or GL_ARB_fragment_program */
            ARB,
        /* GL_NV_vertex_program2_option or GL_NV_fragment_program_option */
            NV2,
            /* GL_NV_vertex_program3 or GL_NV_fragment_program2 */
            NV3
        } target_version;
    I did not go deep into the code, just a couple of searches, but you either get plain ARB_vertex/fragment_program (meaning ATI/AMD) or the NV_* extensions on Nvidia. I guess that if you deleted all the NV_* handling, Nvidia would break the same way as ATI/AMD.
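    The either/or selection described above boils down to a small fallback ladder. A hypothetical simplification (not the actual Wine code; only the enum names are borrowed from the quoted snippet):

    ```c
    #include <stdbool.h>
    #include <stdio.h>

    /* Target versions as in the quoted enum from arb_program_shader.c. */
    enum target_version { ARB, NV2, NV3 };

    /* Pick the highest shader target the driver advertises. AMD/ATI
     * drivers expose neither NV extension, so they always get plain ARB. */
    static enum target_version select_target(bool has_nv2, bool has_nv3)
    {
        if (has_nv3)
            return NV3;  /* GL_NV_vertex_program3 / GL_NV_fragment_program2 */
        if (has_nv2)
            return NV2;  /* GL_NV_vertex_program2_option */
        return ARB;      /* plain GL_ARB_vertex_program */
    }

    int main(void)
    {
        printf("%d\n", select_target(false, false)); /* AMD/ATI case: 0 (ARB) */
        printf("%d\n", select_target(true, true));   /* Nvidia case:  2 (NV3) */
        return 0;
    }
    ```

    Removing the NV branches leaves every vendor on the ARB path, which is why stripping the NV_* handling would make Nvidia behave like AMD here.
    
    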
    Last edited by haplo602; 02-08-2013 at 05:20 AM.

  10. #10
    Join Date
    Jul 2011
    Location
    Prague
    Posts
    2

    Default

    Quote Originally Posted by Bitiquinho View Post
    Personally I'm impressed with r600g performance in Wine. I was able to play two Tomb Raider series games (Legend and Anniversary) on an HD 5650 in my HP notebook with mesa-git, and they ran really smoothly. (I don't know if this is great in absolute terms, but I was expecting a lot less.)

    Additionally, I tested the Windows version of Heroes of Newerth in Wine and got nearly the same fps (around 30) for Direct3D and OpenGL rendering (and in the native version too).
    The same for me. Steam for Linux works without issue: CS:S, X3, and CS:GO under Wine too. I am using r600g with an HD 4850 (Mesa 9.0.2). The only problems I have are a slow Trine 2 and an incorrectly rendered Arma 2: OA under Wine (fragment shaders using too many registers).
