linux, the very weak system for gaming

  • #71
    Originally posted by crazycheese View Post
    I think the "OpenAL" part is dead, no? Just like Allegro.

    Mesa is one of the implementations, which should just follow the (hard-to-follow) spec (OpenGL).

    This leaves developers with the "X/OpenGL/SDL" triple, which is perfectly simple.
    I think you can still use OpenAL; there was just some issue where you had to use an older version of it.

    As far as I know, SDL handles making windows and such, so you're left with "OpenGL/SDL".
    Playing Xonotic with GLX gives me more fps than with SDL; maybe SDL 2.0 is better performance-wise? (Mind that GLX should(!) be minimal overhead, and that SDL handles input and sound too.)
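
    To make the "OpenGL/SDL" split concrete, here's a rough sketch (assuming the SDL 1.2 API, since that's what most games shipped against at the time): SDL creates the window and pumps input, OpenGL does all the drawing.

        /* build (roughly): gcc sdl_gl_min.c -o sdl_gl_min -lSDL -lGL */
        #include <SDL/SDL.h>
        #include <GL/gl.h>

        int main(void)
        {
            SDL_Event ev;
            int running = 1;

            if (SDL_Init(SDL_INIT_VIDEO) != 0)
                return 1;
            SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
            if (!SDL_SetVideoMode(640, 480, 0, SDL_OPENGL))   /* SDL owns the window */
                return 1;

            while (running) {
                while (SDL_PollEvent(&ev))                    /* SDL also handles input */
                    if (ev.type == SDL_QUIT)
                        running = 0;

                glClearColor(0.1f, 0.2f, 0.3f, 1.0f);         /* OpenGL does the drawing */
                glClear(GL_COLOR_BUFFER_BIT);
                SDL_GL_SwapBuffers();
            }
            SDL_Quit();
            return 0;
        }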



    • #72
      Originally posted by gamerk2 View Post
      Wrong. If I'm making some program that uses the GPU, guess what? I need to interact with the GPU drivers. Now, someone goes and changes the Kernel API, the drivers need a new update, and boom, my program stops working, or I have a massive performance regression, because some feature I was using got broken somewhere between the Kernel and driver.
      The problem here is the closed-source driver, not the change in kernel API. Do not upgrade your kernel if your closed-source driver supplier does not support it. If you don't want the driver supplier to prevent you from upgrading your kernel, convince them to release an open-source driver.

      Originally posted by gamerk2 View Post
      When you change the Kernel API, you necessitate a driver redesign. When you necessitate a driver redesign, you really tick off the people who interact with said driver. Additions, fine. But you should almost NEVER remove functionality.
      What if some API function turns out later to be poorly designed / unsafe / not general enough? By never removing any functions you prevent the removal of cruft as well as a good deal of improvement.



      • #73
        Originally posted by nej_simon View Post
        Oh really? I've seen it happen plenty of times. Some library is changed and then your third-party software breaks because it was dependent on the earlier version of the library. Sure, you can link your application statically or bundle libraries, but due to a lack of standardization there would be a lot of stuff to bundle.
        Your response essentially says "well, but I don't like bundling libraries, there are many of them". Well, you can either use the solution you don't like, which works and which everyone uses, or keep doing it wrong and complaining unproductively. The number of libraries you'll have to bundle is exactly the same on Windows and Linux.
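
        For what it's worth, bundling on Linux mostly means shipping the .so files next to the game binary and pointing the dynamic linker at them, for example with an $ORIGIN-relative rpath. A rough sketch, using SDL as the bundled library purely for illustration:

            /* build (roughly): gcc game.c -o game -lSDL -Wl,-rpath,'$ORIGIN/lib'
             * ship the binary together with lib/libSDL-1.2.so.0 in one directory;
             * the $ORIGIN rpath makes the loader look in ./lib first, wherever the
             * user unpacks the game, instead of requiring a system-wide install. */
            #include <stdio.h>
            #include <SDL/SDL.h>

            int main(void)
            {
                const SDL_version *v = SDL_Linked_Version();  /* resolved from the bundled copy */
                printf("running against SDL %d.%d.%d\n", v->major, v->minor, v->patch);
                return 0;
            }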

        Originally posted by nej_simon View Post
        Hell, even new versions of GCC sometimes break compatibility with older versions so you might have problems if your software is compiled with a different version than the system one.
        The C ABI of GCC hasn't changed in ages.
        The C++ ABI changes from time to time, but this is only relevant if you want to use libraries from the system written in C++ (see above about bundling). Typically the libraries themselves break ABI much more often than GCC. For example there's an implicit ABI break with every release of Boost.

        Originally posted by nej_simon View Post
        Except that it's not that easy. There are many RPM- and deb-based distributions and many releases of each with a varying degree of compatibility with each other. So you can't just create an RPM and then expect it to magically work across all RPM-distributions.
        I packaged an icon theme around eight Ubuntu releases ago and haven't had to change a thing since.
        You will only run into incompatibilities if you want to use the system libraries or integrate closely with some desktop features, which games don't need to do.

        Originally posted by nej_simon View Post
        So the standardized way of installing software is to just unpack it in your home dir? Right. But let's say you want to install the game so that all users can access it. How would you do that?
        Tell each user to unpack the game in his home directory. You waste some disk space, but it's very simple. If you really can't live without that $1 worth of disk space, you'll need to do some extra work.

        Originally posted by nej_simon View Post
        Compare this to what you would do in Windows (run an installer and click next a few times) or OS X (just drag the app bundle to the Applications dir). App installations on Linux aren't an issue, you say?
        You are comparing apples to oranges. On one side you have a tarred game (minimal support from the developer), on the other you have prepackaged installers (full support).

        Originally posted by nej_simon View Post
        Then why don't you go tell that to the Gnome and Ubuntu developers? I'm sure they would love to hear about it so they can stop wasting their time fixing this apparent non-issue.
        GNOME wants to avoid ABI breaks in Glib and GTK, which means you can use a newer version of GTK with a program compiled for an older version. This has exactly nothing to do with kernel API stability.



        • #74
          Originally posted by nej_simon View Post
          Perhaps in some cases, but should a user really have to resort to a repo with bleeding-edge software just to get a working driver? Wouldn't it be better to support third-party drivers so that the hardware manufacturer can bundle a driver with the hardware?
          Let's say there are N operating systems and M hardware manufacturers.

          You want each of the M manufacturers to write a driver for N systems. For this, each of the M manufacturers must hire at least N specialist programmers (let's assume it's unlikely for a programmer to be proficient in driver development on two systems simultaneously). In practice, they often hire non-specialists who write crappy drivers, or do not supply drivers for some of the N systems at all. Furthermore, because each manufacturer works in isolation, they cannot reuse code between similar devices from different manufacturers.

          If each of the M hardware manufacturers instead releases documentation on what their damn hardware actually does, any competent programmer can write the driver for any of the N systems. This means that the operating system vendor or a group of volunteers can write the drivers for all M devices, exploiting all similarities between them to reduce the amount of work that needs to be done.

          Since typically M is much higher than N, in your scenario we require many more programmers - which are a scarce resource - to produce the same number of lower quality drivers.
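
          To put made-up numbers on it (purely for illustration): with M = 100 device makers and N = 5 operating systems, the per-vendor model needs roughly 100 x 5 = 500 separately staffed driver projects, while the documentation model needs at most 5 x 100 = 500 drivers written by N driver communities that can share code across similar chips - and usually far fewer, since one driver often covers a whole family of devices.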



          • #75
            This thread looks funny after the Valve announcement. It seems Linux is the BEST system for gaming. Even with far less optimization time and Unity not suspending compositing, L4D2 runs faster than on Windows! Windoze is just crap.



            • #76
              Originally posted by kraftman View Post
              This thread looks funny after the Valve announcement. It seems Linux is the BEST system for gaming. Even with far less optimization time and Unity not suspending compositing, L4D2 runs faster than on Windows! Windoze is just crap.
              Note that I have respect for what you know and do, but you should know that you can't compare an Intel Core i7 3930K and an Nvidia 680 on Windows against a heavily modified Linux and call it a win.

              On my CPU/RAM-limited rig, where L4D2 runs at 60 fps on Windows, there would be a huge penalty from that 50% of OpenGL call time spent translating (even though my monster graphics card can swallow a lot more), wouldn't there?

              Ignoring "my game runs badly in Wine" (which is mostly true) and "that program doesn't exist on Linux" (which mostly isn't true), the fact is that humans are creatures of habit and do not want to learn anything else.

              Another, I feel smaller, thing is that even though Linux IS easier to install properly (I don't know about Win7), there are a couple of problems that can happen.
              For example, the b43 Broadcom wireless I set up for a friend on Fedora: I had to go to him and install it myself. How would I explain to him "you have to get the Windows drivers and this program, then extract the drivers, run the program on them, and copy THAT file as root to..." (and he's bad at English)?
              I know he'd mess up somewhere, and I can't VNC to the computer since it has no wireless.
              It would be more user-friendly to have an extra repository for all that crap; then I could tell him "go to that site, click there, wait, run Synaptic, install that" (it should be legal to make a repo with just firmware, or to add the firmware directly to something like RPM Fusion instead of just adding b43-fwcutter).

              Since you replied to this sh***y thread (thanks, spammers and fangirls), I'll ask:

              Why do Compiz and similar OpenGL(?) desktop acceleration have such penalties?
              I've got two 3D games here; one gets some 10 fps less with Compiz, the other doesn't (I don't remember which is which, so I won't name them).

              From what I know, batching OpenGL calls gets you performance since it's all sent in one blast, but should the few tiny DE calls really disrupt the flow so much that I get 15% less fps? (Yes, 10 fps is 15% on my computer.)

              Why is there a penalty anyway? Shouldn't the desktop go to sleep when a DRI application goes "fullscreen" (there is no real fullscreen in X afaik)?

              Those are a couple of questions I got from looking at Michael's charts here, and I didn't find a clear "why is that" anywhere.
              (Not that I care much, since I tuned my computer when I was bored a long time ago, but objectively: should that be there anyway?)

              PS: It's not written whether they used Unity LLVM or whether they tuned other stuff; however, it is written that they worked with Nvidia and such (maybe it is written and I'm just blind, happens).

              PPS: No disrespect, but the open-source radeon drivers are, in most cases, a lot slower than the closed-source ones; the closed-source Radeon drivers are hard to install (especially on dual-graphics laptops, where even lspci can lie), and Intel is... not there yet hardware-wise.

              So that leaves Nvidia as the best experience on Linux (maybe some tried-and-tested AMD cards on desktops can match it, but realistically there are a lot more complaints about AMD than Nvidia).


              Sorry for the length of the post (if you'll read it anyway); I made good coffee today.

              One more thing to ask (as a noob in all this, and since you ain't):
              how much do drivers have to do with latencies?

              Trying nouveau, I got the feeling Xonotic is more responsive, although blocky.
              That got me thinking that there's a scheduler in "the blob" that queues calls up to some limit.

              If that were true, I'd switch to nouveau, since latency is important in such a fast-paced game.



              • #77
                Originally posted by gens View Post
                Why do Compiz and similar OpenGL(?) desktop acceleration have such penalties?
                I've got two 3D games here; one gets some 10 fps less with Compiz, the other doesn't (I don't remember which is which, so I won't name them).

                From what I know, batching OpenGL calls gets you performance since it's all sent in one blast, but should the few tiny DE calls really disrupt the flow so much that I get 15% less fps? (Yes, 10 fps is 15% on my computer.)
                There are several compositing benchmarks on Phoronix, and they mostly show that the proprietary Nvidia driver suffers a lot from compositing, fglrx hardly does, and radeon sometimes even benefits.

                Example from last year: (Phoronix compositing benchmark article)



                • #78
                  Originally posted by gens View Post
                  Note that I have respect for what you know and do, but you should know that you can't compare an Intel Core i7 3930K and an Nvidia 680 on Windows against a heavily modified Linux and call it a win.
                  I haven't been doing anything for some time now. What I want is just fair comparisons. When it comes to Valve, there's Ubuntu (which is slower in 3D graphics compared to distributions that don't use Compiz). Valve only modified L4D2 and helped with Ubuntu's graphics drivers. Canonical fixed a few things in Unity or Compiz.

                  Why do Compiz and similar OpenGL(?) desktop acceleration have such penalties?
                  I've got two 3D games here; one gets some 10 fps less with Compiz, the other doesn't (I don't remember which is which, so I won't name them).

                  From what I know, batching OpenGL calls gets you performance since it's all sent in one blast, but should the few tiny DE calls really disrupt the flow so much that I get 15% less fps? (Yes, 10 fps is 15% on my computer.)

                  Why is there a penalty anyway? Shouldn't the desktop go to sleep when a DRI application goes "fullscreen" (there is no real fullscreen in X afaik)?
                  The answer is suspending compositing. In KDE it's K -> System Settings -> Desktop Effects -> Advanced -> "suspend desktop effects for fullscreen windows", or something like that*. You can do the same in Ubuntu, but afaik it's broken with some drivers. If you want to make a valid comparison between Linux and Windows, I recommend choosing a KDE- or Xfce-based distribution. Ubuntu is the most popular one, but some things still have to be fixed there. Use proprietary drivers, of course.

                  * You can also try disabling vsync.
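
                  If you'd rather test the vsync part from a terminal than dig through GUI settings, a tiny launcher along these lines should do it (assuming the proprietary NVIDIA driver honours the __GL_SYNC_TO_VBLANK environment variable and Mesa honours vblank_mode; check your driver's documentation):

                      /* novsync.c - sketch of a vsync-off launcher.
                       * build: gcc novsync.c -o novsync
                       * run:   ./novsync <your game binary> */
                      #include <stdio.h>
                      #include <stdlib.h>
                      #include <unistd.h>

                      int main(int argc, char **argv)
                      {
                          if (argc < 2) {
                              fprintf(stderr, "usage: %s <game> [args...]\n", argv[0]);
                              return 1;
                          }
                          setenv("__GL_SYNC_TO_VBLANK", "0", 1); /* NVIDIA proprietary driver */
                          setenv("vblank_mode", "0", 1);         /* Mesa / open-source drivers */
                          execvp(argv[1], &argv[1]);             /* replace ourselves with the game */
                          perror("execvp");                      /* only reached if exec failed */
                          return 1;
                      }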



                  • #79
                    Mutter, which GNOME Shell uses, suspends the compositor when a full screen game is running. KWin and the Xfce compositor can do the same, and this functionality is even togglable, as Kraftman noted. It is a bit unfair to paint all compositors with the same brush when the problem really is mostly with Compiz and Unity.



                    • #80
                      This just in: OP is a moron.

