
Blizzard Still Has a World of Warcraft Linux Client


  • Phoronix: Blizzard Still Has a World of Warcraft Linux Client

    For years it's been said that Blizzard has developed a Linux client for its very popular World of Warcraft MMORPG, but that it has never been publicly released. It appears this is still the case: internally they have a Linux build of World of Warcraft, but so far they have decided against releasing it to the public...


  • #2
    Games on Linux

    Yeah... fragmentation/openness/whatever is both a strength and a weakness of the free software and/or Linux ecosystem.

    However.. if studios were to target a distro, no doubt it would be Ubuntu.

    Regardless of that, though: target PulseAudio for sound, SDL for input, OpenGL for graphics? Something like that. I guess those things might be moving targets as well. If that's the case, maybe the nerds making this stuff need to set standards that don't change. A bit like creating DirectX versions, but for Linux.
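    A rough sketch of what "targeting" that stack could look like at startup: probe for the system's SDL, OpenGL, and PulseAudio libraries before committing to them. This uses Python's `ctypes.util.find_library`, which searches roughly the same places the dynamic linker does; the library names are assumptions about what a game might look for.

```python
import ctypes.util

def probe_stack(names=("SDL2", "GL", "pulse")):
    """Map each library name to the soname found on this system, or None."""
    return {name: ctypes.util.find_library(name) for name in names}

# On a typical desktop Linux box this reports where (or whether) each
# library was found; a game launcher could warn about the gaps.
for name, path in probe_stack().items():
    print(name, "->", path or "not found")
```

    A launcher doing this can refuse to start (with a readable message) instead of dying on a missing symbol later.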

    I'm sure all this has been said before too. *grabs fire extinguisher*


    • #3
      Well, exactly, they don't have to target a specific distro, just state requirements on libs, e.g. glibc, kernel, OpenGL, SDL, or whatever they need... OK, that is not user friendly. On the other hand, targeting distros, which move really fast compared to Mac or Windows, is a neverending story...

      There are few Linux gamers because, if you look at the popular games, there is really nothing to play on Linux. When there is something for Linux, people will play it... From a company's perspective that is an unknown quantity; of course they don't know whether it will cover their expenses or whether demand will be enough. BUT hey, Mac uses OpenGL, so that code might be shared across every OS, which means it would not be very hard to support Linux as well...

      But then there's the story about GL drivers on Linux: everyone shouts "no blobs" and such, while at the same time the free drivers are crap in terms of GL performance... No one is ever going to build anything 3D for Linux if there are no decent drivers! So just DON'T prohibit blobs!

      If we are looking at PulseAudio, it's not going to work well, because it adds lag while playing (software mixing; in terms of performance it's crap, but the features are great). I tested that with the Quake games (it was about two years ago, but still...).

      I can give some nice examples: the Quakes, all of which can be played on modern Linux (using ALSA), and the Serious Sam games as well... Those games date back as far as 1999... So the key stuff is really OpenGL and SDL, which are pretty standard.

      Linux is getting more user friendly and affordable (read: you don't need a master's degree in Computer Science to use it) every day, so I won't be surprised if one day there is something more serious on Linux as well.


      • #4
        I'm not a Linux gamer. I'm an Ubuntu gamer. In the proprietary games market, especially with DRM, I don't think you'll find many Fedora and CentOS users.


        • #5
          Correct me if I'm wrong, but wouldn't Blizzard's argument mainly apply if they wanted to integrate the game into the system and use existing system packages/libraries to run it? If they were so concerned about different distros, couldn't they just ship all of their own libraries, except for the ones that are ubiquitous across all Linux desktops, like the kernel, X, etc.?


          • #6
            One of the most annoying problems is that of incompatible base library versions. It goes like this:

            1. Your system distro provides a certain version of a base library that provides special access to hardware "stuff", be it sound (OpenAL, PulseAudio, etc), graphics (GTK+, OpenGL, Qt, etc), or even less hardware-oriented stuff, like desktop integration (dbusmenu, gnome, kde, etc).

            2. A binary-only application (closed source, or open source with extremely difficult build system) ships with its own isolated "world". This world may or may not include the base system libraries that are purported to provide hardware access. But it's a damned if you do, damned if you don't situation:

            2(a). If the binary distributor does provide e.g. OpenAL, GTK+, PulseAudio, etc. libraries, then they run the risk that the version they distribute will not work with your kernel / hardware / etc., or that it will be incompatible with running daemons (X server in the case of GTK+, Pulseaudio server in the case of sound, etc.)

            2(b). If the binary distributor does not provide e.g. OpenAL, GTK+, PulseAudio, etc., then they rely on the system distributor to provide an ABI compatible version that can be linked in, either via dlopen or the dynamic linker. If an ABI-compatible version is available, then you're usually OK -- but not quite, and not always. I'll explain in 2(c). If an ABI-compatible version is not available, then the app simply doesn't start, reporting a missing library or missing symbol within said library.

            2(c). Even if you have an ABI-compatible version of all shared libraries that the distributor expects to be available, things can still go awry! One, libstdc++ frequently changes ABI, and it is impossible to link two versions of libstdc++ into the same process space. So if your system library (or any of the libraries it depends upon) links against libstdc++, or some other core system library that changes ABI incompatibly, then it'll break. This can even trip up libraries that don't normally link against libstdc++. Take, for example, the ALSA-lib userspace library. By itself it doesn't link against libstdc++. But if you compile in support for JACK2, the ALSA PCM JACK plugin is dynamically loaded and, you guessed it, links against libstdc++! This causes a very cryptic and difficult-to-debug breakage deep in the dynamic linker, and the app will refuse to start.

            So as you can see, there is some truth to the fact that distributing a closed-source application with substantial system dependencies (i.e., dependencies that are expected to be provided by the distro) inherently limits you to only a finite set of distros. Distros that are either too old or too new will fail to work properly.

            The solution is something like Windows' Side-by-Side configuration, which, while ugly, inefficient, and hackish, works around this problem without addressing the underlying issue. Still, it makes it much easier to distribute binaries on Windows.
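            The dlopen path described in 2(b) is easy to demonstrate. Below is a minimal sketch using Python's ctypes (a thin wrapper over dlopen/dlsym); the sonames are stand-ins for whatever base library a binary-only app would really need, and the point is that the app gets a chance to warn instead of dying on a missing library or symbol.

```python
import ctypes

def probe(soname, symbol):
    """Return True if `soname` can be dlopen()ed and exports `symbol`."""
    try:
        lib = ctypes.CDLL(soname)        # dlopen()
    except OSError:
        return False                     # missing library: the app can warn
    return hasattr(lib, symbol)          # dlsym() lookup

# libm ships with every glibc system, so the first probe should succeed
# there; the made-up soname fails cleanly rather than aborting the process.
print(probe("libm.so.6", "cos"))
print(probe("libtotally-made-up.so.0", "x"))  # False
```

            Note this only covers "library or symbol missing"; the subtler 2(c) failures happen after a successful load, deep in the dynamic linker.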


            • #7
              Other companies seem to be able to release proprietary Linux programs which work fine pretty much everywhere. For example, Heroes 3 (which was built for Linux 2.2, if I remember correctly, and still works perfectly) and MATLAB. I'm guessing there are more; I just don't use that many proprietary programs. ^^

              And for a program like WoW, I think they'd get by just officially supporting Ubuntu...


              • #8
                I do understand what allquixotic said, but still, this IS working. I already mentioned the Quakes (all of them), the Serious Sams (all of them), and Heroes (which isn't 3D, but works)... So that issue is not that big of an issue, on the Linuxes of 10 years ago or of today. Theoretically it can go wrong, but in real life, for games, it doesn't.

                Anyhow, if Blizzard could make a client for Linux (and they can, but probably don't want to, or there is some agreement with Apple/MS), I would buy it myself. Not WoW, but Diablo III at least...


                • #9
                  I disagree that it's more difficult to target Linux. It's the distro's job to do the packaging. Even Gentoo has mechanisms to support installing binaries to /opt and even, if absolutely necessary, to handle static linking. It's the company's responsibility to maintain their shit and track the development of upstream libs and, to a lesser extent, downstream distributors. Developers of commercial software always seem incredibly lazy about maintenance, in contrast to the prompt responses of open source developers.

                  Hell, last I checked Diablo II was on version 1.13b or something... they've only patched it a total of 13 times in ~10 years? That's a little _too_ lazy. Nobody can guarantee no ABI breakage over that span of time... sorry.


                  • #10
                    Just to reinforce what I just wrote:

                    Jump ahead to 2015. Distros shipping the latest n' greatest (Ubuntu, Fedora, OpenSUSE, etc) are using new ABI versions of core system libraries. By 2015 it's safe to assume that libstdc++ will have bumped again; it's also possible that an X server may not even be running if Wayland takes off. It might even go through such upheaval that the glibc ABI changes, though this one is extremely stable and the least likely to have changed by 2015.

                    Now try to run a game from 2007 that uses C++. Try it. Regardless of whether or not libstdc++ is statically linked into the 2007-era binary executable, or loaded at runtime, you will encounter a problem.

                    (a) If libstdc++ is statically linked, then attempting to map in any libraries that use a different version of libstdc++ will fail. Since the system version of libstdc++ (the one whose ABI is required by your system libs) is not ABI-compatible with the one statically linked into the binary, it is not possible for the libraries to get mapped into the binary, and the program will not start.

                    (b) If libstdc++ is dynamically linked, then the developer had better ship their own with the distribution of their software, or it's only going to work on a very small number of distros that were explicitly tested. So let's assume they ship a libstdc++. The executable links against that with LD_LIBRARY_PATH in a wrapper script. OK, now what? We have an old libstdc++, again, in the process space. Trying to load our system libs gives us the new libstdc++ from /usr/lib{32,64}. Boom.
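                    That wrapper-script trick can be sketched in a few lines. The paths here are hypothetical (a real game would use its own install prefix); the point is just that the bundled lib directory gets prepended to LD_LIBRARY_PATH before the real binary is exec'd, which is exactly what drags the old libstdc++ into the process space.

```python
import os
import sys

def bundled_env(libdir, base=None):
    """Environment with the bundled lib dir first on the loader's search path."""
    env = dict(os.environ if base is None else base)
    old = env.get("LD_LIBRARY_PATH", "")
    env["LD_LIBRARY_PATH"] = libdir + (":" + old if old else "")
    return env

def launch(binary, libdir):
    # Replace this process with the game, loader search path adjusted.
    os.execve(binary, [binary] + sys.argv[1:], bundled_env(libdir))

# launch("/opt/mygame/bin/mygame", "/opt/mygame/lib")
```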

                    Basically, every Linux binary that links against system libs is a ticking time bomb. At some point in the not-too-distant future, an incompatible ABI break will occur in the libraries that the binary depends upon, and it will fail to start.

                    This is not an issue at all with open source projects, because you just recompile against the latest toolchain and fix any build errors that crop up. The process is quite easy. But from a software distributor's perspective, it's a royal pain in the ass, because it requires constant maintenance to keep up with the latest distros. And if the ABI breakages are especially frequent for some reason, then you end up with a laundry list on your download page: for Ubuntu versions 8.04 to 9.04 and Fedora 9 through 11, download this binary; for Ubuntu versions 9.10 through 10.10 and Fedora 12 through 14, download that binary. And so on and so forth. Distributors hate that, and rightly so.
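                    That laundry list is effectively a lookup table from (distro, release) to the build whose ABI assumptions held for that range. A toy version, using the ranges from the text (the filenames are made up, and versions are compared as tuples, so Ubuntu 9.04 is (9, 4)):

```python
# Each build covers the distro releases whose core ABIs it was tested against.
BUILDS = {
    ("ubuntu", (8, 4), (9, 4)):    "game-build1.tar.gz",
    ("fedora", (9,), (11,)):       "game-build1.tar.gz",
    ("ubuntu", (9, 10), (10, 10)): "game-build2.tar.gz",
    ("fedora", (12,), (14,)):      "game-build2.tar.gz",
}

def pick_build(distro, version):
    """Return the tarball covering this release, or None if unsupported."""
    for (name, lo, hi), tarball in BUILDS.items():
        if name == distro and lo <= version <= hi:
            return tarball
    return None

print(pick_build("ubuntu", (10, 4)))  # game-build2.tar.gz
print(pick_build("fedora", (15,)))    # None: too new, needs yet another build
```

                    Every ABI break adds a row, and every distro release eventually falls off the bottom, which is exactly the maintenance treadmill being described.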

                    To date, the only distributor that seems to get binary packages 100% right is Sun/Oracle, especially with VirtualBox. Their build system must be unimaginably atrocious, but they support a ton of recent distros with packages built explicitly for each distro of interest. I have yet to install a mainstream distro that they don't support. Take a look for yourself:

                    Eighteen separate packages!

                    The other amazingly cool thing about the VirtualBox binary packages is that they only support distros that are currently within the post-release update cycle from the distributor. So when Fedora 12 got EOL'ed, they stopped making Fedora 12 builds of VirtualBox. Keeping up with all of that is nothing short of amazing.

                    But we can't expect every Joe who wants to distribute a Linux game to navigate this morass, and create a similarly masterful build system, and distribute updates on a 3 to 6 month schedule to keep them working properly with the latest distros. And you still have the issue of people who insist on using ancient distros (quite a few), eventually losing support unless you backport application changes and make packages for their distro.

                    It's a terrible situation, and the only hackjob of a semi-solution would be to "just do it" the nasty Microsoft way. That is, you would have to make Side-by-Side configuration (i.e., multiple versions of the same shared library loaded into a process's address space, mapped differently for different components of the process) a design requirement of your dynamic linkage system and file format. Then you would have to work backwards from there to determine what changes would be necessary to ELF, glibc, the kernel, and so on. Not to mention that multiple versions of the same library could have conflicting resource usage at the syscall level, such as file descriptors and temporary files, and you'd need a way to account for that.

                    What you end up with is the unfortunate realization that the core GNU/Linux operating system's plumbing is fundamentally unprepared for a long-term binary ecosystem. It is unfortunate for people looking to make money by selling per-user licenses of proprietary software to GNU/Linux users. It is advantageous to those willing to release source code, because the infrastructure built up around easy distribution for open source projects is quite good.