Ryan Gordon Criticizes Open-Source Drivers Again

  • #61
    Originally posted by blackiwid View Post
    Hi, I don't understand this guy. Most of what he says doesn't match my experience.

    1. Drivers: OK, they are slow for 3D stuff etc., but why is the openness a problem? The Intel drivers are open only, and admittedly not much more than Tetris runs on them, but that's mostly a hardware problem. On AMD, fglrx is far more unstable than the open radeon drivers, which are pretty much rock solid; if you don't chase the newest features they are 100% solid, with no crashes at all.
    It's not that the openness is a problem - but as the open drivers are becoming good enough for every-day use, their shortcomings become more relevant. It used to be adequate to tell people to run the nvidia binaries if they wanted good 3D, because proprietary or not, they were the only practical option. But Nouveau has now matured enough to be a genuine option on the desktop, meaning people are less willing to use the binary drivers, even though the open ones aren't good enough for games.



    • #62
      Originally posted by AdamW View Post
      Ignore the issue and install anyhow is almost exactly never what you actually want, and will cause all sorts of breakage later. The intricacies of interdependent apps and libraries aren't terribly apparent at first, but distributors usually do actually know what the hell they're doing, and if something conflicts or doesn't satisfy a dependency, that's usually _correct_, it's not the distributor being obnoxious just for the hell of it.

      It's pretty hard to provide a condensed explanation of why. It's something you either take on trust or learn through bitter experience. Feel free to choose either path. =)
      I am perfectly aware.
      But you know all those packages on your system? They all depend on the current version of a library, or on a version at most 0.1.0 higher.
      Everybody who packages tends to do the same. Sometimes they really stretch it, and perhaps allow a version number of 0.1.2+ instead of a mere 0.1.0 :P
      This is not the distributors' fault, but the fault of the people who make the packages. They specify that their package NEEDS something that will have a completely different version number in six months, even if it is just minor bugfixes that get patched on.
      Don't tell me that you never fetched a .deb package and attempted to install it, only to find that the dependency is pinned to one specific version of a certain library? We both know that installing it would work just fine, but the package won't install.

      What you are talking about is when a package actually breaks due to minor or large library changes, which is a different issue.


      Originally posted by JanC View Post
      This "button" (wel, switch) exists in the commandline versions of the package manager, if you really want it, but as explained before making it the default/easy-to-use results into masking of packaging bugs and other bad things.
      If I can't find it in the manpage, and nobody on several IRC channels has any idea, it likely does not exist.
      Then again: It might be a .deb issue.
      Last edited by del_diablo; 08-08-2011, 07:08 PM.



      • #63
        Originally posted by deanjo View Post
        You HOPE it is being checked by other QUALIFIED people.
        I'm not talking about average phoronix posters, but guys working at IBM, Google, Amazon and the major distros, who I do trust with checking something at least casually before installing it on 100,000 servers.

        It is also common for people employed by major distros, or with long involvement in them, to be major or minor contributors to many FLOSS projects, and they very often patch those projects before shipping. They read the source code and are certainly qualified.

        So yeah, I do feel safe that Dave Airlie, Alex Deucher and Marek Olsak (along with many many others) are all fiddling in there in kernel DRM and Mesa and know the source code well. They are qualified, and work for different companies (or are volunteers). I am convinced that the danger of a Bonzi Buddy or rootkit getting past ALL of them is rather small.

        There is always danger, but openness is a strength.



        • #64
          Originally posted by ean5533 View Post
          There's no standard set of base libraries that I'm guaranteed to have on all Linux distros, so unless I'm missing something then the conclusion is that I'd have to statically link every dependency that chains up from libcairo, which is insane.
          Well yes, actually there is such a base set. Maybe "guaranteed" is too strong a word, but you can certainly assume you've got glibc and all the usual X libraries that any modern desktop requires. That's what Firefox does, after all - their binary packages do provide copies of some system libraries (usually statically linked), but they assume that if the user is running a Gtk+/X build of Firefox on an X desktop, they've probably got the usual X libraries and a recent-enough copy of Gtk+ and its associated libraries...



          • #65
            Originally posted by del_diablo View Post
            I am perfectly aware.
            But you know all those packages on your system? They all depend on the current version of a library, or on a version at most 0.1.0 higher.
            Everybody who packages tends to do the same. Sometimes they really stretch it, and perhaps allow a version number of 0.1.2+ instead of a mere 0.1.0 :P
            This is not the distributors' fault, but the fault of the people who make the packages. They specify that their package NEEDS something that will have a completely different version number in six months, even if it is just minor bugfixes that get patched on.
            But is there anything wrong with this?
            There is such a thing as ABI dependencies, which (when used properly) say, for example, "0.1.0 is compatible with 0.1.2+". That means all versions of libfoo (0.1.0, 0.1.1, 0.1.2) are completely interchangeable as long as they provide the same ABI (for example, they all provide libfoo.so.1).
            And in order to run dynamically linked binaries, it's usually sufficient to meet their ABI deps. Right, it is irrelevant whether libfoo-0.1.0 or libfoo-0.1.2 is used, though there may be grave bugs in 0.1.0 that prevent the application from running; but distribution developers are just as aware of that as software developers, and both usually know better than users.
            I mean, what is your point here?

            What AdamW said on package format vs dependencies is spot on.

            From another distro developer's (Gentoo, for that matter) point of view, I'll go as far as to say I never trusted software packages (.deb or .rpm) created by the respective proprietary software vendors.
            They usually only created a mess on package updates and left orphaned files on removal (those files were usually artifacts of messy pre/post-install scripts).

            It also makes me laugh to hear the typical argument "we cannot provide application X for Linux because there's no standard Linux distribution". My answer to this is always: "So what? You do your job - write the application and package it as a plain tarball - and we (distributions) will do ours. Or just pick your favourite distribution and create a package for it; other distributions will wrap it in the format they use."

            I really wish proprietary software vendors would stop caring about "supporting packages for distro X" and focus on providing merely a tarball, or a simple command-line installer (if pre/post-install/uninstall actions are required), with dynamically linked binaries (sic!), plus a list of those ABI dependencies in some README, so that distro packagers can do their job - and they will do that job better. (Those ABI deps can be figured out from DT_NEEDED, so such a README isn't actually needed, but it's nice to have indeed.)
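            As a side note on DT_NEEDED: those recorded sonames can be read straight out of the ELF file. Here is a rough sketch of my own (not part of any packaging tool); it assumes a little-endian 64-bit ELF, which covers typical x86-64 binaries:

```python
# Sketch: read the DT_NEEDED sonames out of an ELF binary directly.
# Assumes a little-endian 64-bit ELF (typical x86-64 Linux binaries).
import struct

def dt_needed(path):
    """Return the DT_NEEDED sonames recorded in a 64-bit LE ELF file."""
    with open(path, "rb") as f:
        data = f.read()
    assert data[:4] == b"\x7fELF", "not an ELF file"
    assert data[4] == 2, "sketch handles 64-bit ELF only"
    # Section header table location, from the ELF header.
    e_shoff = struct.unpack_from("<Q", data, 0x28)[0]
    e_shentsize = struct.unpack_from("<H", data, 0x3A)[0]
    e_shnum = struct.unpack_from("<H", data, 0x3C)[0]
    dyn = dynstr = None
    for i in range(e_shnum):
        off = e_shoff + i * e_shentsize
        sh_type = struct.unpack_from("<I", data, off + 0x04)[0]
        if sh_type == 6:  # SHT_DYNAMIC
            sh_offset, sh_size = struct.unpack_from("<QQ", data, off + 0x18)
            sh_link = struct.unpack_from("<I", data, off + 0x28)[0]
            dyn = (sh_offset, sh_size)
            # sh_link is the index of the associated string table (.dynstr).
            soff = e_shoff + sh_link * e_shentsize
            st_offset, st_size = struct.unpack_from("<QQ", data, soff + 0x18)
            dynstr = data[st_offset:st_offset + st_size]
    needed = []
    if dyn and dynstr:
        off, size = dyn
        for pos in range(off, off + size, 16):  # Elf64_Dyn entries
            d_tag, d_val = struct.unpack_from("<qQ", data, pos)
            if d_tag == 1:  # DT_NEEDED: d_val indexes into .dynstr
                needed.append(dynstr[d_val:dynstr.index(b"\x00", d_val)].decode())
    return needed

print(dt_needed("/bin/sh"))
```

            Running it against /bin/sh on a typical glibc system prints something like ['libc.so.6'] - exactly the ABI dependency list a packager needs.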

            However, I understand the desire to provide an out-of-the-box experience for the end user without relying on distribution packagers, but frankly it's not going to work unless:
            a) either all binaries are linked statically and the proprietary software vendor provides a (de)installation script,
            b) or the proprietary software vendor picks one distribution (say RHEL6 or the latest Ubuntu LTS) and devotes their time to supporting it (ignoring the rest of the world).

            No ideal solution for diversity.

            Sometimes they cannot simply say to their client "hey, grab the pieces and put them together yourself" -
            Last edited by reavertm; 08-08-2011, 09:00 PM.



            • #66
              Originally posted by ean5533 View Post
              I don't deal with closed source software on Linux very often, so I've never had to think about it. However, I can immediately say that bundling libraries would get very ugly very quickly, because you don't know when to stop. That is, you don't know when you can assume a certain library will definitely be installed.

              Let's say you write an app that uses cairo. If you bundle libcairo2, you also have to make a decision about whether to bundle each of its dependencies or to assume that they exist on the system. Here are all of libcairo2's dependencies. And each of them have their own dependencies. How do you make the decision about which ones you need to bundle and which ones you can assume will already exist?
              The answer is: all of them, the entire graph of dependencies.
              Why? Because an ugly, broken-by-design, artificial problem can only be solved by an ugly, screwy solution.
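              To put it concretely, "all of them" is just the transitive closure of the dependency graph. A toy sketch with a made-up dependency table (real tools would read this from package metadata or DT_NEEDED):

```python
# Transitive closure over a hypothetical dependency table: everything
# that must be bundled if nothing can be assumed present on the system.
from collections import deque

deps = {  # hypothetical packages and their direct dependencies
    "myapp": ["libcairo2"],
    "libcairo2": ["libpixman", "libpng", "libfreetype"],
    "libpng": ["zlib"],
    "libfreetype": ["zlib", "libpng"],
    "libpixman": [],
    "zlib": [],
}

def bundle_closure(root):
    """Return every package reachable from root, excluding root itself."""
    seen, queue = set(), deque([root])
    while queue:
        pkg = queue.popleft()
        if pkg in seen:
            continue
        seen.add(pkg)
        queue.extend(deps.get(pkg, ()))
    seen.discard(root)
    return sorted(seen)

print(bundle_closure("myapp"))
# → ['libcairo2', 'libfreetype', 'libpixman', 'libpng', 'zlib']
```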

              The problem is: this is not how you package software for an open system.
              Programs and binaries on open systems are... open. Only the data flowing through those programs is completely unbound. This is just how open software systems work.
              If you want to distribute your stuff gracefully on an open system, you remove as many hacks and workarounds as you can near the end of development, make your engine easily compilable and compatible with how open programs treat data (for example, avoid hardcoded paths), open it up and let maintainers package it as they see fit, and sell your game data.


              From the user's point of view, he just needs to install the engine from the repo and put the data in the correct place (like /usr/games/$CREATOR/$GAME, /opt/$CREATOR-$GAME, or ~/Games/$CREATOR-$GAME for non-system-wide). Those are easy steps, and easily automated even further.

              From the developer's point of view, you have to employ proper development practices and get your act together. We all know that after a few months, at best years, a game will become abandonware, but distributors will still want to make some money off it and shrug off all problems further down the chain, onto distributions and users. So it is your problem, developer, to make foresighted, proactive decisions that guarantee your game will still work after 1-4 years, when the whole underlying software stack has changed.
              Stop whining. Do your job, or at least have the decency not to act like a shoddy corporate distributor, as if someone else were liable for dealing with your business peers.

              PS: and don't say that "other systems" don't have this problem. Not only do they have it, it's worse. On Windows® you have 1) deep system dependencies that change with each major version (even though the API may stay the same, your app can still choke on it); 2) dependencies from the bunch of redistributables you keep on a CD or in the game folder, installed manually or by a half-baked scripted "installer"; 3) dependencies in the game's folder that override system ones, since those can be harder to provide/install (full redistribution forbidden by license) or update properly (everyone hardcodes their stuff against different versions); 4) still a bunch of statically linked stuff. And of course you will blame distributions; on Windows® there is no one to blame besides the users and yourself.



              • #67
                If the major browsers (Firefox, Chrome) would just get on the bandwagon and allow you to run executable binaries from the browser, like you do with .exe files on Windows, the packaging war would be over.

                A browser just has to do this:

                1. Download file completely
                2. chmod +x
                3. fork() and exec(), or system()

                These things can be done using either native code, wrapper libraries like Qt or GLib, or running shell commands. There are a gazillion ways to do these things.
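                For illustration, those three steps really are just a few lines in any language. A Python sketch, using a temporary shell script as a stand-in for the downloaded installer:

```python
# The three steps above: "download" (simulated with a temp file),
# chmod +x, then fork()+exec() via subprocess.
import os
import stat
import subprocess
import tempfile

# 1. Stand-in for the completed browser download.
fd, path = tempfile.mkstemp(suffix=".run")
os.write(fd, b"#!/bin/sh\necho installer would run here\n")
os.close(fd)

# 2. chmod +x for the owner.
os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)

# 3. fork() and exec() (subprocess does both under the hood).
out = subprocess.run([path], capture_output=True, text=True).stdout
print(out.strip())
```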

                I can understand the desire to prevent users from inadvertently running malicious code on their machine, but being unable to run a MojoSetup .run or .bin file from Chrome or Firefox is just ridiculous. Does Windows make you go into CMD.EXE and run two or three commands to run an executable, just because Microsoft thinks that executables are a security risk? No.

                The browser (rightly) warns the user that the file may be malicious. And if the user has half a brain, they will go into the terminal and run the required commands to run it anyway, regardless of whether it is malicious. Those who don't have even half a brain aren't able to use Linux, even for non-malicious programs, because they can't figure out "how to run a .bin file" (they're expecting point-and-click like an .exe).

                Seriously, it's 2011. Where are my point-and-click runnable Linux scripts and binaries? The closest I've ever seen is Click-n-Run, and that died quickly. Why do we need all that infrastructure built around it? What's so hard about letting the user run a raw .bin installer?

                Yeah, I know, the installer might require terminal I/O. But maybe the launcher can be coded to automatically pop up a gnome-terminal and run the downloaded binary (inside a screen session -- we'll get to that later), then monitor the running process for any calls to X11. If it sees an X11 call it can automatically detach the screen session, close the terminal and let the binary "go graphical". But if it doesn't succeed in going graphical then the terminal's still there.

                Something like that; I'm just brainstorming. But we really need a point-and-click installer launcher that works across all distros, not just .deb or .rpm file associations.

                And in other news, .rpm is vastly superior to .deb for obvious reasons. Bring on the flamebait! But before you flame, just tell me where Debian's LZMA2 compression and binary diff generation (the "delta rpms" equivalent for dpkg) are. I haven't seen hide nor hair of them. Maybe Debian thinks dpkg is better because it's less featureful. That would make perfect sense in Debuntu World, where everything is backwards: slower is faster, less usable is more usable, and Australia is the capital of the world.



                • #68
                  Originally posted by allquixotic View Post
                  A browser just has to do this:

                  1. Download file completely
                  2. chmod +x
                  3. fork() and exec(), or system()
                  4. alert("error while loading shared libraries: cannot open shared object file: No such file or directory")



                  • #69
                    Originally posted by del_diablo View Post
                    I am perfectly aware.
                     But you know all those packages on your system? They all depend on the current version of a library, or on a version at most 0.1.0 higher.
                     Everybody who packages tends to do the same. Sometimes they really stretch it, and perhaps allow a version number of 0.1.2+ instead of a mere 0.1.0 :P
                     This is not the distributors' fault, but the fault of the people who make the packages. They specify that their package NEEDS something that will have a completely different version number in six months, even if it is just minor bugfixes that get patched on.
                     Don't tell me that you never fetched a .deb package and attempted to install it, only to find that the dependency is pinned to one specific version of a certain library? We both know that installing it would work just fine, but the package won't install.
                    I will tell you that, because I don't run any .deb distros, and only ever have very briefly, for testing purposes. Check the line under my name. =)

                    But in general, I don't think you're right anyway. This can be the case with some poorly-done third party packages, sure. But for all the distributions I've looked at - and I do look at packages from more distributions than I actually *run* - what you describe is not the case at all for the official packages. Most distributions learned long ago that being overly specific with versioned dependencies is silly. Most dependencies in distros these days, at least mainstream ones, are unversioned or versioned very broadly, and specific versioned dependencies are the exception, used when there's a really genuine need for them.

                    The big obvious exception is dependencies on shared libraries, which are usually automated in modern distributions. There's a widely-accepted convention for versioning libraries, where bumping the major version is usually used to denote an ABI change. So whizzy_application will be written to link against libwhizzy.so.1 . libwhizzy.so.1 will be a symlink to libwhizzy.so.1.0.0 . If a minor change is made to libwhizzy, which doesn't affect ABI compatibility, libwhizzy will bump to libwhizzy.so.1.0.1 , but the libwhizzy.so.1 symlink will still be there, the app will be happy. If a major change is made which does break ABI compatibility libwhizzy will bump to libwhizzy.so.2.0.0 and there will now be a libwhizzy.so.2 symlink. This is how the library advertises that its ABI has changed: because whizzy_application is linked against libwhizzy.so.1 it will now stop working until it's recompiled (with any necessary code changes) against the new libwhizzy. Package dependencies will encode this same relationship. This is all exactly how things are supposed to work.
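                    To make the libwhizzy example concrete, here is the on-disk layout that convention produces (a sketch using the post's hypothetical library name; on real systems, ldconfig and the packaging tools maintain these links):

```python
# The soname symlink scheme described above, for a hypothetical
# libwhizzy, laid out in a temporary directory.
import os
import tempfile

libdir = tempfile.mkdtemp()

# Minor, ABI-compatible update: the real file changes version,
# but the libwhizzy.so.1 soname link (the ABI contract) stays.
open(os.path.join(libdir, "libwhizzy.so.1.0.1"), "wb").close()
os.symlink("libwhizzy.so.1.0.1", os.path.join(libdir, "libwhizzy.so.1"))

# ABI break: the new major version appears under a *new* soname, so
# apps linked against libwhizzy.so.1 keep resolving the old library
# until they are rebuilt against the new one.
open(os.path.join(libdir, "libwhizzy.so.2.0.0"), "wb").close()
os.symlink("libwhizzy.so.2.0.0", os.path.join(libdir, "libwhizzy.so.2"))

print(sorted(os.listdir(libdir)))
# → ['libwhizzy.so.1', 'libwhizzy.so.1.0.1', 'libwhizzy.so.2', 'libwhizzy.so.2.0.0']
```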



                    • #70
                      Originally posted by ean5533 View Post

                      *Note to Opera users: It's a hypothetical situation. Quell your hipster rage.
                      As an Opera user, this made me lol.



                      • #71
                        So here goes me trying to be smart again...

                        I've not been using linux that horribly long, only since about 2005 for my default OS of choice, but I've been messing around with it since 2003. I realize I'm pretty new in linux land, so my opinions may be off. I mean no harm. I am a programmer, and CS major, so I do understand some of the technical implications for what I'm about to speak about.

                        Anyway, I do feel that it ought to be easier for third parties to build software for Linux. I love how Linux currently is, but as already mentioned, it definitely has some issues for some purposes. I see packaging and dependencies as a major reason why we can't get third parties enthusiastic about desktop Linux. Here's how I think this could be solved:

                        Firstly, I think one thing MS has done right is the Win32 API. I'm by no means saying it's perfect (it's not, not even close), but the concept is good, IMHO. Look at it like this: right now, on any desktop Linux installation, you can expect to see the common GNU tools. Sure, you will have many different versions of some tools on different distros, but you know you will have some GNU tools. Also, you know that for a modern desktop you should be able to build against the 2.6 (or 3.0) kernel and not have issues (hopefully). Then we have all this distro fragmentation. I've used almost all of the major distros at one time or another, I think. I get that people think things need to be done differently, but here's what I propose: why can't a group from the Linux desktop community build a base Linux Desktop Framework (LDF) and API to run on? I'm talking about getting members from all major distros together to say, OK, let's build this set of libraries and tools and call it the Linux Desktop Framework (LDF). In the next/first version of the framework, let's target these specific versions of libraries and tools (SDL, GNU tools, ALSA, PulseAudio, etc...) and also build an API for working with that framework. Every so often (a year, six months, whatever) a new LDF release comes out, and at most a vendor might have to target maybe 2-3 versions of the API and framework (depending on release cycle) to get all his boilerplate working well and compatibly. It could work much like building against GNOME or KDE, but at a lower level. In fact, you could have GNOME and KDE (or whatever DE you choose) target these LDFs too.

                        It seems crazy at first, but I've actually put some thought into this. People I'm sure will answer with "but then we can't choose a library incompatible with LDF and even have a usable desktop". Well no kidding...but that's the case already. You can always have libraries in addition to the ones included with LDF, or even maybe have multiple versions of LDF installed. Point is, I think it would be beneficial to have a standard linux desktop base, and iterate on that, usable on all major distros. Now, distros could always package different libraries in addition, in order to have a competitive edge, or do things differently in the rest of the userland, but would always target a specific version of LDF to have compatibility with. Basically, I think desktop linux needs to agree on a base for a userland, agree to work on it collectively, and make sure to keep it as API compatible as possible going forward (like MS tries to), and iterate it just like we do for distros, however, it would be the one common thing you could always expect, besides just a 2.6 (or 3.x) kernel.

                        What problems does this actually solve? Well, you no longer have people with up to date distros having compatibility issues as often with software released by third parties, because they should be at least close to the same version of LDF, which should be using an almost completely compatible version of the API as the game/software was written against. The less things vendors have to worry about, the more likely they will be to port/write software for the platform.

                        I know what I've said is a complete pipe dream, and flies in the face of so much of what traditional linux development has been about. I understand that. However, if desktop linux is ever going to attract 3rd parties really well, in my opinion, that's what would best benefit it in that regard. I do understand it would probably KILL the way some/many people prefer linux to work.



                        • #72
                          it just doesn't

                          Originally posted by ean5533 View Post
                          On Windows you just pop in the install disk and it works.
                           Holy Fucks and Marbles! At least have the decency not to spew such bullcrap around here, will you?
                           I have no words to explain how sick I am of this "IT JUST WORKS"©® idiocy. If it "just worked", I would not be making my living going into people's homes and offices to fix it, including instances of "this game just doesn't work anymore!!1" and "it wouldn't install / my system went down after installation".
                           And it does not bring with it a perfect astral knowledge of how to operate it, either.

                           Many people are willing to pay inordinate sums to make it somehow usable again.

                          Originally posted by ean5533 View Post
                          Having to manually install libraries is not something that users should be required to do in order to play the latest version of Angry Birds (or whatever example game you want to use).
                           And a bunch of installers for libraries sit inside my "Alice: Madness Returns" folder (and almost every other folder under "Games") just for shits and giggles, huh? And most game installers force-reinstall the VC Runtime every time just for the hell of it too...
                           Once, back in the day, I had an old game that force-installed some deep system stuff from Win98 onto WinXP. WinXP didn't like that at all. Great design, right? "Just-works"!

                          Originally posted by ean5533 View Post
                          Of course, the appearance of something like a Desura (or Steam*) client on Linux could make a lot of those problems go away.
                           Riiight, let's "adapt" our current systems by adding a duplicate system to hold the nails and crutches game developers put in. Much better.

                           Most games are made to stay solidly playable for about a year or so from the release date; after that they are on a free voyage. Someone has to support that mess of code, and here we come to the same situation with long-term game distribution on GNU/Linux as with anti-virus support and updates: no one is going to bother keeping those games propped up on nails and crutches unless there are no nails and crutches, or there is a sufficient potential market to "consume".
                           GNU/Linux is an OS rapidly evolving in all directions. There is just no place for abandonware.

                           And there is NO pretty solution other than what Carmack does. Either devs build on that approach, or we can wait indefinitely for someone like Valve to take care of the nails and crutches while devs keep putting them in, sure.

                           PS: foreseeing some additional moral preaching about civility, I'll add: no, I don't give a damn; get off your high horse, or take it as it is from up there.



                          • #73
                            congratulations

                            Originally posted by allquixotic View Post
                            I can understand the desire to prevent users from inadvertently running malicious code on their machine, but being unable to run a MojoSetup .run or .bin file from Chrome or Firefox is just ridiculous. Does Windows make you go into CMD.EXE and run two or three commands to run an executable, just because Microsoft thinks that executables are a security risk? No.
                             I congratulate you, dear sir, for you have just found one of the main reasons why Windows® is fucked,
                             and why people are willing to pay good money to have their installation cleaned afterwards.

                             If you have to put something from a damn web browser onto your system, make sure it's data and not a program. Otherwise you're doing something very stupid.



                            • #74
                              Originally posted by lienmeat View Post
                              I've not been using linux that horribly long, only since about 2005 for my default OS of choice, but I've been messing around with it since 2003. I realize I'm pretty new in linux land, so my opinions may be off. I mean no harm. I am a programmer, and CS major, so I do understand some of the technical implications for what I'm about to speak about.

                               Anyway, I do feel that it ought to be easier for third parties to build software for Linux. I love how Linux currently is, but as already mentioned, it definitely has some issues for some purposes. I see packaging and dependencies as a major reason why we can't get third parties enthusiastic about desktop Linux. Here's how I think this could be solved:

                               Firstly, I think one thing MS has done right is the Win32 API. I'm by no means saying it's perfect (it's not, not even close), but the concept is good, IMHO. Look at it like this: right now, on any desktop Linux installation, you can expect to see the common GNU tools. Sure, you will have many different versions of some tools on different distros, but you know you will have some GNU tools. Also, you know that for a modern desktop you should be able to build against the 2.6 (or 3.0) kernel and not have issues (hopefully). Then we have all this distro fragmentation. I've used almost all of the major distros at one time or another, I think. I get that people think things need to be done differently, but here's what I propose: why can't a group from the Linux desktop community build a base Linux Desktop Framework (LDF) and API to run on? I'm talking about getting members from all major distros together to say, OK, let's build this set of libraries and tools and call it the Linux Desktop Framework (LDF). In the next/first version of the framework, let's target these specific versions of libraries and tools (SDL, GNU tools, ALSA, PulseAudio, etc...) and also build an API for working with that framework. Every so often (a year, six months, whatever) a new LDF release comes out, and at most a vendor might have to target maybe 2-3 versions of the API and framework (depending on release cycle) to get all his boilerplate working well and compatibly. It could work much like building against GNOME or KDE, but at a lower level. In fact, you could have GNOME and KDE (or whatever DE you choose) target these LDFs too.

                              It seems crazy at first, but I've actually put some thought into this. People I'm sure will answer with "but then we can't choose a library incompatible with LDF and even have a usable desktop". Well no kidding...but that's the case already. You can always have libraries in addition to the ones included with LDF, or even maybe have multiple versions of LDF installed. Point is, I think it would be beneficial to have a standard linux desktop base, and iterate on that, usable on all major distros. Now, distros could always package different libraries in addition, in order to have a competitive edge, or do things differently in the rest of the userland, but would always target a specific version of LDF to have compatibility with. Basically, I think desktop linux needs to agree on a base for a userland, agree to work on it collectively, and make sure to keep it as API compatible as possible going forward (like MS tries to), and iterate it just like we do for distros, however, it would be the one common thing you could always expect, besides just a 2.6 (or 3.x) kernel.

                              What problems does this actually solve? Well, you no longer have people with up to date distros having compatibility issues as often with software released by third parties, because they should be at least close to the same version of LDF, which should be using an almost completely compatible version of the API as the game/software was written against. The less things vendors have to worry about, the more likely they will be to port/write software for the platform.

                              I know what I've said is a complete pipe dream, and flies in the face of so much of what traditional linux development has been about. I understand that. However, if desktop linux is ever going to attract 3rd parties really well, in my opinion, that's what would best benefit it in that regard. I do understand it would probably KILL the way some/many people prefer linux to work.
                              Like I wrote: it's not a new idea. It's come up before. (See United Linux for only one example). The problem is, well, in the end, too many people simply disagree on what should be *in* the base. To take a trivial example: should ALSA or PulseAudio APIs be used? Or both, or neither? To take a bigger example: GTK+ or Qt? And once you magically solve all those questions - what versions? Again, remember, distros are different _because they want to be different_. Fedora, generally speaking, is going to want somewhat newer versions of stuff than Ubuntu is. So what do we do? Who do we side with? Or do we bless both, and not really solve the problem at all?

                              Ultimately it involves distributions sacrificing a lot of their independence for something they generally just don't consider terribly important: the ability of third parties to bypass their distribution mechanisms. Distributions generally reckon third parties should send them tarballs and let them deal with the distribution. Whether this is 'right' or 'wrong' is a bit simplistic and really kinda missing the point: right or wrong, it's how distros think, if you imagine them as single entities with minds.



                              • #75
                                 a compromise that's not completely delusional

                                Originally posted by mirv View Post
                                So long as [insert client name here] doesn't attempt to install things itself. Perhaps it might work with Ubuntu, but I personally don't want such a client touching any more than necessary to run (which basically means touch nothing other than the games!). Let [insert client name here] handle the games, and the distro package manager handle [insert client name here]. That's how I see it anyway.
                                 Yep. That's the only feasible compromise with proper open source that I can see, too.

