Ryan Gordon Criticizes Open-Source Drivers Again


  • Ancurio
    replied
    Hybrid installation?

    This is going to be my first post here at Phoronix.
    I have been a linux user for almost a year, and there's an idea I've been thinking about constantly:

    Let's say the program to be distributed is compiled with all libraries dynamically linked.
    Then an installer file is created which contains all data, the binary (or binaries), and every single linked library there is,
    maybe even glibc. This would make for a huge installer, but that shouldn't be a problem nowadays.
    At install time, the installer copies all data and the binary, then creates a "lib" folder and scans the system:
    for every required library that is found, it simply puts a symlink in the lib folder; for every library it cannot find,
    it copies the whole library from itself into this lib folder. A launcher script makes sure LD_LIBRARY_PATH is
    adjusted accordingly.
    Maybe, at some point, the system gets upgraded, and old libraries are replaced with newer, ABI-incompatible ones.
    Then the program would obviously stop working. That's why the installer would have a "fix dependencies" (or just plain
    "fix problems") button: the system would be rescanned, and for every library that has disappeared from it,
    the installer would replace the symlink in the lib folder with the actual required library, just as at install time.

    This would provide continuous compatibility with changing systems while still keeping the size of the installation
    as low as possible. And since most programs would be supplied on CD/DVD, I don't think the size of all
    the bundled libraries would be a hindrance.
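    Roughly, the scan step could look something like this. Just a sketch: the paths and library names are made up, and I'm assuming the installer already knows which sonames the binary needs (a real one could recover them with ldd or objdump -p):

    ```python
    # Rough sketch of the install-time scan. The "fix dependencies"
    # button is just this same scan re-run later. All paths and library
    # names below are hypothetical examples.

    import os
    import shutil

    # Directories a typical dynamic linker searches; a real installer
    # would also parse /etc/ld.so.conf and its includes.
    SYSTEM_LIB_DIRS = ["/lib", "/usr/lib", "/lib64", "/usr/lib64"]

    def find_system_lib(soname):
        """Return the path of a system copy of `soname`, or None."""
        for libdir in SYSTEM_LIB_DIRS:
            candidate = os.path.join(libdir, soname)
            if os.path.exists(candidate):
                return candidate
        return None

    def populate_lib_dir(required_sonames, bundled_dir, app_lib_dir):
        """Symlink system libraries when present, copy bundled ones otherwise.

        Re-running this after a system upgrade implements the "fix
        dependencies" button: symlinks whose targets vanished get
        replaced by the copy shipped with the installer."""
        os.makedirs(app_lib_dir, exist_ok=True)
        for soname in required_sonames:
            dest = os.path.join(app_lib_dir, soname)
            if os.path.islink(dest) and not os.path.exists(dest):
                os.unlink(dest)  # dangling symlink: its target was removed
            if os.path.exists(dest):
                continue  # this dependency is still satisfied
            system_copy = find_system_lib(soname)
            if system_copy:
                os.symlink(system_copy, dest)  # reuse the system's copy
            else:
                shutil.copy2(os.path.join(bundled_dir, soname), dest)

    # The launcher script then just prepends app_lib_dir to
    # LD_LIBRARY_PATH before exec'ing the real binary.
    if __name__ == "__main__":
        populate_lib_dir(["libSDL-1.2.so.0", "libpng12.so.0"],
                         bundled_dir="bundled_libs",
                         app_lib_dir="/opt/examplegame/lib")
    ```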

    or maybe I'm just a foolish noob..

  • curaga
    replied
    Originally posted by deanjo View Post
    Do YOU personally audit ALL the code on your system? If not, then you are living under a false sense of security. There have been cases where, even when the source was open, it had nefarious code that went undiscovered for years.
    Having recently witnessed the smartphone app ecosystems, I have to say one side has a clear vested interest in shafting you, while the other doesn't.

    Open software written to scratch an itch - why would it try to snoop your info? A binary, on the other hand - if its makers can earn $$ by selling your stuff and think they won't get caught, they will.


    (Yes, a generalization, but the figures are really scary. Over 20% of smartphone apps snoop on you, and that's just those that got caught.)

  • dfx.
    replied
    always will remain "hostile"

    Originally posted by ean5533 View Post
    Good reading. All points that seem obvious after you've read them.

    If one takes everything you've said at face value, the sad conclusion is that the divided world of Linux is fundamentally hostile to 3rd party vendors, and will remain hostile as long as it remains divided. That's quite a depressing realization to have.
    that was a great post indeed. but i don't think you should interpret it as: if there were no demand for differentiation, then everything would be nice and dandy for abandonware-makers.
    putting closed-source, binary-only, hacked-together abandonware into a GNU/Linux system goes against all the principles it stands on. it will never work on a large scale.
    either they grant us the freedoms of use and modification of their [bought] software, or someone has to share & enjoy the pain of dealing with shitty development practices.

  • lienmeat
    replied
    Originally posted by AdamW View Post
    Like I wrote: it's not a new idea. It's come up before. (See United Linux for only one example). The problem is, well, in the end, too many people simply disagree on what should be *in* the base. To take a trivial example: should ALSA or PulseAudio APIs be used? Or both, or neither? To take a bigger example: GTK+ or Qt? And once you magically solve all those questions - what versions? Again, remember, distros are different _because they want to be different_. Fedora, generally speaking, is going to want somewhat newer versions of stuff than Ubuntu is. So what do we do? Who do we side with? Or do we bless both, and not really solve the problem at all?

    Ultimately it involves distributions sacrificing a lot of their independence for something they generally just don't consider terribly important: the ability of third parties to bypass their distribution mechanisms. Distributions generally reckon third parties should send them tarballs and let them deal with the distribution. Asking whether this is 'right' or 'wrong' is a bit simplistic and really kinda misses the point: right or wrong, it's how distros think, if you imagine them as single entities with minds.
    Of course it's not a new idea. I wouldn't suggest it is. I was mainly just trying to say that I believe it's the only path toward a solution. The only reason linux can't/hasn't/won't get it right is that people refuse to compromise, even if it would make a better experience for everyone but them (speaking of distros and project managers). Like I said in my first post, though, I see no reason why a distro couldn't include newer libs in addition to what the base it targeted already provides. All that's needed is SOMETHING common to target when writing software, just so devs can say "OK, it looks like 75% of linux desktop users use LDF version x or newer" and target the oldest one they wish. Newer ones should mostly work without problems if the project keeps the API compatible going forward (which I think is very important as well).
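    For instance, the vendor's side could be as simple as this. Totally made up, of course: there is no real LDF, and the /etc/ldf-release file and its format are my inventions just to show the idea:

    ```python
    # Hypothetical: a game's launcher checks the installed LDF version
    # before starting. /etc/ldf-release is an invented file assumed to
    # contain a version string like "1.2".

    def installed_ldf_version(path="/etc/ldf-release"):
        """Parse the version string into a comparable tuple, e.g. (1, 2)."""
        with open(path) as f:
            return tuple(int(part) for part in f.read().strip().split("."))

    REQUIRED = (1, 2)  # the oldest LDF release this game targets

    if installed_ldf_version() >= REQUIRED:
        print("LDF %d.%d or newer present, launching game." % REQUIRED)
    else:
        print("This game requires LDF %d.%d or newer." % REQUIRED)
    ```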

  • dfx.
    replied
    a compromise that's not completely delusional

    Originally posted by mirv View Post
    So long as [insert client name here] doesn't attempt to install things itself. Perhaps it might work with Ubuntu, but I personally don't want such a client touching any more than necessary to run (which basically means touch nothing other than the games!). Let [insert client name here] handle the games, and the distro package manager handle [insert client name here]. That's how I see it anyway.
    yep. that could be the only feasible compromise, short of proper open source, that i see too.

  • AdamW
    replied
    Originally posted by lienmeat View Post
    I've not been using linux that horribly long, only since about 2005 as my OS of choice, but I've been messing around with it since 2003. I realize I'm pretty new in linux land, so my opinions may be off. I mean no harm. I am a programmer and a CS major, so I do understand some of the technical implications of what I'm about to talk about.

    Anyway, I do feel that it ought to be easier for 3rd parties to build software for linux. I love how linux is currently, but as already mentioned, it definitely has some issues for some purposes. I do see packaging and dependencies as a major reason why we can't get 3rd parties enthusiastic about desktop linux. Here's how I think this could be solved:

    Firstly, I think one thing MS has done right is the win32 api. I'm by no means saying it's perfect - it's not... not even close. But the concept is good, IMHO. Look at it like this: right now, on any desktop linux installation, you can expect to see the common GNU tools. Sure, you will have many different versions of some tools on different distros, but you know you will have some GNU tools. Also, you know that for a modern desktop, you should be able to build against the 2.6 (or 3.0) kernel and not have issues (hopefully). Then we have all this distro fragmentation. I've used almost all of the major distros at one time or another, I think. I get that people think things need to be done differently, but here's what I propose: why can't a group of the linux desktop community build a base Linux Desktop Framework (LDF) and API to run off of? I'm talking about getting members from all major distros together to say, OK, let's build this set of libraries and tools and call it the Linux Desktop Framework (LDF). In the next/first version of the framework, let's target these specific versions of libraries and tools (sdl, gnu tools, alsa, pulseaudio, etc...) and also build an API for working with that framework. Every so often (a year, 6 months, whatever) a new LDF release comes out, and at most a vendor might have to target maybe 2-3 versions of the API and framework (depending on release cycle) to get all his boilerplate stuff working well and staying compatible. It could work much like building against gnome or kde, but at a lower level. In fact, you could have gnome and kde (or whatever DE you choose) target these LDFs too.

    It seems crazy at first, but I've actually put some thought into this. People, I'm sure, will answer with "but then we can't choose a library incompatible with LDF and even have a usable desktop". Well, no kidding... but that's the case already. You can always have libraries in addition to the ones included with LDF, or maybe even have multiple versions of LDF installed. Point is, I think it would be beneficial to have a standard linux desktop base, usable on all major distros, and iterate on that. Now, distros could always package different libraries in addition, in order to have a competitive edge, or do things differently in the rest of the userland, but they would always target a specific version of LDF to stay compatible with. Basically, I think desktop linux needs to agree on a base for a userland, agree to work on it collectively, make sure to keep it as API-compatible as possible going forward (like MS tries to), and iterate it just like we do for distros; it would be the one common thing you could always expect, besides just a 2.6 (or 3.x) kernel.

    What problems does this actually solve? Well, people with up-to-date distros no longer have compatibility issues as often with software released by third parties, because they should be at least close to the same version of LDF, which should expose an almost completely compatible version of the API the game/software was written against. The fewer things vendors have to worry about, the more likely they are to port/write software for the platform.

    I know what I've said is a complete pipe dream and flies in the face of so much of what traditional linux development has been about. I understand that. However, if desktop linux is ever going to really attract 3rd parties, that's what I think would best benefit it in that regard. I do understand it would probably KILL the way some/many people prefer linux to work.
    Like I wrote: it's not a new idea. It's come up before. (See United Linux for only one example). The problem is, well, in the end, too many people simply disagree on what should be *in* the base. To take a trivial example: should ALSA or PulseAudio APIs be used? Or both, or neither? To take a bigger example: GTK+ or Qt? And once you magically solve all those questions - what versions? Again, remember, distros are different _because they want to be different_. Fedora, generally speaking, is going to want somewhat newer versions of stuff than Ubuntu is. So what do we do? Who do we side with? Or do we bless both, and not really solve the problem at all?

    Ultimately it involves distributions sacrificing a lot of their independence for something they generally just don't consider terribly important: the ability of third parties to bypass their distribution mechanisms. Distributions generally reckon third parties should send them tarballs and let them deal with the distribution. Asking whether this is 'right' or 'wrong' is a bit simplistic and really kinda misses the point: right or wrong, it's how distros think, if you imagine them as single entities with minds.

  • dfx.
    replied
    congratulations

    Originally posted by allquixotic View Post
    I can understand the desire to prevent users from inadvertently running malicious code on their machine, but being unable to run a MojoSetup .run or .bin file from Chrome or Firefox is just ridiculous. Does Windows make you go into CMD.EXE and run two or three commands to run an executable, just because Microsoft thinks that executables are a security risk? No.
    i congratulate you, dear sir, for you have just found one of the main reasons why Windows is fucked,
    and why people are willing to pay good money to get their installation cleaned up afterwards.

    if you have to put something from a damn web browser into your system - make sure it's data and not a program. otherwise you're doing something very stupid.

  • dfx.
    replied
    it just doesn't

    Originally posted by ean5533 View Post
    On Windows you just pop in the install disk and it works.
    Holy Fucks and Marbles! at least have the decency not to spew such bullcrap around here, will you?
    i have no words to explain how sick i am of this "IT JUST WORKS" idiocy. if it "just worked" i would not be making my living by going into people's homes and offices to fix that shit. including instances of "this game just doesn't work anymore!!1" and "it wouldn't install / my system went down after installation".
    and it does not bring with itself a perfect astral knowledge of how to operate it, either.

    many people are willing to pay absurd sums to make it somehow usable again.

    Originally posted by ean5533 View Post
    Having to manually install libraries is not something that users should be required to do in order to play the latest version of Angry Birds (or whatever example game you want to use).
    and a bunch of installers for libraries sit inside my "Alice: Madness Returns" folder (and almost every other folder under "Games") just for shits and giggles, huh. and most game installers force-reinstall the VC Runtime every time just for the hell of it too...
    once back in the day i had an old game that force-installed some deep system crap from Win98 onto WinXP. WinXP didn't like that at all. great design, right? "just-fucking-works"!

    Originally posted by ean5533 View Post
    Of course, the appearance of something like a Desura (or Steam*) client on Linux could make a lot of those problems go away.
    riiight, let's "adapt" our current systems by adding a duplicate system that's better at holding the nails and crutches game developers put in.

    most games are made to be solidly playable for about a year or so after release; after that they're on a free voyage. someone has to support that code shitfest, and here we arrive at the same situation with long-term game distribution on GNU/Linux as with anti-virus support & updates - no one is going to bother keeping those games propped up on nails and crutches unless there are no nails and crutches, or there is a sufficient potential market to "consume".
    GNU/Linux is an OS rapidly evolving in all directions. there is just no place for abandonware.

    and there is NO pretty solution other than what Carmack does. either devs build on that one, or we can wait indefinitely for someone like Valve to take care of the nails and crutches while devs keep putting them in, sure.

    PS: foreseeing some additional moral preaching about civility, i'll add: no, i don't give a damn. get off your high horse, or take it as it is from there.

  • lienmeat
    replied
    So here goes me trying to be smart again...

    I've not been using linux that horribly long, only since about 2005 as my OS of choice, but I've been messing around with it since 2003. I realize I'm pretty new in linux land, so my opinions may be off. I mean no harm. I am a programmer and a CS major, so I do understand some of the technical implications of what I'm about to talk about.

    Anyway, I do feel that it ought to be easier for 3rd parties to build software for linux. I love how linux is currently, but as already mentioned, it definitely has some issues for some purposes. I do see packaging and dependencies as a major reason why we can't get 3rd parties enthusiastic about desktop linux. Here's how I think this could be solved:

    Firstly, I think one thing MS has done right is the win32 api. I'm by no means saying it's perfect - it's not... not even close. But the concept is good, IMHO. Look at it like this: right now, on any desktop linux installation, you can expect to see the common GNU tools. Sure, you will have many different versions of some tools on different distros, but you know you will have some GNU tools. Also, you know that for a modern desktop, you should be able to build against the 2.6 (or 3.0) kernel and not have issues (hopefully). Then we have all this distro fragmentation. I've used almost all of the major distros at one time or another, I think. I get that people think things need to be done differently, but here's what I propose: why can't a group of the linux desktop community build a base Linux Desktop Framework (LDF) and API to run off of? I'm talking about getting members from all major distros together to say, OK, let's build this set of libraries and tools and call it the Linux Desktop Framework (LDF). In the next/first version of the framework, let's target these specific versions of libraries and tools (sdl, gnu tools, alsa, pulseaudio, etc...) and also build an API for working with that framework. Every so often (a year, 6 months, whatever) a new LDF release comes out, and at most a vendor might have to target maybe 2-3 versions of the API and framework (depending on release cycle) to get all his boilerplate stuff working well and staying compatible. It could work much like building against gnome or kde, but at a lower level. In fact, you could have gnome and kde (or whatever DE you choose) target these LDFs too.

    It seems crazy at first, but I've actually put some thought into this. People, I'm sure, will answer with "but then we can't choose a library incompatible with LDF and even have a usable desktop". Well, no kidding... but that's the case already. You can always have libraries in addition to the ones included with LDF, or maybe even have multiple versions of LDF installed. Point is, I think it would be beneficial to have a standard linux desktop base, usable on all major distros, and iterate on that. Now, distros could always package different libraries in addition, in order to have a competitive edge, or do things differently in the rest of the userland, but they would always target a specific version of LDF to stay compatible with. Basically, I think desktop linux needs to agree on a base for a userland, agree to work on it collectively, make sure to keep it as API-compatible as possible going forward (like MS tries to), and iterate it just like we do for distros; it would be the one common thing you could always expect, besides just a 2.6 (or 3.x) kernel. A made-up sketch of what one release could pin down is below.
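    To make that concrete, here's a purely hypothetical sketch of what one LDF release might pin down. None of these version numbers are actual proposals; they only illustrate the idea of distros agreeing on "these specific versions of libraries and tools":

    ```python
    # Hypothetical contents of a single LDF release, expressed as data a
    # vendor's build tooling could consume. All names and versions are
    # invented for illustration.

    LDF_1_0 = {
        "kernel_min": "2.6.32",  # oldest kernel the framework assumes
        "libraries": {
            "glibc": "2.11",
            "libSDL": "1.2.14",
            "alsa-lib": "1.0.23",
            "pulseaudio": "0.9.21",
        },
        "tools": ["gcc", "coreutils", "bash"],
    }
    ```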

    What problems does this actually solve? Well, people with up-to-date distros no longer have compatibility issues as often with software released by third parties, because they should be at least close to the same version of LDF, which should expose an almost completely compatible version of the API the game/software was written against. The fewer things vendors have to worry about, the more likely they are to port/write software for the platform.

    I know what I've said is a complete pipe dream and flies in the face of so much of what traditional linux development has been about. I understand that. However, if desktop linux is ever going to really attract 3rd parties, that's what I think would best benefit it in that regard. I do understand it would probably KILL the way some/many people prefer linux to work.

  • bwat47
    replied
    Originally posted by ean5533 View Post

    *Note to Opera users: It's a hypothetical situation. Quell your hipster rage.
    As an Opera user, this made me lol
