Steam Survey Reports The Latest Linux Gaming Marketshare For October
-
Originally posted by oiaohm View PostSorry, a user cannot opt in if you do not provide the file. I did not say that the user did not opt in for it. But when you are putting together the packages you are going to offer to the user as options, you can intentionally be building debugging packages to ship to users who wish to opt in to solve their problems.
Originally posted by oiaohm View PostNo symbol conflict, but it still fails to run correctly and still crashes. gtk3_init will bail out.
Originally posted by oiaohm View PostWhat a correct setup should look like is one of the following:
1) Linux app->libgtk2-filter->gtk2->libfoo->gtk3
2) Linux app->gtk2(versioned)->libfoo->gtk3
No symbol conflict, but it still crashes like on Windows.
You realize gtk was just an example, right? Why are you arguing about gtk's library design itself?!?
BTW I still haven't had problems with gtk apps on Windows, so idk, maybe it doesn't always trigger. I don't care enough about this topic to continue it though; it has nothing to do with the thread, and you fixate on an EXAMPLE (which happened to be gtk).
Originally posted by oiaohm View PostSo when you perform a new in C++ you are performing a wrapper over malloc on those platforms. C++ delete is just a wrapper over free.
Weasel, as I said, using the global namespace I can replace malloc in the C++ runtime on Linux. Because it really does use the platform libc malloc and free, or whatever the current global is.
Originally posted by oiaohm View PostSo you have libc++ and libstdc++ in one program and they will be using the same memory allocation system, so when you share pointers they work. Yes, if you allocate something that is not a class with new in libstdc++, you can use delete on it in libc++ under Linux with no problems. This is using two different C++ runtimes by two completely different developers. Yes, you do see libc++ statically linked into some libraries.
Stop treating malloc/free as special. You have a fucking fixation on them. Symbol conflicts are more than just your fucking libc which you consider "global" and "special". It is EVERYWHERE. Global namespace is EVERYWHERE not just glibc.
Comment
-
Originally posted by Weasel View PostI'm not sure what your point is. There are plenty of ways to debug "debug build" apps: function relays, etc. Because, since they are debug builds, they are MADE to be debugged. You do *NOT* need a built-in DEFAULT FEATURE for that. Your excuse is that ELF helps with debugging. Remind me why the FUCK a debug-only feature is the god damn DEFAULT for NON-DEBUG BUILDS. It shouldn't even be part of the loader.
http://engineering.appfolio.com/appf...alloc-jemalloc
For this kind of stuff you don't need to change the binaries; you are depending on globals. The reason why the Visual Studio runtimes have different malloc/free implementations is that they went after performance. Yet there is no framework to plug different versions in across all Visual Studio versions.
Originally posted by Weasel View PostThis has nothing to do with DLLs, then, but with a shit library design. You can make a library that calls ExitProcess or exit; that doesn't mean that DLLs/ELF crash instantly and suck.
Originally posted by Weasel View PostYou realize gtk was just an example, right? Why are you arguing about gtk's library design itself?!?
Originally posted by Weasel View PostBTW I still haven't had problems with gtk apps on Windows, so idk, maybe it doesn't always trigger. I don't care enough about this topic to continue it though; it has nothing to do with the thread, and you fixate on an EXAMPLE (which happened to be gtk).
Originally posted by Weasel View PostThere's literally no guarantee they are a simple wrapper over malloc/free, not in the language standard nor anywhere else.
So your "nor anywhere" is wrong. You would have a POSIX non-conforming C++ runtime if you don't use the normal malloc, realloc and free. This is one of the differences you have to be aware of when you talk about Linux or Unix or OS X... Most platforms follow the POSIX standard's guidance on memory to maintain memory sanity.Last edited by oiaohm; 10 November 2018, 03:30 PM.
Comment
-
Originally posted by Weasel View PostWell, I wasn't talking about browsers tho. Most apps aren't even online nor need online access, but of course, that's on the desktop not some shitty mobile environment.
Wayland's restrictions came about because the X11 SECURITY extension flopped for being opt-in (eg. Chrome won't run with it enabled) while Microsoft and Apple are trying to extend the Android and iOS security model to desktop applications. (That's why Wayland is a key component of Flatpak's security model.)
The shift from X11's "enable everything and let the clients sort it out" model to Wayland's "only add new APIs after we're confident that they're safe" approach is basically the same thing as the shift from Mozilla's old XUL-based APIs to Chrome's APIs. (With Firefox being KDE and working hard to extend the design with new APIs everyone wants that GNOME/Chrome isn't interested in offering.)
TL;DR: Wayland is part of a push to remain relevant as the rest of the ecosystem changes people's expectations for how easy and safe it should be to install an application. (Do you want Linux to be the Bugzilla of platforms, scaring off potential bug-reporters by offering up user account e-mails in completely un-protected mailto: links as if it's still 1999?)Last edited by ssokolow; 10 November 2018, 04:22 PM.
Comment
-
Originally posted by ssokolow View PostWayland's restrictions came about because the X11 SECURITY extension flopped for being opt-in (eg. Chrome won't run with it enabled) while Microsoft and Apple are trying to extend the Android and iOS security model to desktop applications. (That's why Wayland is a key component of Flatpak's security model.)
Such security is better served by unplugging your internet or not turning your PC on at all IMO.
Originally posted by ssokolow View PostTL;DR: Wayland is part of a push to remain relevant as the rest of the ecosystem changes people's expectations for how easy and safe it should be to install an application. (Do you want Linux to be the Bugzilla of platforms, scaring off potential bug-reporters by offering up user account e-mails in completely un-protected mailto: links as if it's still 1999?)
That's what services like virustotal are for, and if you don't trust those either (because they often miss new malware) then install in a Virtual Machine. That's what they are for.
Comment
-
Originally posted by oiaohm View PostIf your program wants to run an optimised malloc/realloc/free combination, that is another reason why having the default global functions is good.
http://engineering.appfolio.com/appf...alloc-jemalloc
For this kind of stuff you don't need to change the binaries; you are depending on globals. The reason why the Visual Studio runtimes have different malloc/free implementations is that they went after performance. Yet there is no framework to plug different versions in across all Visual Studio versions.
When an application is built, it should be perfectly known what the target environment should be like.
Changing the runtime of an application to what it doesn't expect as a DEFAULT FEATURE is simply insanity, the purest definition of it.
I mean, you can STILL do it even without this insane feature, by going out of your way and hooking all the imports. This is possible on Windows also. You have specialized hooks and DLL injections that do the job. That's sane because the user has to go out of his way to do it; it's not THE DEFAULT. It's not built into the loader. It's something the user has to do (or a malware behind his back) so it makes it SANE.
But... loading differently based on an environment variable... that the app wasn't designed for... seriously, no words.
Originally posted by oiaohm View PostThat is the catch. It is in a standard: the POSIX platform standard.
Most of POSIX is insane and riddled with trash design, much worse than the Win API is.
Every platform you listed adheres to POSIX because of common history in Unix.Last edited by Weasel; 11 November 2018, 08:59 AM.
Comment
-
Originally posted by Weasel View PostOn the contrary it flopped because it was too restrictive. It literally disables even hardware acceleration of any kind (so that apps don't exploit some GPU bugs or do a denial of service). It's beyond retarded. So this only strengthens my point.
Originally posted by Weasel View PostSuch security is better served by unplugging your internet or not turning your PC on at all IMO.
That's why security has to be by-default. Otherwise, the cost-benefit trade-off prompts a lot of people to procrastinate or gamble on not needing it.
Originally posted by Weasel View PostIf you don't trust an application then don't install it on your machine it's really that simple.
Originally posted by Weasel View PostThat's what services like virustotal are for, and if you don't trust those either (because they often miss new malware) then install in a Virtual Machine. That's what they are for.Last edited by ssokolow; 11 November 2018, 09:55 AM.
Comment
-
Originally posted by Weasel View PostI mean, you can STILL do it even without this insane feature, by going out of your way and hooking all the imports. This is possible on Windows also. You have specialized hooks and DLL injections that do the job. That's sane because the user has to go out of his way to do it; it's not THE DEFAULT. It's not built into the loader. It's something the user has to do (or a malware behind his back) so it makes it SANE.
But... loading differently based on an environment variable... that the app wasn't designed for... seriously, no words.
Funny. Usually I hear people talking about how POSIX got it right compared to Windows... though they're usually talking about how AppInit_DLLs is a global setting, defined in the registry, while LD_PRELOAD is something you can set in wrapper scripts and which integrates well with existing best practices for preventing problems caused by environment variables.
(There are quite a few environment variables on Windows or POSIX which can cause crazy breakages if you muck with them willy-nilly. LD_PRELOAD isn't special in that regard.)
Comment
-
Originally posted by ssokolow View PostFunny. Usually I hear people talking about how POSIX got it right compared to Windows... though they're usually talking about how AppInit_DLLs is a global setting, defined in the registry, while LD_PRELOAD is something you can set in wrapper scripts and which integrates well with existing best practices for preventing problems caused by environment variables.
(There are quite a few environment variables on Windows or POSIX which can cause crazy breakages if you muck with them willy-nilly. LD_PRELOAD isn't special in that regard.)
But what I was referring to more is API design. POSIX is simply worse in many aspects compared to the Win32 environment; here are a few:
- POSIX specifies special rules for C language runtimes, instead of having them as "just another library" (exceptions always suck and are less elegant, not to mention the fact they favor a specific language's runtime here, C). Meanwhile on Windows land, the runtimes are "just another library", nothing special.
- Just compare the mess that is pthreads with the elegance of Windows APIs dealing with threads and other objects; you don't even need specialized functions for each, since they're all objects in Windows.
But you know what's the best thing about the Windows API compared to POSIX? Win API doesn't claim itself as the "one sole truth of how computing should be done" unlike POSIX fanboys who think POSIX is a gift from God and its word is law.
Comment
-
Originally posted by Weasel View PostChanging the runtime of an application to what it doesn't expect as a DEFAULT FEATURE is simply insanity, the purest definition of it.
Originally posted by Weasel View PostI mean, you can STILL do it even without this insane feature, by going out of your way and hooking all the imports. This is possible on Windows also. You have specialized hooks and DLL injections that do the job. That's sane because the user has to go out of his way to do it; it's not THE DEFAULT. It's not built into the loader. It's something the user has to do (or a malware behind his back) so it makes it SANE.
https://lwn.net/Articles/691932/ This is only the tip of a very big iceberg. If you are using Clear Linux OS from Intel, the path your application loads libraries from alters based on the CPU. Yes, Clear in fact depends on the ability to stack libraries.
The default loader contains more than just LD_PRELOAD, since Clear from Intel added the means to provide override libraries for performance reasons. Yes, the ability to force a particular version of a library to be loaded is useful. It's not like, when you make a program, you can build it with the existing run-time so that it performs the best on all future CPUs.
Originally posted by Weasel View PostEvery platform you listed adheres to POSIX because of common history in Unix.
Originally posted by Weasel View PostBut AppInit_DLLs are not part of any API or loader, they were part of user32, and now don't even work anymore. That's not the kind of injection I was thinking of (also, to change those keys AFAIK you needed root/admin access as well; LD_PRELOAD not so much).
Originally posted by Weasel View PostPOSIX specifies special rules for C language runtimes, instead of having them as "just another library" (exceptions always suck and are less elegant, not to mention the fact they favor a specific language's runtime here, C). Meanwhile on Windows land, the runtimes are "just another library", nothing special.
Originally posted by Weasel View PostJust compare the mess that is pthreads with the elegance of Windows APIs dealing with threads and other objects; you don't even need specialized functions for each, since they're all objects in Windows.
There is work at the moment under Linux to alter signals to use file handles instead of PIDs. Every time you say "object" on Windows, on a Unix system you should say "file", and where POSIX is not using a file for something, at some point someone will have to redesign it.
Comment