Originally posted by aht0
1.
Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".
Counter-examples to this turn up all the time: command.com, all the Unix shells, databases, busybox... Making every program do one thing well can mean a bigger memory footprint and more performance overhead. So following this rule blindly can be completely wrong.
Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.
This can lead to not creating libraries, or not providing IPC interfaces, when you should. That is a memory-footprint and task-scheduling problem that can kill performance. Binary input formats can be far more compact. And "don't insist on interactive input" is part of what has led to some of our security problems. Everything here has to be taken with a grain of salt, because it can lead you to design programs wrongly.
Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.
The first part of this is fine and should be an objective. Take the second part with a strict grain of salt, because at times you have to keep the clumsy parts for application compatibility. So it might pay to add a directive here: spend a little time validating your design idea up front, to avoid the clumsy parts where possible.
Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them
This directive backfires. The first part is kind of OK. But adding tools to a project that you expect to throw out later is a path to documentation hell for future end users.
2.
Make it easy to write, test, and run programs.
This is missing a key point, and it is why we have C. Yes, C makes it fairly easy to write, test and run programs. It also makes it insanely simple to screw them up completely.
Really this needs to be "Make it easy to write, test and run programs that are, by design, low in defects", which is what we are only now starting to go after.
Interactive use instead of batch processing
If you have not noticed, this one conflicts with "Don't insist on interactive input." "Expect the output of every program to become the input to another" is all about batch processing. It is not uncommon for the various writers of the so-called Unix philosophy to hold very different points of view. There is such a thing as too much batch processing, just as there is such a thing as too much interactivity.
Economy and elegance of design due to size constraints ("salvation through suffering").
This led to our buffer-overflow bugs: pointers to data never carried a size value, because a size value would have increased the space required. That unwillingness to waste a little memory and processing power, thanks to these size constraints, is the cause of 80%+ of all reported security bugs. So this idea is not exactly great.
Self-supporting system: all Unix software is maintained under Unix
This one really makes you scratch your head: if you follow that statement, why do we have independent projects at all? It literally says there should be no third-party software. Does that sound like a sane idea to you? It 100 percent does not to me. It was sane at the time, but not now.
We really do need to sit down somehow and write a new set of design rules that everyone agrees on. Historical items like the Unix philosophy have bugs, sometimes unattended ones.