OpenZFS 2.3-rc3 Adds JSON Output For Commonly Used Commands

  • skeevy420
    Senior Member
    • May 2017
    • 8561

    #11
    Originally posted by stormcrow View Post

    Good luck. I don't believe Michael even uses a spell checker, let alone a grammar checker, despite the ubiquity of the first in nearly every text editor or processor while the latter is almost as easily obtained.
    Phoronix -- Where every article is written on Notepad from Windows 3.1


    • mb_q
      Senior Member
      • May 2017
      • 222

      #12
      Originally posted by archkde View Post

      JSON is not that bad to parse and there are libraries for nearly all languages. Or do you mean that it's funny that this didn't happen before JSON, even though there have been other easy-to-parse formats before?
      Yeah, it just doesn't support integers and binary data (you cannot even represent an arbitrary Linux path in JSON, because JSON requires UTF-8 and Linux doesn't), and there is no standard schema system and often no schema whatsoever, so the conversion still has to be manual. The robust solution for programmable administration is to use OS APIs in your language, not to autodrive CLI tools.
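      To make the path point concrete, here is a small Python sketch (my own illustration, assuming a UTF-8 locale; the file name is made up): a perfectly legal Linux file name that is not valid UTF-8 cannot be emitted as UTF-8 JSON text without some extra convention.

      Code:
      import json, os

      raw_name = b"report-\xff.log"          # a legal Linux file name, but not valid UTF-8
      as_str = os.fsdecode(raw_name)         # 'report-\udcff.log' via surrogateescape (UTF-8 locale)

      doc = json.dumps({"path": as_str}, ensure_ascii=False)
      try:
          doc.encode("utf-8")                # JSON text must be UTF-8; lone surrogates cannot be encoded
      except UnicodeEncodeError as err:
          print("cannot emit as UTF-8 JSON:", err)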


      • intelfx
        Senior Member
        • Jun 2018
        • 1083

        #13
        Originally posted by mb_q View Post
        The robust solution for programmable administration is to use OS APIs in your language, not to autodrive CLI tools.
        Last time someone tried to introduce reasonable OS APIs for Linux, the collective butts got hurt so badly that, more than a decade later, the bitching, whining and moaning shows not a hint of exhaustion.
        Last edited by intelfx; 09 November 2024, 01:37 PM.


        • caligula
          Senior Member
          • Jan 2014
          • 3313

          #14
          Originally posted by archkde View Post
          Or do you mean that it's funny that this didn't happen before JSON, even though there have been other easy-to-parse formats before?
          Yes. E.g. try to create CSV reports with powertop. This is a great example of Unix crap.


          • Old Grouch
            Senior Member
            • Apr 2020
            • 674

            #15
            Originally posted by mb_q View Post

            Yeah, it just doesn't support integers and binary data (you cannot even represent an arbitrary Linux path in JSON, because JSON requires UTF-8 and Linux doesn't), and there is no standard schema system and often no schema whatsoever, so the conversion still has to be manual. The robust solution for programmable administration is to use OS APIs in your language, not to autodrive CLI tools.
            Given those apparent failings of JSON, and in the absence of OS APIs, what other data representation would you suggest?


            • mb_q
              Senior Member
              • May 2017
              • 222

              #16
              Originally posted by Old Grouch View Post
              Given those apparent failings of JSON, and in the absence of OS APIs, what other data representation would you suggest?
              There are APIs: the /sys tree, syscalls and ioctls; most tools whose output you would want as JSON have library versions that wrap them. Or you can use the systemd stack and its D-Bus interfaces. CLI tools should not encourage the bad practice of being driven from other programs.
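              As a rough illustration of that approach (my sketch, not anything from a particular tool): Python's os module exposes statvfs(3) directly, so a script can get disk-usage numbers without scraping df output.

              Code:
              import os

              st = os.statvfs("/")
              free_bytes = st.f_bavail * st.f_frsize    # space available to unprivileged users
              total_bytes = st.f_blocks * st.f_frsize
              print(f"/: {free_bytes / total_bytes:.1%} free ({free_bytes} of {total_bytes} bytes)")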


              • Old Grouch
                Senior Member
                • Apr 2020
                • 674

                #17
                Originally posted by mb_q View Post
                CLI tools should not encourage the bad practice of being driven from other programs.
                There are some people who would argue that that is quite a radical position, given the history of Unix, and the general habit of piping the output of one program into another for further processing.

                Personally, I think it would be useful to have tools be able to provide output in a well-formed, canonical format for use in subsequent processing. Having pretty much agreed on the use of UTF-8, it would be nice if we could agree on a workable (serialised) data representation. JSON is not it, for the reasons others have pointed out.

                I'm certainly not against the use of 'other methods'. Being able to use generic standard APIs for CLI tools would be excellent.
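                For what it's worth, the JSON output this article is about is exactly the sort of thing that makes such subsequent processing less fragile. A hypothetical Python sketch (the -j flag spelling and the "pools" layout are assumptions on my part, not taken from the zpool man pages):

                Code:
                import json
                import subprocess

                # Assumption: OpenZFS 2.3's zpool takes a JSON flag, written here as -j;
                # check zpool-status(8) on your system for the exact option.
                raw = subprocess.run(["zpool", "status", "-j"],
                                     capture_output=True, text=True, check=True).stdout

                status = json.loads(raw)

                # The {"pools": {name: {...}}} layout below is a guess for illustration,
                # not documentation of the real schema.
                for name, pool in status.get("pools", {}).items():
                    print(name, pool.get("state", "?"))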


                • caligula
                  Senior Member
                  • Jan 2014
                  • 3313

                  #18
                  Originally posted by Old Grouch View Post
                  Personally, I think it would be useful to have tools be able to provide output in a well-formed, canonical format for use in subsequent processing. Having pretty much agreed on the use of UTF-8, it would be nice if we could agree on a workable (serialised) data representation. JSON is not it, for the reasons others have pointed out.
                  Those claims are not true. If your data doesn't contain characters outside the ASCII range, JSON output can be pure 7-bit ASCII. There are also schema systems built on top of JSON. E.g. if you define the schema in a high-level typed language, you can derive the serialization/deserialization process from it. JSON serialization also doesn't require a full library; you can build a special-purpose JSON serializer for a command-line tool with just printf and some juggling.
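                  Something like this toy Python version of the same idea (print standing in for printf; illustrative only, a real tool with nested data would still be better off with a proper library):

                  Code:
                  def json_escape(s: str) -> str:
                      out = []
                      for ch in s:
                          if ch == '"':
                              out.append('\\"')
                          elif ch == "\\":
                              out.append("\\\\")
                          elif ord(ch) < 0x20:               # control characters must be escaped
                              out.append(f"\\u{ord(ch):04x}")
                          else:
                              out.append(ch)
                      return "".join(out)

                  def emit_record(fields: dict) -> str:
                      # Flat records only: bare numbers, everything else quoted as a string.
                      parts = []
                      for key, value in fields.items():
                          if isinstance(value, (int, float)) and not isinstance(value, bool):
                              parts.append(f'"{json_escape(key)}": {value}')
                          else:
                              parts.append(f'"{json_escape(key)}": "{json_escape(str(value))}"')
                      return "{" + ", ".join(parts) + "}"

                  print(emit_record({"pool": "tank", "capacity": 42, "health": "ONLINE"}))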


                  • Old Grouch
                    Senior Member
                    • Apr 2020
                    • 674

                    #19
                    Originally posted by caligula View Post

                    Those claims are not true. If your data doesn't contain characters outside the ASCII range, JSON output can be pure 7-bit ASCII. There are also schema systems built on top of JSON. E.g. if you define the schema in a high-level typed language, you can derive the serialization/deserialization process from it. JSON serialization also doesn't require a full library; you can build a special-purpose JSON serializer for a command-line tool with just printf and some juggling.
                    Sigh. Fundamentally, all data can be represented by binary*. Everything else is just a level of abstraction built on top, with compromises built into the abstraction method.
                    The point about UTF-8 is that no-one is seriously proposing a new byte-level representation of character sets. You can point to problems, but the world has voted that it is good enough, bar a few exceptions, such as personal names written in obsolete Chinese characters, or artificially created character sets for academic/hobby-project languages.

                    As for JSON, it does not have distinct types for integers and floating-point values, which is less than useful. Naturally, one can remedy this by designing a higher-level schema that uses workarounds to represent different types in a JSON-serialized way, but you only need to look at the differing implementations, and at the questions on how to achieve what 'ought' to be trivial tasks, to realize that JSON + schema is insufficiently standard, or understood, even now.

                    The fact that people find the need to write articles like this:

                    Investigate JSON's numeric constraints, language inconsistencies, and precision errors, emphasizing Go-JavaScript and IEEE 754 format


                    tells you that being forced to do type conversion to use a data interchange format is insane. A standard data-interchange format needs a far richer set of representations of numeric data than JSON currently has, together with a way of defining exceptions/additions where needed.
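                    A quick Python illustration of the precision point (doubles cannot represent odd integers above 2**53, which is what a JavaScript-style consumer maps every JSON number to):

                    Code:
                    import json

                    big = 2**53 + 1                               # 9007199254740993
                    print(float(big) == float(big - 1))           # True: both round to 9007199254740992.0

                    # Mimic a consumer that maps every JSON number to an IEEE 754 double:
                    print(json.loads(str(big), parse_int=float))  # 9007199254740992.0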

                    JSON is just one of many serialized data-interchange formats (Wikipedia: Comparison of data-serialization formats) - and the mere fact that there is a plethora tells you that JSON is not the agreed 'standard' method. It can be improved upon, and really ought to be.


                    *To be fair, one can argue that Alonzo Church's lambda calculus represents numbers in unary. People in general tend to be more familiar with binary and with Turing machines (which, ironically, often represent numbers in unary in examples) than with the lambda calculus.
                    Last edited by Old Grouch; 11 November 2024, 10:45 AM. Reason: Edit to refer to Church (unary) numerals


                    • caligula
                      Senior Member
                      • Jan 2014
                      • 3313

                      #20
                      Originally posted by Old Grouch View Post

                      Sigh. Fundamentally, all data can be represented by binary. Everything else is just a level of abstraction built on top, with compromises built into the abstraction method.
                      The point about UTF-8 is that no-one is seriously proposing a new byte-level representation of character sets. You can point to problems, but the world has voted that it is good enough, bar a few exceptions, such as personal names written in obsolete Chinese characters, or artificially created character sets for academic/hobby-project languages.
                      OK, the thing I was referring to was 'it just doesn't support integers and binary data (you cannot even represent an arbitrary Linux path in JSON, because JSON requires UTF-8 and Linux doesn't), and there is no standard schema system and often no schema whatsoever, so the conversion still has to be manual' (mb_q's comment). Some consider UTF-8 a problem if they want arbitrary binary output; for instance, file names can be arbitrary byte sequences that are not valid UTF-8.
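                      The usual manual workaround (my illustration, not anything zfs or zpool actually does, as far as I know) is to base64-encode the raw bytes and make that convention part of the schema:

                      Code:
                      import base64, json

                      raw_name = b"snapshot-\xff\xfe.dat"    # arbitrary bytes, not valid UTF-8
                      doc = json.dumps({"path_b64": base64.b64encode(raw_name).decode("ascii")})
                      print(doc)

                      round_tripped = base64.b64decode(json.loads(doc)["path_b64"])
                      print(round_tripped == raw_name)       # True: the exact bytes survive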

                      The fact that people find the need to write articles like this .. tells you that being forced to do type conversion to use a data interchange format is insane. A standard data-interchange format needs a far richer set of representations of numeric data than JSON currently has, together with a way of defining exceptions/additions where needed.
                      Aye, that's bad. However, JSON is quite often used even when numbers need to be reinterpreted or rounded.

                      JSON is just one of many serialized data-interchange formats (Wikipedia: Comparison of data-serialization formats) - and the mere fact that there is a plethora tells you that JSON is not the agreed 'standard' method. It can be improved upon, and really ought to be.
                      Yes, but that page doesn't cover all the issues. E.g. XML-based formats are often considered quite bloated. I haven't studied this topic recently, but Amazon Ion, for example, looks quite interesting, and if it's a superset of JSON, it could act as an upgrade path now that JSON has become pretty ubiquitous.

