Linux To Try To Opportunistically Initialize /dev/urandom

  • Linux To Try To Opportunistically Initialize /dev/urandom

    Phoronix: Linux 5.19 To Try To Opportunistically Initialize /dev/urandom

    Linux 5.18 is bringing many random/RNG improvements thanks to the work of kernel developer Jason Donenfeld. One of the changes though that had to be backed out during the merge window was trying to get /dev/random and /dev/urandom to behave exactly the same. While reverted for now with the 5.18 code, Donenfeld has prepared a change that should get it into good shape for major architectures with the next kernel cycle...

    https://www.phoronix.com/scan.php?pa...nistic-urandom
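    For readers curious about the user-visible gap the patch is closing, here is a minimal sketch (Python on Linux; the `rng_ready` helper is ours for illustration, not part of the patch): getrandom(2) with no flags waits until the kernel RNG is seeded, while a /dev/urandom read has historically returned bytes even before seeding. Opportunistic initialization at boot would make the two behave the same in practice.

```python
import os

def rng_ready() -> bool:
    """Probe whether the kernel entropy pool is initialized, without blocking."""
    try:
        # GRND_NONBLOCK makes getrandom(2) raise instead of waiting
        # when the pool has not been seeded yet.
        os.getrandom(1, os.GRND_NONBLOCK)
        return True
    except BlockingIOError:
        return False

print(rng_ready())           # True on any system that has been up a while
print(len(os.urandom(16)))   # urandom-style read, never blocks: 16
```

    On a machine that has been running for more than a few seconds this always prints True; the unseeded case is only reachable very early at boot, which is exactly the window the opportunistic-initialization change targets.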

  • #2
    I read that as optimistically. Like, is the kernel gonna put on 37 pieces of flair before asking for more entropy?

    Also, typo: /dev/urandom wokr had to



    • #3
      Originally posted by skeevy420
      Like, is the kernel gonna put on 37 pieces of flair before asking for more entropy?
      The minimum to boot is 15 pieces of flair, but look... Brian's kernel over there has 37 pieces of flair and a terrific smile. People can get a little bit of entropy anywhere these days, but the reason they come into Phoronixkies is for the attitude. And that's what our flair is about. It's about entropic fun. Now if you feel the bare minimum of 15 pieces of flair is enough then okay, but you do want the RNG subsystem to express itself, don't you?



      • #4
        Originally posted by zx2c4
        The minimum to boot is 15 pieces of flair, but look... Brian's kernel over there has 37 pieces of flair and a terrific smile. People can get a little bit of entropy anywhere these days, but the reason they come into Phoronixkies is for the attitude. And that's what our flair is about. It's about entropic fun. Now if you feel the bare minimum of 15 pieces of flair is enough then okay, but you do want the RNG subsystem to express itself, don't you?
        Lol.

        Does that mean that failing to properly seed the RNG subsystem will get you a:
        "Damn it feels good to be a gangsta" from urandom?



        • #5
          Originally posted by milkylainen

          Lol.

          Does that mean that failing to properly seed the RNG subsystem will get you a:
          "Damn it feels good to be a gangsta" from urandom?
          It'll just put a decimal in the wrong place or something. Shit. It's always messing up some mundane detail.



          • #6
            Originally posted by skeevy420

            It'll just put a decimal in the wrong place or something. Shit. It's always messing up some mundane detail.
            Oh! Well. This is not a mundane detail, Michael!



            • #7
              Originally posted by skeevy420
              I read that as optimistically. Like, is the kernel gonna put on 37 pieces of flair before asking for more entropy?

              Also, typo: /dev/urandom wokr had to
              urandom has just gone woke. First woke, then wok(e)r.



              • #8
                Hello and Aloha,

                I was already convinced that both /dev/{u,}random perform the same task as non-blocking character devices, and the throughput was exceptionally high: with some rise-and-fall fluctuation I achieved 110 MB/s even on a system with no hardware cryptographic routines. I have some thoughts about that, and about the current state of the art in harvesting system 'jitter' and other 'entropy' sources, which I presumed feed a (re)seeding scheme that mixes the 'entropy' and produces output through high-quality cryptographic re-hashing.
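                A throughput figure of that kind can be reproduced with a few lines of Python (a rough sketch; `urandom_throughput` is our own helper, and the absolute number depends heavily on kernel version and CPU, since modern /dev/urandom output comes from a ChaCha20-based generator rather than a hardware crypto unit):

```python
import time

def urandom_throughput(total_mb: int = 64, chunk: int = 1 << 20) -> float:
    """Read total_mb megabytes from /dev/urandom and return MB/s."""
    start = time.perf_counter()
    with open("/dev/urandom", "rb", buffering=0) as f:
        for _ in range(total_mb):
            f.read(chunk)  # one 1 MiB read per iteration
    return total_mb / (time.perf_counter() - start)

print(f"{urandom_throughput():.1f} MB/s")
```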

                Introducing /dev/entropy and 2nd law order rising Statistical Entropy, QKD preparedness
                I recently wrote an article about a non-blocking interactive character device, /dev/entropy, that would provide an accounting and [re]seeding facility configurable for different re-seeding strategies, serving 'entropy' from isolated 'pools of entropy' (short-lived TTL ring buffers) to generators. The idea is to de-correlate these and other sources as much as possible so as to further harden the system. Such a facility would aid the development and upgrade of current generators, but I wanted to do more with the device, because I expect NIST to have new conformance recommendations ready for the existing 'binary alphabet'-bound methods.
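                To make the pool idea concrete, here is a toy Python sketch of one such isolated pool (all names hypothetical; nothing like the kernel's actual implementation): a short-lived TTL ring buffer that serves seed material to a consumer and discards chunks once served or expired.

```python
import os
import time
from collections import deque

class EntropyPool:
    """Toy 'pool of entropy': a TTL-bounded ring buffer of seed chunks."""

    def __init__(self, capacity: int = 32, ttl: float = 5.0):
        self.buf = deque(maxlen=capacity)  # ring buffer: oldest chunks fall off
        self.ttl = ttl                     # seconds before a chunk expires

    def feed(self, chunk: bytes) -> None:
        """Mix a fresh chunk (e.g. jitter samples) into the pool."""
        self.buf.append((time.monotonic(), chunk))

    def draw(self, n: int) -> bytes:
        """Serve n bytes of seed material from unexpired chunks."""
        now = time.monotonic()
        live = b"".join(c for t, c in self.buf if now - t < self.ttl)
        if len(live) < n:
            raise RuntimeError("pool exhausted: re-seed before drawing")
        self.buf.clear()  # single-use: discard served material
        return live[:n]

pool = EntropyPool()
pool.feed(os.urandom(16))
pool.feed(os.urandom(16))
print(len(pool.draw(24)))  # -> 24
```

                Keeping each pool single-use and time-bounded is what gives the de-correlation property the post describes: no consumer ever sees seed material that another consumer, or a stale epoch, has already observed.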

                I envisioned the /dev/entropy device having an interactive stream designer, enabling (new and existing) generators to produce delta-streams in secured output. This could be a significant gain for QKD preparedness and development, as delta-streams may serve 'symbol'-space expansion and provide 'stream' symbols to control multiplexing (by wavelength division) on the transport layer.

                I also worked on a method and model to have these definitions and configurations encapsulated as highly secured system assets, of which selected ones can be made persistent in a secure function domain. Inspired by Covid-19, with a hypothesis of how Covid has this 'offset' memory. Silly, eh?

                Well, I hope to hear what you think.
                Thank you for your time and work.
                Regards, Edwin
