Bcachefs Under Review With All Known Blockers Resolved


  • #41
    Originally posted by geearf View Post
    (I wish there was a background compression option, meaning when I write something it should be written at full speed with low or no compression, and later, with low priority, the blocks would be rewritten with maximal compression. Though I know of no FS doing that, so it's nothing specific to btrfs.)
    Incidentally, this is known as "tiered storage" and is one of the main planned/designed features of bcachefs.

    (I'm not sure whether the current state of tiering support in bcachefs specifically includes recompression, but I'm sure that it either does or is planned to.)
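    The idea geearf describes can be sketched in a few lines. This is purely illustrative (no real filesystem exposes an API like this; `store`, `write_block`, and `background_recompress` are invented names): the write path uses a cheap compression level, and a later low-priority pass rewrites blocks with a stronger one.

    ```python
    import zlib

    # Toy model: a dict stands in for on-disk blocks.
    store = {}

    def write_block(key: bytes, data: bytes) -> None:
        # Fast path: a low compression level keeps write latency down.
        store[key] = zlib.compress(data, level=1)

    def background_recompress(key: bytes) -> None:
        # Slow path: rewrite the block with maximal compression.
        data = zlib.decompress(store[key])
        recompressed = zlib.compress(data, level=9)
        # Only keep the result if it actually saved space.
        if len(recompressed) < len(store[key]):
            store[key] = recompressed

    data = b"abc" * 10_000
    write_block(b"blk0", data)
    fast_size = len(store[b"blk0"])
    background_recompress(b"blk0")
    slow_size = len(store[b"blk0"])
    assert zlib.decompress(store[b"blk0"]) == data
    assert slow_size <= fast_size
    ```

    A real filesystem would additionally have to track which extents still carry the "cheap" compression and throttle the background pass so it doesn't compete with foreground I/O.
    
    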

    Originally posted by geearf View Post
    Having deduplication not fight defragmentation would be nice, but I don't know if that is actually a reasonable request either.
    Yes, it is a reasonable request; it's just a pretty hard problem to solve with the available manpower. The last person who tried to do this ("this" being snapshot-aware defrag, to be exact) ended up breaking things, and the feature was reverted.
    Last edited by intelfx; 21 December 2020, 04:09 AM.


    • #42
      Originally posted by intelfx View Post

      Incidentally, this is known as "tiered storage" and is one of the main planned/designed features of bcachefs.
      Hmmm, I thought tiered storage was the ability to store data on various drives, with more levels than what bcache offers.

      Originally posted by intelfx View Post
      (I'm not sure whether the current state of tiering support in bcachefs specifically includes recompression, but I'm sure that it either does or is planned to.)
      Kent told me he'd be okay with having that compression feature, no promises on when of course, but that was enough to get me excited.
      I don't believe it was planned before I asked, but I'm happy to be wrong here.

      Originally posted by intelfx View Post
      Yes, it is a reasonable request; it's just a pretty hard problem to solve with the available manpower. The last person who tried to do this ("this" being snapshot-aware defrag, to be exact) ended up breaking things, and the feature was reverted.
      I see, that's too bad...
      Do you expect this to be better with bcachefs?


      A bit unrelated, but since you seem to know about this kind of stuff: would there be a benefit in using masks in deduplication? Meaning not just deduplicating identical blocks, but also blocks that could be transformed into being identical with some simple bit masking.
      Thanks!
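      To make the question concrete, here is one hypothetical reading of "mask-based" dedup (all names here, `MASK`, `dedup_store`, `refs`, are invented for illustration): hash blocks with some bits masked off, share the masked payload, and store the masked-out residue per reference.

      ```python
      # "Don't care" bits: pretend the low 4 bits of each byte are noise.
      MASK = 0xF0

      def masked_key(block: bytes) -> bytes:
          return bytes(b & MASK for b in block)

      def residue(block: bytes) -> bytes:
          return bytes(b & ~MASK & 0xFF for b in block)

      dedup_store = {}   # masked key -> shared payload
      refs = []          # per-block references: (key, residue) pairs

      def store_block(block: bytes) -> None:
          key = masked_key(block)
          dedup_store.setdefault(key, key)  # shared data is the masked block
          refs.append((key, residue(block)))

      def load_block(i: int) -> bytes:
          key, res = refs[i]
          return bytes(s | r for s, r in zip(dedup_store[key], res))

      a = bytes([0x10, 0x21, 0x32, 0x43])
      b = bytes([0x1F, 0x2E, 0x3D, 0x4C])  # identical to a after masking
      store_block(a)
      store_block(b)
      assert load_block(0) == a and load_block(1) == b
      assert len(dedup_store) == 1  # both blocks share one stored payload
      ```

      The catch is visible even in this toy: every reference must carry its residue, so the scheme only saves space if the mask zeroes out most of each block, which may be part of why it's hard to name good candidate files.
      
      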


      • #43
        Originally posted by geearf View Post
        Hmmm, I thought tiered storage was the ability to store data on various drives, with more levels than what bcache offers.


        Kent told me he'd be okay with having that compression feature, no promises on when of course, but that was enough to get me excited.
        I don't believe it was planned before I asked, but I'm happy to be wrong here.
        Hm. Then it may be wishful thinking on my part. I definitely remember that the bcachefs roadmap included different redundancy for tiers, and I probably extended that to compression in my mind.


        Originally posted by geearf View Post
        I see, that's too bad...
        Do you expect this to be better with bcachefs?
        No idea. Kent wrote that he had some sort of novel idea for snapshots, but they are not there yet.

        In fact, everything is perfectly doable. It just needs someone motivated enough to do it and see the task through to the end.

        Originally posted by geearf View Post
        A bit unrelated, but since you seem to know about this kind of stuff: would there be a benefit in using masks in deduplication? Meaning not just deduplicating identical blocks, but also blocks that could be transformed into being identical with some simple bit masking.
        Thanks!
        Eh... I don't think it makes a lot of sense. What files do you think would be good candidates for such mask-based deduplication?


        • #44
          Originally posted by intelfx View Post
          Eh... I don't think it makes a lot of sense. What files do you think would be good candidates for such mask-based deduplication?
          No idea
