I am playing around with btrfs at the moment. What I haven't understood yet is how to deal with degraded arrays:
First I tried to install PBS in a VM, rootfs on ext4, so nothing special. Installed kernel 5.11 and btrfs-progs from backports.
Made three disks and put them into a raid1 array with btrfs (I really wanted to test this raid1 with unevenly sized disks, nice!).
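For reference, creating the array was roughly this (the device names are just placeholders for my three test disks):

```
# create a btrfs raid1 across three disks (data and metadata mirrored)
mkfs.btrfs -d raid1 -m raid1 /dev/sdb /dev/sdc /dev/sdd

# mount it and check the profile and per-device allocation
mount /dev/sdb /mnt
btrfs filesystem usage /mnt
```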
Detached one of the three disks while the system was running and nothing really happened.
Unlike other implementations, btrfs doesn't seem to have any notifications built in, so I have to monitor myself whether a disk is failing or an array is degraded?
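The closest thing I found is polling the error counters myself, for example (just a sketch, not a finished monitoring setup):

```
# per-device error counters (read/write/flush/corruption/generation errors)
btrfs device stats /mnt

# overview of the devices in the filesystem; a missing disk shows up here
btrfs filesystem show

# --check makes the command exit non-zero if any counter is above 0,
# which should make it usable from a cron job or systemd timer
btrfs device stats --check /mnt
```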
When trying to boot, the system hangs because it can't mount the btrfs volume, even if I put degraded into fstab and use /dev/sdb instead of the UUID.
On the rescue shell I can log in and mount just fine with /dev/sdb and the degraded option.
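In other words, something like this works from the rescue shell, while the equivalent fstab entry does not get me through boot (the mount point is just a placeholder for my test setup):

```
# manual mount in the rescue shell works fine
mount -o degraded /dev/sdb /mnt/datastore

# fstab entry I tried (degraded as a permanent mount option)
/dev/sdb  /mnt/datastore  btrfs  defaults,degraded  0  0
```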
What I didn't manage is putting degraded as a rootflag in grub, which might be what is needed to boot the system in a degraded state.
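From what I've read it should be something like the following (for a btrfs root), but I'm not sure this alone is enough:

```
# /etc/default/grub - pass the degraded mount option to the root filesystem
GRUB_CMDLINE_LINUX_DEFAULT="quiet rootflags=degraded"

# then regenerate the grub config
update-grub
```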
After this I installed pve-7-beta in a VM with three disks as btrfs raid1, directly configured in the installer.
It seems that the installer just creates a root volume for the system, no subvolumes (or only the root subvolume? This is all still very new to me, I don't know if I have the naming right).
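I guess the way to check is something like this (I haven't dug deeper yet):

```
# list all subvolumes below the mounted root; an empty list would mean
# everything lives directly in the top-level volume
btrfs subvolume list /
```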
When detaching a disk here I end up in the initramfs shell and don't really know how to proceed at the moment.
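My assumption would be something along these lines from the initramfs shell, but I haven't verified it:

```
# inside the (busybox) initramfs shell: mount the remaining disk degraded
# onto the place the init scripts expect the root filesystem
mount -o degraded /dev/sdb /root

# then leave the shell so the normal boot can continue
exit
```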
If I want my systems to boot in a degraded state, like other implementations do, should there be a permanent entry to boot in a degraded state?
What have others experienced with btrfs so far?