Hi
We're testing PVE and Ceph and trying out failure conditions. We're simulating a failed cluster node where we want to reinstall the node and bring up the existing Ceph OSDs. There are a few forum threads and notes about reinstalling a node, but nothing clear or complete that we've been able to find. We want to cover a failed boot device, as well as installing a fresh PVE node in a different chassis and moving the OSD drives over to it.
Is there a documented process for bringing Ceph OSDs back online after reinstalling a node? Specifically, details of what to restore from backup and what to reinitialise? Our attempts so far have resulted in the PVE node back in the cluster, the Ceph monitor and manager running, and the OSDs visible in the UI, but they will not start.
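To make the question concrete, this is roughly the shape of the process we've been assuming on the reinstalled node (the OSD ID below is just an example, and the steps themselves are our guess, which is exactly what we'd like confirmed or corrected):

# after reinstalling PVE and rejoining the cluster, reinstall the Ceph packages
pveceph install

# the OSD data is on LVM, so the existing volumes should still be visible
ceph-volume lvm list

# mount the OSD directories and start the ceph-osd@<id> units for the OSDs found on this node
ceph-volume lvm activate --all

# check whether the OSDs come back up
ceph osd tree
systemctl status ceph-osd@0    # osd.0 as an example ID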
Thanks
David