Hi folks
We have a cluster consisting of 14 nodes.
Over time, for various reasons, I had to remove 3 of these nodes, reinstall Proxmox on them, and bring them back into the cluster.
The issue now is that none of my nodes can SSH to the nodes that have been reinstalled.
They find a duplicate fingerprint in their known_hosts.
I have run "pvecm updatecerts" on the reinstalled nodes, but nothing changed.
The only workaround is to manually remove those lines from the other nodes' known_hosts files, but if any of them gets rebooted, the problem comes back.
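For reference, the manual cleanup I do looks roughly like this (the hostname is just a placeholder, and I'm assuming the stale entry sits in root's known_hosts; Proxmox also keeps a cluster-wide copy in /etc/pve/priv/known_hosts, so that may need the same treatment):

# drop the stale key for the reinstalled node from root's known_hosts
ssh-keygen -R reinstalled-node -f /root/.ssh/known_hosts
# the cluster-wide known_hosts may hold a copy as well
ssh-keygen -R reinstalled-node -f /etc/pve/priv/known_hosts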
Another issue is that whenever I add a new node, the other nodes see it as offline and cannot migrate any VM to it. I have to restart the corosync and pve-cluster services, which worries me a lot, because during the restart that node appears as offline in the GUI, and I don't know whether the VMs will experience any interruption.
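To be clear, the restart I mean is just this, run on the nodes that show the new member as offline (this is simply what I have been doing, not necessarily the correct procedure):

systemctl restart corosync
systemctl restart pve-cluster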
So could you help me get the cluster back into a consistent state?