Hello,
I have a cluster with 3 nodes, each now on the latest PVE version.
One was newly installed and joined to the cluster,
one was upgraded from PVE 7.x,
and one was reinstalled from an old version with the same name and IP.
Now, when doing a migration, it usually fails with an SSH error. It only works when migrating to the newly installed node from the other nodes, but not the other way around, and not between the other two nodes either.
The migration SSH error is:
Bash:
2024-09-14 10:33:59 # /usr/bin/ssh -e none -o 'BatchMode=yes' -o 'HostKeyAlias=node1' -o 'UserKnownHostsFile=/etc/pve/nodes/node1/ssh_known_hosts' -o 'GlobalKnownHostsFile=none' root@10.66.5.16 /bin/true
2024-09-14 10:33:59 ssh: connect to host 10.66.5.16 port 22: Connection timed out
2024-09-14 10:33:59 ERROR: migration aborted (duration 00:02:15): Can't connect to destination address using public key
TASK ERROR: migration aborted
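In case it helps, here is a minimal sketch of the checks I'd run next (the hostname node1 and the IP 10.66.5.16 are taken from the log above; as far as I know, pvecm updatecerts is the usual way to refresh the cluster-wide SSH known hosts after a node reinstall):

Bash:
# Reproduce the exact SSH call the migration task runs,
# with the known_hosts file that PVE maintains for the target node:
/usr/bin/ssh -e none -o 'BatchMode=yes' \
    -o 'HostKeyAlias=node1' \
    -o 'UserKnownHostsFile=/etc/pve/nodes/node1/ssh_known_hosts' \
    -o 'GlobalKnownHostsFile=none' \
    root@10.66.5.16 /bin/true

# Check basic reachability of the SSH port, to rule out a
# firewall or routing problem (the timeout suggests one):
nc -zv -w 5 10.66.5.16 22

# Regenerate the node certificates and merge the nodes' SSH keys
# into the shared known_hosts (run on every node in the cluster):
pvecm updatecerts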