ssh key problem when adding a node to a cluster

GT1

Aug 10, 2011
Okay, so I have a test cluster with 3 nodes plus a 4th separate machine, all running the latest 2.0 beta and identical in terms of hardware.

When I add the 4th node with pvecm add ipaddress.of.clusternode.1 (while logged in on node 4), the public key of node 4 is added to node 1, but not to the remaining two. Because of this, I can't migrate a VM from node 2 (or 3) to node 4, as the GUI throws an error that it is unable to log on to node 4.
Once I log on to node 2 (and 3) through SSH, run ssh root@ipaddress.of.clusternode.4, and answer yes to the question whether I want to add/trust the key of node 4, everything works fine.
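
For reference, the exact sequence was (the addresses are placeholders for the real IPs):

On node 4:

# pvecm add ipaddress.of.clusternode.1

Then, as a manual workaround, on node 2 and on node 3:

# ssh root@ipaddress.of.clusternode.4

(answer yes to the host key question, then log out again)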

Did I do something wrong somewhere, or is that a bug?

Thank you
 
the public key of node 4 is added to node 1, but not to the remaining 2.

Strange. /root/.ssh/authorized_keys should be a symlink to /etc/pve/priv/authorized_keys, so all nodes should see the same content.

What is the output of

# ls -l /root/.ssh/authorized_keys

and

# ls -l /etc/ssh/ssh_known_hosts
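
On a correctly set-up node I would expect output roughly like this (the ssh_known_hosts target is from memory, so treat it as an assumption; sizes and dates are only illustrative):

# ls -l /root/.ssh/authorized_keys
lrwxrwxrwx 1 root root 29 Aug 10 12:00 /root/.ssh/authorized_keys -> /etc/pve/priv/authorized_keys

# ls -l /etc/ssh/ssh_known_hosts
lrwxrwxrwx 1 root root 25 Aug 10 12:00 /etc/ssh/ssh_known_hosts -> /etc/pve/priv/known_hosts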
 