How to update public SSH keys for Proxmox nodes

mikeleffring

Hello all. I'm 98% out of the weeds here but can't cross the finish line. I have a 7-node Proxmox VE cluster, and I had to do a ton of hardware upgrades (memory, OS and Ceph data SSDs, NIC card change) as well as configure redundant networking, as I take this cluster into use soon. In the process I decided it would be best to get a clean slate, so I researched, read a bunch of Proxmox forum posts, and got all 7 nodes reloaded and configured with LACP NICs, identical SSD Ceph storage, VLANs, etc. It's great! The last leg of this is that the "cluster" is still around from the old days. Which is fine; I haven't had issues, everything is healthy, and I did a ton of cleanup.

The issue is that anything involving public SSH keys (like live migrations) fails because the public keys have changed. I know this is not Proxmox-specific and is just how SSH works in general; when I provisioned the new nodes, the keys obviously changed. I understand Debian pretty well at its roots, but I've never been too strong when it comes to SSH/RSA keys. I was hoping to get some community guidance on how to, via the GUI or CLI:

1. Wipe out all 7 nodes' records of the other nodes' public keys.
2. Add the new public keys for each node onto every node in the cluster.
3. Any Proxmox-specific steps, tips, or tricks when it comes to SSH/RSA keys.

Also, if anyone just wants to lecture me on SSH public keys in general, I will listen, as again I feel this is one of my weakest areas in Linux, when it shouldn't be, since I'm told it's supposed to be simple. Thanks in advance!
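
For anyone skimming, here is my rough understanding of the moving pieces so far, with the standard OpenSSH commands I've come across while reading (paths are the Debian defaults, so please correct me if Proxmox does this differently):

# each node has its own host key pairs under /etc/ssh/
ls /etc/ssh/ssh_host_*_key*

# show the fingerprint of a node's current RSA host key
ssh-keygen -lf /etc/ssh/ssh_host_rsa_key.pub

# clients remember host keys they have already accepted in known_hosts;
# check where the system-wide file actually lives (it may be a symlink)
ls -l /etc/ssh/ssh_known_hosts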
 
Sorry for leaving out the actual errors, but I also assumed they were straightforward :)

2024-01-30 07:35:04 # /usr/bin/ssh -e none -o 'BatchMode=yes' -o 'HostKeyAlias=pve-a' root@X.X.X.X /bin/true
2024-01-30 07:35:04 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
2024-01-30 07:35:04 @ WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED! @
2024-01-30 07:35:04 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
2024-01-30 07:35:04 IT IS POSSIBLE THAT SOMEONE IS DOING SOMETHING NASTY!
2024-01-30 07:35:04 Someone could be eavesdropping on you right now (man-in-the-middle attack)!
2024-01-30 07:35:04 It is also possible that a host key has just been changed.
2024-01-30 07:35:04 The fingerprint for the RSA key sent by the remote host is
2024-01-30 07:35:04 SHA XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
2024-01-30 07:35:04 Please contact your system administrator.
2024-01-30 07:35:04 Add correct host key in /root/.ssh/known_hosts to get rid of this message.
2024-01-30 07:35:04 Offending RSA key in /etc/ssh/ssh_known_hosts:1
2024-01-30 07:35:04 remove with:
2024-01-30 07:35:04 ssh-keygen -f "/etc/ssh/ssh_known_hosts" -R "pve"
2024-01-30 07:35:04 Host key for pve has changed and you have requested strict checking.
2024-01-30 07:35:04 Host key verification failed.
2024-01-30 07:35:04 ERROR: migration aborted (duration 00:00:01): Can't connect to destination address using public key
TASK ERROR: migration aborted

I removed the old key with the "remove with:" line above and then tried to migrate again, but got this:

2024-01-30 08:27:53 # /usr/bin/ssh -e none -o 'BatchMode=yes' -o 'HostKeyAlias=pve' root@X.X.X.X /bin/true
2024-01-30 08:27:53 Host key verification failed.
2024-01-30 08:27:53 ERROR: migration aborted (duration 00:00:00): Can't connect to destination address using public key
TASK ERROR: migration aborted

Hence the desire to ask for help and a clean process from someone more experienced with Proxmox and Debian SSH public keys, so I don't half-fix it or do the wrong cleanup. Hopefully I'll learn more about it along the way as well.
 
The easiest way is probably to "mv /etc/pve/priv/known_hosts /etc/pve/priv/known_hosts.bak" (to clear the existing host keys) and then run "pvecm updatecerts" on each node.
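
Roughly, as a sketch (assuming the standard PVE layout; adjust paths/hostnames as needed):

# clear the cluster-wide known_hosts once
# (/etc/pve is the shared cluster filesystem, so all nodes see the same file)
mv /etc/pve/priv/known_hosts /etc/pve/priv/known_hosts.bak

# then on every node, regenerate the node files and re-merge the
# SSH authorized keys and known hosts
pvecm updatecerts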
 
So it worked on 1 of the 7 nodes. The mv command worked (I checked afterwards on all 7), and the output of pvecm updatecerts was the same on all 7 as well:

(re)generate node files
merge authorized SSH keys and known hosts

And yet when I try migrations, I can only migrate off of node a. That seems to be the only one it worked on. All the others still say:

2024-01-30 09:09:50 # /usr/bin/ssh -e none -o 'BatchMode=yes' -o 'HostKeyAlias=pve' root@X.X.X.X /bin/true
2024-01-30 09:09:50 Host key verification failed.
2024-01-30 09:09:50 ERROR: migration aborted (duration 00:00:00): Can't connect to destination address using public key
TASK ERROR: migration aborted

Which I guess is progress: before, it thought they were attacks, and now it just doesn't have the key. Is rebooting the nodes necessary?
 
@manzke89 Based on your link to the other post, you say to run these:

ssh-keygen -f "/etc/ssh/ssh_known_hosts" -R "node1"

/usr/bin/ssh -e none -o 'HostKeyAlias=node1' root@192.168.20.117 /bin/true

In my case, with all but 1 node broken, would I be doing the following?

Running this command once on each node:

ssh-keygen -f "/etc/ssh/ssh_known_hosts" -R "node1"

Then running this command on each node once per other host, so on node 1 of the 7 nodes it will run 6 times, with the 6 different IPs and node names:

Run on node 1:
/usr/bin/ssh -e none -o 'HostKeyAlias=node2' root@X.X.X.2 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node3' root@X.X.X.3 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node4' root@X.X.X.4 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node5' root@X.X.X.5 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node6' root@X.X.X.6 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node7' root@X.X.X.7 /bin/true

Run on node 2:
/usr/bin/ssh -e none -o 'HostKeyAlias=node1' root@X.X.X.1 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node3' root@X.X.X.3 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node4' root@X.X.X.4 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node5' root@X.X.X.5 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node6' root@X.X.X.6 /bin/true
/usr/bin/ssh -e none -o 'HostKeyAlias=node7' root@X.X.X.7 /bin/true

And so on...

Then this once on each node:

"mv /etc/pve/priv/known_hosts /etc/pve/priv/known_hosts.bak"

Then this once on each host:

pvecm updatecerts

Maybe?
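
Put together, a rough sketch of what I think that plan looks like end to end (node names and X.X.X.x IPs are placeholders for my actual 7 nodes; untested, so please correct anything that's off):

# 1) on every node: drop the stale entries for the other nodes
for n in node1 node2 node3 node4 node5 node6 node7; do
    ssh-keygen -f "/etc/ssh/ssh_known_hosts" -R "$n"
done

# 2) on every node: re-accept the current host key of each of the
#    other 6 nodes (verify the fingerprint before answering "yes")
/usr/bin/ssh -e none -o 'HostKeyAlias=node2' root@X.X.X.2 /bin/true
# ... repeat for node3 through node7 ...

# 3) once: clear the cluster-wide known_hosts
mv /etc/pve/priv/known_hosts /etc/pve/priv/known_hosts.bak

# 4) on every node: regenerate node files and re-merge keys
pvecm updatecerts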
 