Configuring SSH keys for many machines

nmue

I would like to find a way to handle the SSH keys for all my KVM machines with a script, ideally something that can be integrated into a GitLab CI pipeline. Currently I configure all my SSH keys through cloud-init and the Proxmox user interface. An ideal solution would also use cloud-init.

I am looking at Ansible and found the following: https://docs.ansible.com/ansible/latest/collections/community/general/proxmox_kvm_module.html

My issue is that it only seems to work for creating new machines, not for updating existing ones. Can someone point me to a way of configuring the SSH keys of my KVM guests with Ansible? In theory I could use an Ansible playbook that updates the keys by logging into the machines directly, but that would mean the Ansible host needs SSH access to every machine as well. I would prefer a solution that adds the keys through Proxmox cloud-init.
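For reference, this is roughly what I was hoping would work (an untested sketch; the API host, node, vmid and key file are placeholders, and I am not sure whether the module's update mode actually covers sshkeys):

```yaml
---
# Sketch: update the cloud-init SSH keys of an *existing* VM via the
# Proxmox API. Assumes API token auth is already set up; the host,
# node, vmid and file names below are placeholders.
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Update cloud-init SSH keys on an existing VM
      community.general.proxmox_kvm:
        api_host: pve.example.com
        api_user: ansible@pve
        api_token_id: gitlab
        api_token_secret: "{{ proxmox_api_token }}"
        node: pve1
        vmid: 100
        update: true
        sshkeys: "{{ lookup('file', 'files/authorized_keys.pub') }}"
```

If something along these lines works, running the playbook from a GitLab CI job would cover the pipeline part.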

Has anyone found a solution to that?
 
AFAIU cloud-init is only run on the first boot? So you would then need another config management tool (Ansible, Salt, Puppet, ...) after that, unless I am wrong and someone has a better idea.
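For the Ansible route, I mean something like this, pushing the key straight into the guests over SSH (the host group, user name and key path are just examples):

```yaml
---
# "Config mgmt after first boot" route: manage authorized_keys inside
# the guests directly instead of via cloud-init. The host group,
# user and key path are placeholders.
- hosts: kvm_guests
  become: true
  tasks:
    - name: Ensure the deploy key is present in authorized_keys
      ansible.posix.authorized_key:
        user: debian
        state: present
        key: "{{ lookup('file', 'files/id_ed25519.pub') }}"
```

That of course requires the Ansible host to reach the guests over SSH, which you said you would rather avoid.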
 
Actually, the public keys can be changed at any time and will be in place after a full stop/start of the machine. This is how I currently do it over the Proxmox UI, but I would like a more elegant solution.
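In the meantime I am considering scripting that same change-then-stop/start routine against the Proxmox node itself, which would at least limit the required access to the hypervisor instead of every guest. Roughly like this (untested; the host group, vmid and key path are placeholders):

```yaml
---
# Scripted version of what I currently do in the UI: write new keys
# into the VM's cloud-init config with qm, then do a full stop/start
# so they are applied on the next boot.
- hosts: pve_nodes
  become: true
  tasks:
    - name: Write the new cloud-init SSH keys into the VM config
      ansible.builtin.command: qm set 100 --sshkeys /root/keys/authorized_keys.pub

    - name: Full stop of the VM (the keys are applied on the next start)
      ansible.builtin.command: qm stop 100

    - name: Start the VM again
      ansible.builtin.command: qm start 100
```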