I would like to find a way to handle the SSH keys for all my KVM machines with a script, ideally something that can be integrated into a GitLab CI pipeline. Currently I configure all my SSH keys through cloud-init and the Proxmox user interface. An ideal solution would also use cloud-init.
I am looking at Ansible and found the following: https://docs.ansible.com/ansible/latest/collections/community/general/proxmox_kvm_module.html
My issue is that it only seems to work for creating new machines, not for updating existing ones. Can someone point me to a way of configuring the SSH keys of my KVM machines with Ansible? In theory I could use an Ansible playbook that updates the keys by logging into each machine, but that would mean the Ansible host needs SSH access to all of them as well. I would prefer a solution that adds the keys through Proxmox cloud-init.
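For reference, this is roughly what I have been experimenting with using the proxmox_kvm module. The API host, user, token, node, VM name, and key path are just placeholders for my setup, and I am not sure the update/sshkeys combination is even supposed to work against an already existing VM:

```yaml
# Sketch of the playbook I have been trying.
# All hosts, credentials, names and paths below are placeholders for my setup.
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Set cloud-init SSH keys on an existing VM
      community.general.proxmox_kvm:
        api_host: pve.example.lan                                  # placeholder Proxmox API host
        api_user: ansible@pve                                      # placeholder API user
        api_token_id: gitlab-ci                                    # placeholder API token id
        api_token_secret: "{{ lookup('env', 'PROXMOX_TOKEN_SECRET') }}"
        node: pve                                                  # placeholder node name
        name: my-vm                                                # placeholder VM name
        update: true                                               # update the existing VM instead of creating a new one
        sshkeys: "{{ lookup('file', 'files/ci_deploy_key.pub') }}" # placeholder public key file
```

The idea would be to run a playbook like this from a GitLab CI job, so the keys only ever get pushed through the Proxmox API and cloud-init, never over SSH to the guests themselves.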
Has anyone found a solution to that?