Managing Multiple Hosts

Brian_

Mar 11, 2016
In time, I will have possibly 50 Proxmox hosts, each in its own building spread throughout the country. They will be on the same network and pingable. When the servers are set up, they will all have the same hardware and the same VMs running on them, as each location has the same needs. After deployment, if a new server is needed at each site, I know I can copy the conf file and the disk file to each server. Using scp for one or two servers would be fine, but not for 50. What is the best way to duplicate a server to 50 sites? Will setting up all the hosts in a cluster allow me to do this? Simultaneously, or just one at a time?
 
I have not done this before, but I think you could build something with multicast: you stream the data not to one host (unicast) but to multiple nodes at once, similar to the internal Proxmox cluster communication.

But what is the problem with writing a script that does all the work for you and just starting it for each new server? You do not have to use scp; e.g. if you're on ZFS on all boxes, you can send/receive your data around, or use rsync to copy the rest over to the new hosts.
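To make the ZFS + rsync idea concrete, here is a rough sketch. The dataset name, target host, and config path are placeholders (not from this thread), and the `DRY_RUN=echo` guard makes the script print the commands instead of running them, so you can check them first:

```shell
# Sketch only: dataset and host names (vm-105-disk-1, host2) are placeholders.
# DRY_RUN=echo prints each command; set DRY_RUN= to actually execute them.
DRY_RUN=echo
SNAP="rpool/data/vm-105-disk-1@deploy"

$DRY_RUN zfs snapshot "$SNAP"
# stream the snapshot straight into zfs receive on the target host
$DRY_RUN sh -c "zfs send $SNAP | ssh root@host2 zfs receive rpool/data/vm-105-disk-1"
# copy the VM config file alongside it
$DRY_RUN rsync -a /etc/pve/qemu-server/105.conf root@host2:/etc/pve/qemu-server/
```

Wrap the whole thing in a loop over your host list and one script handles every new deployment.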
 
My original thought was Puppet, but I didn't really want to install Puppet on the Proxmox host itself. My thought was that keeping the host clean would be much better in case I need support from Proxmox or the community. I was thinking that if I had a VM on each system with a clean OS and the Puppet client, it could pull files from a Puppet master and save them to a mount point on the Proxmox host.

As far as scripting goes (without Puppet), I know I can write a script on my computer to copy files to each host which would look something like:

Code:
scp /images/105/disk1.qcow user@host1:/mnt/images/105/
scp /images/105/disk1.qcow user@host2:/mnt/images/105/

If I did this, then it would be sequential.

I could also write a script that copies a second script to each machine; that copied script would be the same on every machine:

Code:
scp user@masterhost:/mnt/images/105/disk1.qcow /images/105/

If I did that, is there a way to easily run that script on each machine at the same time? Could I write a script that does something similar and schedule it as a cron job?
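For what it's worth, plain shell can already run the copy on all machines at once by backgrounding each job and waiting for them all. A sketch, where `pull_image` and the host names are stand-ins (in practice it would be the ssh/scp call above):

```shell
#!/bin/sh
# Sketch: pull_image stands in for the real remote call, e.g.
#   ssh root@"$1" 'scp user@masterhost:/mnt/images/105/disk1.qcow /images/105/'
pull_image() {
    echo "image pulled on $1"
}

for h in host1 host2 host3; do
    pull_image "$h" > "pull_$h.log" 2>&1 &   # background: all hosts run at once
done
wait    # return only after every background job has finished
echo "all hosts done"
```

The per-host log files let you check afterwards which transfers succeeded.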

I am not sure what LnxBil means in regard to multicasting. I know what multicasting is, but how would you set that up here?

A quick Google search discussed using BitTorrent to do it. This sounds like a good idea, but I am not sure how you would shut down the BitTorrent client once everyone had the file, or how you would even know they have it. I know there is also BitTorrent Sync, which I have never used. Does anyone have thoughts on it, or am I better off with scripting or Puppet?
 
So I am not an expert, but one possibility does occur to me.

If you write a general script that handles configuration management (I would prefer the Ansible way: gather facts first, then decide on actions) and distributes files using ssh/scp/rsync, you do not have to perform a sequential loop over each of your 50 hosts. You can batch up the requests using GNU parallel.

https://en.wikipedia.org/wiki/GNU_parallel

That way you get configurable concurrency and can have separate file output of stdout+stderr for _each_ transaction.

Just my 2c ;-)
 
