Setting up a cluster

zordio
Jun 22, 2009
I have two servers which I'm trying to set up as a cluster. server1 is pve-manager/1.3/4023; server2 is pve-manager/1.4/4403. I have created the master on server1 (which contains all of the VMs) and attached server2 as a node. However, 'pveca -l' run on server1 gives the following for server2: 'ERROR: 400 Invalid SOAPAction'. server2 gives the following for server1: 'ERROR: 500 Server closed connection without sending any data back.'

I also have the following in /var/log/auth.log on server1: 'sshd[884]: Received request to connect to host localhost port 83, but the request was denied.' I had previously gotten that on server2 as well. It seemed to be fixed by adding 'ALL:localhost' to /etc/hosts.allow on server2, but doing the same on server1 had no effect. Neither server has any uncommented lines in hosts.deny.
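For reference, the TCP wrappers workaround described above would look something like this (assuming the 'ALL:localhost' line was added to /etc/hosts.allow, which is where hosts_access(5) rules for allowing connections go):

```shell
# /etc/hosts.allow -- TCP wrappers access control (hosts_access(5) format)
# Allow all wrapped services for connections from localhost:
ALL: localhost

# /etc/hosts.deny should then contain no uncommented lines,
# otherwise a deny rule there could still block the connection.
```

Whether this entry actually resolves the port-83 denial on server1 is exactly what is in question here; on server2 it appeared to work.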

Once I get the cluster set up, can I change which Ethernet port is used for cluster traffic by editing /etc/pve/cluster.cfg, so that it uses the private network?

If I can't get the cluster working easily, what would I need to do to have the VM information on both servers, and to move VMs from one server to the other, similar to live migration?
 
Hi,
for a cluster you need the same version on both nodes (1.3 or 1.4).

Even in a cluster, the configs (and the disk files) exist only on one node. But you can copy the configs yourself (for backup purposes) and use shared storage (NFS, iSCSI, FC RAID) for the disk files (this requires PVE 1.4).
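A sketch of the manual config copy mentioned here, assuming the PVE 1.x layout (KVM guest configs under /etc/qemu-server, OpenVZ container configs under /etc/vz/conf; the hostname and backup directory are placeholders):

```shell
# Back up guest configs from this node to the other one.
# Paths assume the PVE 1.x layout; server2 and the destination
# directory are placeholders -- adjust to your setup.
ssh root@server2 "mkdir -p /root/pve-config-backup/qemu-server /root/pve-config-backup/vz"
scp /etc/qemu-server/*.conf root@server2:/root/pve-config-backup/qemu-server/
scp /etc/vz/conf/*.conf root@server2:/root/pve-config-backup/vz/
```

This only backs up the config files, not the disk images; for the disk files the shared storage Udo describes is what makes them visible to both nodes.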

Udo
 
That's a shame. I was hoping to migrate the VMs to the newer machine and then upgrade the older one.

Fortunately, manual migration looks easier than I thought. I stumbled across vzmigrate (and its companion, qmigrate).
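For anyone finding this thread later, the manual migrations look roughly like this (the vzmigrate flags follow the OpenVZ man page; the qmigrate argument order is assumed from PVE 1.x, so check 'qmigrate --help' before relying on it; hostname and IDs are placeholders):

```shell
# Live-migrate OpenVZ container 101 to server2
# (--online keeps the container running during the move)
vzmigrate --online server2 101

# Migrate KVM guest 102 to server2 with the PVE 1.x qmigrate tool
# (argument order assumed -- verify with 'qmigrate --help')
qmigrate --online server2 102
```

Both tools copy the guest over SSH, so working root SSH access between the nodes is needed first.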