Communication Failure After Changing Cluster Network IPs

Maël Jeuffrard

New Member
Mar 26, 2018
Hi,

First of all, I'm sorry, my English is not very good and it's not easy for me to describe my problem.

I have one Proxmox VE cluster with 4 nodes.
Cluster communication runs on a dedicated network on vmbr0, and multicast is enabled.
Recently I changed the IPs of vmbr0 on all nodes. I also edited the hosts file and corosync.conf (the nodelist entries, roughly as sketched below).
Quorum is OK and VM migrations work fine.
The problem is that I can't see some options of a VM when it is on another node.
Everything only works when I am connected to node4. When I am connected to another node (node[1-3]), I only see the options of that node's own guests.
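The corosync.conf nodelist entries I edited look roughly like this; the node names and IPs here are only placeholders, not my real values:

Code:
nodelist {
  node {
    name: node1
    nodeid: 1
    quorum_votes: 1
    ring0_addr: 192.168.10.11
  }
  node {
    name: node2
    nodeid: 2
    quorum_votes: 1
    ring0_addr: 192.168.10.12
  }
  # node3 and node4 follow the same pattern with their new IPs
}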

The page then runs into a timeout with "proxmox communication failure (0)".

Thanks a lot for your help. Of course, if you need more information, I will send it.

Regards,
Maël
 
Hi,

please check whether the SSH connection works with the following command.

Code:
ssh -o "HostKeyAlias=<remote nodename>" root@<remote node ip>
 
Can you please send the output of the following commands?

Code:
cat /etc/hosts
cat /etc/hostname
cat /etc/network/interfaces
cat /etc/pve/corosync.conf
cat /etc/pve/.members
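
In particular, /etc/pve/.members should already show the new address for every node. As a rough idea of what that file usually looks like (the cluster name, node names and IPs below are only placeholders):

Code:
{
"nodename": "node1",
"version": 19,
"cluster": { "name": "mycluster", "version": 4, "nodes": 4, "quorate": 1 },
"nodelist": {
  "node1": { "id": 1, "online": 1, "ip": "192.168.10.11"},
  "node2": { "id": 2, "online": 1, "ip": "192.168.10.12"},
  "node3": { "id": 3, "online": 1, "ip": "192.168.10.13"},
  "node4": { "id": 4, "online": 1, "ip": "192.168.10.14"}
  }
}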
 
Your config looks OK.
Try restarting the following services on all nodes.

Code:
systemctl restart corosync.service
systemctl restart pve-cluster.service
systemctl restart pvedaemon.service
systemctl restart pveproxy.service
systemctl restart pvestatd.service
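
Afterwards you can verify that the cluster is quorate again and look for errors in the logs, for example (just a quick sanity check):

Code:
# show quorum and membership information
pvecm status
# recent log entries from corosync and the cluster filesystem
journalctl -u corosync -u pve-cluster --since "10 minutes ago"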
 
No, these are only the management services.
In the worst case you temporarily cannot manage the VMs.
If you have HA enabled, please deactivate it first.
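
To check and deactivate HA you can use ha-manager, roughly like this (vm:100 is just an example resource ID):

Code:
# list HA resources and their current state
ha-manager status
# stop HA management for a single resource
ha-manager set vm:100 --state disabled

Alternatively, "ha-manager remove vm:100" takes the resource out of HA completely.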