PVE firewall blocks cluster communication

dqq

Hi,

I have a 2-node datacenter; the nodes have public IPs and are also connected to each other through a VRACK.

Datacenter:
Node 1: PRIVATE IP: 192.168.0.2, PUBLIC IP: 1.2.3.4
Node 2: PRIVATE IP: 192.168.0.3, PUBLIC IP: 2.3.4.5

When I enable the firewall cluster-wide, the following happens:
- I cannot reach the GUI of the 2nd node through the 1st node's web interface - I get "communication failure"

I have added firewall rules on both machines accepting IN/OUT traffic for the 192.168.0.0/28 subnet, with no success.
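For illustration, the kind of rules I mean in /etc/pve/firewall/cluster.fw would look roughly like this (a sketch, not my exact config):
Code:
[RULES]
IN ACCEPT -source 192.168.0.0/28 -log nolog
OUT ACCEPT -dest 192.168.0.0/28 -log nolog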

One thing I noticed: when I run pve-firewall localnet in an SSH terminal on both nodes, I get:
Code:
local hostname: hostname_1
local IP address: 1.2.3.4
network auto detect: PUBLIC NETWORK/24
using user defined local_network: 192.168.0.0

accepting corosync traffic from/to:
-  hostname_2: 192.168.0.3 (link: 0)

So the machines detect the public IP address as the local one. How can I fix this?
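For context, an explicit local_network entry in the [OPTIONS] section of /etc/pve/firewall/cluster.fw would look roughly like this (the /28 mask here is an assumption on my part):
Code:
[OPTIONS]
enable: 1
local_network: 192.168.0.0/28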


Additionally, the quorum reports "OK" despite the connection problems between the nodes.
This seems to happen when I set the firewall INPUT policy to DROP at the cluster level.

How can I fix this?
 

Attachments

  • conn.png
Edit:

It looks like the machines are trying to communicate with each other through their public IPs (according to the firewall logs). Shouldn't they communicate over the VRACK? How can I fix this?
 
Hi,
check your /etc/hosts. Does the nodename resolve to the private IP?
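As a rough example of what /etc/hosts should contain on each node (the domain is just a placeholder):
Code:
127.0.0.1 localhost.localdomain localhost
192.168.0.2 hostname_1.yourdomain hostname_1
192.168.0.3 hostname_2.yourdomain hostname_2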
 
It was the public IP; I have changed it, but there is no effect - is a reboot needed?
 
No, a reboot should not be necessary. You probably need to SSH from one node to the other and vice versa in order to accept the fingerprints.
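For example, something like this (the hostnames are just the ones from your localnet output):
Code:
# on node 1
ssh root@hostname_2
# on node 2
ssh root@hostname_1
Confirm the host key fingerprint when prompted.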