So I have been messing with SDN, trying to get VMs on different nodes to communicate with each other. After a bunch of failures I got it working. I basically wanted a locked-down subnet internal to Proxmox. This is the last issue I need to solve: both my nodes are dual-NICed, one NIC for management (vmbr0) and one for VM comms at 2.5 Gb/s (vmbr1). No matter what IPs I use as peers, or even if I use an OSPF fabric, the traffic still goes over vmbr0. Any suggestions? Comms work between the VMs; I just want that traffic to go via vmbr1 on both nodes (which is what the peers are set to in SDN).
Bash:
cat /etc/network/interfaces.d/sdn
#version:30

auto noint
iface noint
        bridge_ports vxlan_noint
        bridge_stp off
        bridge_fd 0
        mtu 1450

auto vxlan_noint
iface vxlan_noint
        vxlan-id 2000
        vxlan_remoteip 192.168.9.49
        mtu 1450
cat /etc/pve/sdn/*.cfg
evpn: evpnctrl
        asn 65000
        peers 192.168.9.49, 192.168.9.50

subnet: vxnoint-10.50.0.1-24
        vnet noint
        dhcp-range start-address=10.50.0.50,end-address=10.50.0.100
        gateway 10.50.0.1

vnet: noint
        zone vxnoint
        tag 2000

vxlan: vxnoint
        ipam pve
        peers 192.168.9.50, 192.168.9.49
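For context on why vmbr0 carries the traffic: the kernel encapsulates VXLAN frames and then routes them to the peer VTEP address like any other unicast packet, so with peers of 192.168.9.49/.50 (which look like management-network addresses) the encapsulated traffic will egress whichever interface holds the route to 192.168.9.0/24, i.e. vmbr0. A minimal sketch of pinning the underlay to vmbr1 instead, assuming a hypothetical 10.10.10.0/24 subnet on vmbr1 (the addresses below are assumptions, not from the config above):

Bash:
# /etc/network/interfaces (per node; the other node would use .50)
# Hypothetical addressing: vmbr1 gets its own underlay subnet.
auto vmbr1
iface vmbr1 inet static
        address 10.10.10.49/24

# /etc/pve/sdn/*.cfg — zone peers pointed at the vmbr1 subnet,
# so the route lookup for the VTEP lands on vmbr1.
vxlan: vxnoint
        ipam pve
        peers 10.10.10.49, 10.10.10.50

Verifying with `ip route get 10.10.10.50` (or watching the 2.5G link counters) would confirm which interface the encapsulated traffic actually uses.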