Hi,
We want to build a 5-node Proxmox VE cluster with Ceph storage. For this we are considering the Supermicro SuperServer 1029U-TN10RT, equipped with:
- 2 x Intel Xeon Gold 6130 (Skylake-SP, 16C/32T, 2.1 GHz, 22 MB cache, 10.4 GT/s UPI)
- 128 GB RAM
- 4 x 1 TB Intel DC P4510 NVMe (PCIe 3.0, 3D TLC, 2.5")
- 2 x Mellanox ConnectX-4 EN dual-port 100 Gbps controllers
- 2 x Intel X540 dual-port 10GBase-T adapters
All of these will be connected to switches with 25 Gbps ports, using 100 Gb-to-4x25 Gb breakout cables. The bandwidth will be allocated as follows:
10 Gbps - Public Internet
10 Gbps - Migration VLAN, intranet, Corosync ring 3
75 Gbps - Ceph private VLAN
5 Gbps - Corosync ring 1
75 Gbps - Ceph public VLAN
5 Gbps - Management, Corosync ring 2
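To make the intended split concrete, here is a rough sketch of how it might look in /etc/network/interfaces on one node. This is only an illustration of the idea, not a tested configuration: the interface names (enp59s0f0, enp94s0f0), VLAN IDs, and addresses are all hypothetical, and it assumes the two 100 Gbps ports are bonded with LACP and the VLANs ride on top of the bond.

```
# Sketch only -- interface names, VLAN IDs, and addresses are examples.
auto bond0
iface bond0 inet manual
    bond-slaves enp59s0f0 enp94s0f0    # one port from each ConnectX-4 for card-level redundancy
    bond-mode 802.3ad                  # LACP; the switch ports must be configured to match
    bond-xmit-hash-policy layer3+4

# Ceph public VLAN (example VLAN 100)
auto bond0.100
iface bond0.100 inet static
    address 10.10.100.11/24

# Ceph private/cluster VLAN (example VLAN 101)
auto bond0.101
iface bond0.101 inet static
    address 10.10.101.11/24
```

The Corosync rings and the migration/management networks would get their own VLAN subinterfaces in the same way, ideally spread across physically separate links so that a single NIC failure cannot take down all rings at once.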
Do you have good experience with the drivers for the Mellanox ConnectX-4 EN dual-port 100 Gbps controller and the Intel X540 dual-port 10GBase-T adapter?
Do you think this setup is suitable and reliable for a hyper-converged cluster?
Do you have any other suggestions?
Thank you,
Rares