I installed grub-efi-amd64 on a test machine, made some changes in /etc/default/grub and rebooted the system. The system started normally without any issues. When installing grub-efi-amd64, the grub-pc package is removed. The system is a Dell R740XD, Intel Xeon Gold based.
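For reference, the rough sequence on Debian looks like this (a sketch from memory, adjust the kernel parameters to your needs):

apt install grub-efi-amd64     # pulls in the EFI loader, removes grub-pc
nano /etc/default/grub         # e.g. edit GRUB_CMDLINE_LINUX_DEFAULT
update-grub                    # regenerates /boot/grub/grub.cfg
grub-install                   # reinstalls to the EFI system partition
reboot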
@LnxBil: you are right, it depends on the container, and I guess it's the SQLite db, because I get a lot of knex timeouts in both setups after a while (the db is growing, we monitored more than 500 instances). Something happened within the connection pooling of the Node.js server application. Connection...
I checked the system load and the disk; that is not the problem, the bare metal is not heavily utilized, around 5-10 %.
And yes, sometimes it occurs while backups are running. The Docker container I ran was Uptime Kuma. The LXC container on the same bare metal does not have timeouts in Kuma. I know...
Hello,
The VM has twice the power, double the CPUs and double the amount of RAM, and I still get the timeouts. The LXC container with Docker works as expected without timeouts. So I believe upgrading the LXC container's resources will have no effect.
I checked the stats and will send them tomorrow.
Hi,
I added a bridge to use it as a switch. I use Cisco SFP28 cables with the MikroTik and the Mellanox cards. So I instantly get 25 Gbit links, also on the breakout cables.
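In case someone needs it, the extra bridge in /etc/network/interfaces looks roughly like this (the port names are just examples for the SFP28 ports, yours will differ):

auto vmbr1
iface vmbr1 inet manual
        bridge-ports enp65s0f0np0 enp65s0f1np1
        bridge-stp off
        bridge-fd 0
# with both SFP28 ports in the bridge the host forwards between them like a small switch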
Hello,
we run Docker in an LXC container and in a VM on a Proxmox 8.x.x three-node cluster with NVMe Ceph storage (24 NVMes) on Dell R740XD servers. Docker runs on the latest version of Debian Bookworm. Hypervisor nesting is activated for LXC and VM. We put our monitoring in a Docker container in the LXC and in the VM...
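For the LXC container we only had to enable the nesting feature so Docker can run inside it; a sketch with an example container ID:

pct set 101 --features nesting=1,keyctl=1   # 101 is just an example VMID
pct stop 101 && pct start 101               # restart so the features take effect
# then install Docker inside the container as usual from the Debian/Docker repos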
Hi Falk,
I got everything together. I connected the 100GbE breakout to one Mellanox ConnectX-4 Lx; MikroTik and Mellanox show the link as established (the Mellanox link LED lights green), but the activity LED of the Mellanox card blinks green cyclically and I get no traffic over the link. I see the card with...
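What I used to narrow it down, in case it helps (assuming the port shows up as enp65s0f0np0, the name will differ on your system):

ethtool enp65s0f0np0           # negotiated speed, duplex and link status
ethtool -m enp65s0f0np0        # reads the DAC/transceiver module EEPROM
ip -s link show enp65s0f0np0   # RX/TX and error counters
dmesg | grep -i mlx5           # driver messages from mlx5_core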
Perfect, thanks a lot. So I'll go with the MikroTik CRS510-8XS-2XQ-IN and the Mellanox ConnectX-4 SFP28 25GbE cards to reduce power consumption. I will use 2 passive 100GbE QSFP28 to 4x 25GbE SFP28 DAC breakout cables to connect the servers and 4 1GbE transceivers to connect legacy stuff on SFP28 ports. I...
Yes, of course RouterOS looks a bit different from the classic web UIs of switches, but it is OK to manage things and the CLI looks nice. So I think I will go with the MikroTik after all. The only open point is DAC breakout port configuration, for example bonding (LACP). So if you would please be so kind to...
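To make the question concrete, I assume LACP on the RouterOS side over two breakout ports would look something like this (the breakout port names for the CRS510 are a guess on my part, please correct me):

/interface bonding add name=bond1 mode=802.3ad slaves=qsfp28-1-1,qsfp28-1-2 transmit-hash-policy=layer-2-and-3
/interface bridge port add bridge=bridge1 interface=bond1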
So thanks for the hint with RouterOS and the VM. I bought Mellanox ConnectX-4 MXC422A cards (dual port SFP+ 25GbE) as daughterboards for the Dell servers. It was a decision of price: 15 EUR/card is better than 69 EUR for one Broadcom.
So now I am checking MikroTik RouterOS 7.1.2. fs.com is out of...
Yes, we have 3 Dell R740XD servers. Ceph is connected via 100GbE Mellanox ConnectX-6 dual port in a routed network without a switch; that works fine. All 3 nodes have quad port Intel X550 10GbE NICs as daughterboards: one bond for WAN (plugged into the switch) and one routed network for migration.
We have a...
Can you confirm that a Broadcom with the 57414 chipset works well? We decided to use SFP28 with a new MikroTik switch instead of using SFP+. Otherwise I would use the Mellanox ConnectX-5 CX512F dual port SFP28. The background is that our Netgear switch based on 10GbE Base-T consumes too much power (around...
Hello,
we plan to upgrade the network from 10GbE Base-T to 10/25GbE SFP+/SFP28. In the past I have always preferred Intel because they are stable and run out of the box. But the Dell R740XD server only offers Broadcom, QLogic and Intel as quad port daughterboards, and I read there are compatibility issues...
Hello,
just use the same config as described here https://pve.proxmox.com/wiki/Network_Configuration#_routed_configuration
Just change the interfaces and IPs to yours.
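As a minimal sketch (interface names and addresses are placeholders, same pattern as in the wiki):

auto eno1
iface eno1 inet static
        address 192.0.2.10/24
        gateway 192.0.2.1
        post-up echo 1 > /proc/sys/net/ipv4/ip_forward
        post-up echo 1 > /proc/sys/net/ipv4/conf/eno1/proxy_arp

auto vmbr0
iface vmbr0 inet static
        address 203.0.113.17/28
        bridge-ports none
        bridge-stp off
        bridge-fd 0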
Kind regards,
Daniel