Search results

  1. Benchmark: 3 node AMD EPYC 7742 64-Core, 512G RAM, 3x3 6,4TB Micron 9300 MAX NVMe

    Phew, just checked the other two. Luckily they show "NUMA node(s): 1" as well...
  2. Benchmark: 3 node AMD EPYC 7742 64-Core, 512G RAM, 3x3 6,4TB Micron 9300 MAX NVMe

    Hi Alwin, the attachment to the initial post contains the output of lscpu showing "NUMA node(s): 1". qperf was indeed new to me - will have a look at it.
  3. Benchmark: 3 node AMD EPYC 7742 64-Core, 512G RAM, 3x3 6,4TB Micron 9300 MAX NVMe

    Took some time, but found the difference between the hosts: on the first two, some /etc/sysctl.d/ parameter files had been left behind from previous tests...
  4. Benchmark: 3 node AMD EPYC 7742 64-Core, 512G RAM, 3x3 6,4TB Micron 9300 MAX NVMe

    Result from the overnight run: proxmox05 seems flaky, proxmox06 seems to use less CPU than the rest. We need to check for configuration mismatches; it seems the nodes still have differences in their configuration.
  5. Benchmark: 3 node AMD EPYC 7742 64-Core, 512G RAM, 3x3 6,4TB Micron 9300 MAX NVMe

    Adjusted the network settings according to the AMD Network Tuning Guide. We did not use NUMA adjustments as we only have single-socket CPUs. So this is a Zabbix screen over the three nodes. Details over the time ranges: - From 07:00 - 10:25: Performance test running over the weekend. proxmox04...
  6. Benchmark: 3 node AMD EPYC 7742 64-Core, 512G RAM, 3x3 6,4TB Micron 9300 MAX NVMe

    First test is to see if the network configuration is in order. 100 GBit is 4 network streams combined, so at least 4 processes are required to test for the maximum. root@proxmox04:~# iperf -c 10.33.0.15 -P 4 ------------------------------------------------------------ Client connecting to...
  7. Benchmark: 3 node AMD EPYC 7742 64-Core, 512G RAM, 3x3 6,4TB Micron 9300 MAX NVMe

    Hi everybody, we are currently in the process of replacing our VMware ESXi NFS NetApp setup with a Proxmox Ceph configuration. We purchased 8 nodes with the following configuration: - ThomasKrenn 1HE AMD Single-CPU RA1112 - AMD EPYC 7742 (2,25 GHz, 64-Core, 256 MB) - 512 GB RAM - 2x 240GB SATA...
  8. Proxmox VE Ceph Benchmark 2020/09 - hyper-converged with NVMe

    Thanks for this second benchmark! It gives a clear impression of what should be achievable with current hardware. I am currently trying to run that exact benchmark setup on a three node cluster and have problems running three rados bench clients simultaneously. Can you adjust the PDF and give...
  9. [SOLVED] Ceph Module Zabbix "failed to send data..."

    I'm chiming in here because I had the same error after upgrading from 5.4 to 6.1. The configuration had already worked before the upgrade; after the upgrade "Failed to send data to Zabbix" appeared and the Ceph status was then on Warning. The problem in our case was the still missing...
  10. Failing to migrate from vSphere to Proxmox

    Maybe read this: https://forum.proxmox.com/threads/converting-a-windows-vm-from-vmware-nfs-on-netapp-to-proxmox-ceph-with-minimal-downtime.51194/#post-237757
  11. fstrim for Windows Guests and Ceph Storage

    Hi, after reading your posts I reconfigured my Windows 7 and Server 2008r2 systems to use SATA on Ceph RBD storage. ... bootdisk: sata0 ostype: win7 sata0: ceph-proxmox-VMs:vm-106-disk-0,cache=writeback,discard=on,size=30G scsihw: virtio-scsi-pci ... Using...
  12. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    So the next conversion with another VM just went smoothly with only two reboots. Steps: Create a new VM on Proxmox; create an IDE raw hard disk on the NetApp NFS storage, the same size as on the to-be-converted machine; create one small SCSI qcow2 disk to be able to properly install the VirtIO-SCSI drivers...
  13. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    By setting the cache on the hard disk from "Default (no cache)" to "Write through", the live migration works now... WTF?!?! So, cool!!! Thanks Udo for the excellent hint! I will now try yet another VM to check if I can reproduce the minimized-downtime approach.
  14. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    Hi Udo, cool suggestion! Tried it... So the filesystem is actually accessible within the VM - but the disk move now bails out with a different error: Online Migration: create full clone of drive scsi1 (netapp03-DS1:111/vm-111-disk-0.raw) drive mirror is starting for drive-scsi1 drive-scsi1...
  15. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    Seems like the issue with the non-working offline copy is the creation of the destination raw file with the wrong size. From the GUI: Virtual Environment 5.3-8 Storage 'ceph-proxmox-VMs' on node 'proxmox01' Search: Logs () create full clone of drive scsi0 (netapp03-DS1:111/vm-111-disk-0.vmdk)...
  16. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    Hi, moving it offline to the final destination also works: root@proxmox01:/var/lib/vz/images# /usr/bin/qemu-img convert -p -f vmdk -O raw /mnt/pve/netapp03-DS1/images/111/vm-111-disk-0.vmdk...
  17. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    Hi, thanks for getting back so quickly! Converting onto the same storage: Virtual Environment 5.3-8 Virtual Machine 111 (win2008r2) on node 'proxmox01' Logs scsi0 create full clone of drive scsi0 (netapp03-DS1:111/vm-111-disk-0.vmdk) Formatting...
  18. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    Hi, tried to move a disk just now with the current Proxmox (5.3-8). Online: Virtual Environment 5.3-8 Virtual Machine 111 (win2008r2) on node 'proxmox01' Logs create full clone of drive ide0 (netapp03-DS1:111/vm-111-disk-0.vmdk) drive mirror is starting for drive-ide0 drive-ide0: transferred...
  19. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    Hi virtRoo, after starting this thread yesterday I upgraded to 5.3-8. I will try another VM today... I understand that I am moving from one virtualisation platform to another, so I have no problem with a short downtime. But most of my live VMs are rather big and I need some best-practice...
  20. [SOLVED] Converting a Windows VM from VMware (NFS on NetApp) to Proxmox (Ceph) with minimal downtime

    Hi, we built a three-node Proxmox cluster with Ceph as the storage backend for our test VMs. They currently reside on a two-node VMware cluster with an old NetApp as storage backend. The plan is to try to migrate/convert - as preparation for a possible migration of our productive VMs - with minimal...
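Results 3 and 4 above trace flaky benchmark numbers back to configuration drift between the nodes, such as leftover /etc/sysctl.d/ parameter files. A minimal sketch for spotting that kind of drift, assuming each node's effective sysctl state has been dumped to a sorted "key = value" file first (the `sysctl_drift` helper name is ours, not from the thread):

```shell
# List the sysctl keys whose values differ between two nodes.
# Each argument is a sorted dump produced with: sysctl -a | sort
sysctl_drift() {
    # comm -3 keeps only lines unique to either file; sed strips the tab
    # that comm prefixes to lines from the second file, and awk reduces
    # the remaining "key = value" lines to the bare key name.
    comm -3 "$1" "$2" | sed 's/^\t//' | awk -F' = ' '{print $1}' | sort -u
}
```

Collect one dump per node, e.g. `ssh proxmox05 'sysctl -a | sort' > proxmox05.sysctl`, then compare each node against a reference node; every key printed is a candidate for the kind of mismatch found in result 3.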

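Result 15 attributes the failing offline copy to a destination raw file created with the wrong size. Before booting a converted VM, it can be worth verifying that source and destination images match in size; a minimal sketch for raw files (the `same_size` helper name is ours - for vmdk sources you would compare the virtual size reported by `qemu-img info` instead, since the file size of a vmdk does not reflect the disk's virtual size):

```shell
# Return success only if the two files have identical sizes in bytes.
# Only meaningful for raw images; sparse or compressed formats such as
# vmdk and qcow2 need `qemu-img info` to obtain the virtual size.
same_size() {
    [ "$(stat -c %s "$1")" -eq "$(stat -c %s "$2")" ]
}
```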