Recent content by Progratron

  1. Windows VMs bluescreening with Proxmox 6.1

    So just a short update: I managed to load an old kernel and am now running 5.0.21-5-pve. It has no problems. The current kernel, 5.3.18-2-pve, produces continuous crashes and blue screens for me, with all possible types of messages, under WS2019 on two different nodes. Different CPUs, different systems...
  2. Windows VMs bluescreening with Proxmox 6.1

    Experiencing exactly the same on all WS2019 machines on 2 different nodes since yesterday's update :( Does anybody have a clue how to load the previous kernel on a headless machine?
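    On a headless box with GRUB, a previous kernel can be selected without console access. A sketch, assuming the standard Debian/PVE GRUB setup (the exact menu entry name varies per system and must be taken from your own grub.cfg):

    ```shell
    # List the boot entries GRUB currently knows about
    awk -F\' '/menuentry |submenu / {print $2}' /boot/grub/grub.cfg

    # Boot an older kernel once on the next reboot (requires
    # GRUB_DEFAULT=saved in /etc/default/grub, followed by update-grub).
    # The entry name below is an example; use one printed above.
    grub-reboot 'Advanced options for Proxmox VE GNU/Linux>Proxmox VE GNU/Linux, with Linux 5.0.21-5-pve'
    reboot
    ```

    grub-reboot only affects the next boot, so a wrong entry name won't strand the machine permanently; once the old kernel proves stable, the choice can be made permanent via GRUB_DEFAULT.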
  3. Proxmox VE 6.1 released!

    Buy a subscription :) Or delete this: /etc/apt/sources.list.d/pve-enterprise.list And add this: deb http://download.proxmox.com/debian/pve buster pve-no-subscription to /etc/apt/sources.list
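    As a sketch, the repository switch described above comes down to the following (run as root on PVE 6.x / Debian buster; the repo line is the standard Proxmox no-subscription one, worth double-checking against the Proxmox wiki):

    ```shell
    # Remove the enterprise repo (it returns 401 without a subscription)
    rm /etc/apt/sources.list.d/pve-enterprise.list

    # Add the no-subscription repo and refresh the package index
    echo "deb http://download.proxmox.com/debian/pve buster pve-no-subscription" >> /etc/apt/sources.list
    apt update
    ```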
  4. [SOLVED] PVE 6.0/corosync over WAN (high latency) - loses sync

    Can confirm it is working stably again now :) Thanks for your assistance. And yes, a cluster over WAN is not recommended, but it works :)
  5. [SOLVED] PVE 6.0/corosync over WAN (high latency) - loses sync

    Done. Running libknet1: 1.13-pve1 now. Will monitor and report back.
  6. [SOLVED] PVE 6.0/corosync over WAN (high latency) - loses sync

    Sure. Sorry. See below. By the way, when I saw this package list I remembered that yesterday, while trying to solve the problem, I also updated the nodes with apt in addition to the above-mentioned change in corosync.conf. To be sure I am not giving you a log where something has already been fixed by the updates...
  7. [SOLVED] PVE 6.0/corosync over WAN (high latency) - loses sync

    pve-manager/6.0-4/2a719255 (running kernel: 5.0.15-1-pve)
  8. [SOLVED] PVE 6.0/corosync over WAN (high latency) - loses sync

    Thanks for your attention. Meanwhile, I've altered corosync.conf and increased the token value to 10000 as advised here. This change definitely didn't solve the problem completely (it still crashes), but subjectively it seems to happen less often now (I might be wrong about this). I tried to...
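    For reference, the token change mentioned above sits in the totem section of corosync.conf (on PVE, edit /etc/pve/corosync.conf and bump config_version so the change replicates to all nodes). A minimal sketch with a hypothetical cluster name:

    ```
    totem {
      version: 2
      cluster_name: mycluster
      # token timeout in milliseconds; the default is 1000,
      # raised here to tolerate high-latency WAN links
      token: 10000
    }
    ```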
  9. [SOLVED] PVE 6.0/corosync over WAN (high latency) - loses sync

    I am running a PVE cluster over WAN (different data centers across the globe). It always worked flawlessly and suited my needs best (of course no shared storage, live migration, or HA, but still central management, easy offline migrations, etc.). Some time ago I upgraded to PVE 6.0 and was able to run the...
  10. Cluster over high latency WAN?

    The same thing has been happening to me since the upgrade to PVE 6. Sometimes (though not as often as you describe) I find my cluster "broken" due to corosync issues. What I do on "disconnected" nodes is simply: killall -9 corosync systemctl restart pve-cluster Then it works flawlessly. It seems that...
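    The recovery sequence above, formatted and with a verification step appended (a sketch; run as root on the affected node):

    ```shell
    # Force-kill the stuck corosync process
    killall -9 corosync

    # Restarting pve-cluster brings corosync back up as a dependency
    # and remounts the /etc/pve cluster filesystem
    systemctl restart pve-cluster

    # Verify the node rejoined and the cluster is quorate
    pvecm status
    ```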
  11. Separate backup node – Storage type/system considerations

    Yes, I know those approaches. I've also used Borgbackup quite often for other purposes. My thought was that maybe there is something in the GUI or some "standard" way ;) But okay, I'll just try it with dockerized Samba. For large files, which VM backups are, SMB is fast enough even over WAN.
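    Once an SMB share exists on the backup node, it can be attached cluster-wide as a CIFS backup storage via the pvesm CLI. A sketch with hypothetical server, share, and credential values:

    ```shell
    # Register the SMB share as a PVE storage usable for vzdump backups
    pvesm add cifs backup-smb \
        --server backup.example.com \
        --share vmbackups \
        --username pve \
        --password 'secret' \
        --content backup
    ```

    After that, the storage shows up in the GUI and can be selected as the target in backup jobs like any other storage.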
  12. Separate backup node – Storage type/system considerations

    They are rented servers in different data centers, several in the EU, a few in the USA, and some in other countries. Not at all. On average we are at about 25-30ms, but there are also a few links where we're already at 100ms. I never had problems with the cluster until now...
  13. Separate backup node – Storage type/system considerations

    Hello, I am using a PVE cluster, but without shared storage, since the nodes are located in different data centers around the world. Mostly the cluster serves as a central management solution and a useful GUI tool for non-live VM migrations. I am planning to install a...
  14. Separate backup node - Storage system considerations

    Hello, I am running a PVE cluster, but without any shared storage yet, since the nodes are located in different data centers around the globe. Mostly the cluster serves as a central management solution and a useful GUI tool for non-live VM migrations. I am planning to install a separate node just...

