Migration problems when updating, quick tip

BloodyIron

Renowned Member
Jan 14, 2013
So one of the clusters I work with has been in operation for going on 3 years now. It's a 2-node cluster that was recently expanded to 4 nodes, but the added 2 nodes are for lab work. The first 2 nodes have 2 votes each, so nodes 3 and 4 can be turned off when lab work is not being done.
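For reference, the two-votes-per-node setup is configured in the `nodelist` section of /etc/pve/corosync.conf. A sketch of how that might look (node addresses and IDs here are placeholders, not my actual config) is:

```
nodelist {
  node {
    nodeid: 1
    quorum_votes: 2      # first two nodes carry two votes each
    ring0_addr: 10.0.0.1
  }
  node {
    nodeid: 2
    quorum_votes: 2
    ring0_addr: 10.0.0.2
  }
  # nodes 3 and 4 keep the default of 1 vote each
}
```

With 6 total votes, the two lab nodes can be powered off and the remaining 4 votes still hold quorum.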

Now, in the past I've migrated VMs from node 2 to node 1 (or in either order), upgraded the now-empty host, rebooted it, then migrated all the VMs onto the updated host. Then I would update the other host, now empty (no VMs running), reboot it after the updates, and move back the VMs that normally run on it.

Except, today I had a bit of a hiccup, but found the solution.

On Node 1, I migrated all the VMs off, applied updates (it was only a few weeks out of date), and rebooted it.

Then on Node 2, I tried to migrate the VMs back to Node 1, but they kept failing for no apparent reason. The solution was to apply updates to Node 2 while it was running, without rebooting, and then live-migrate the VMs off. Once Node 2 was empty, I rebooted it.

This did the trick: after updating Node 2 (no reboot needed), the live migrations succeeded.
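To make the working order explicit, here is a hypothetical helper that just prints the plan for one node: update the source node first (no reboot), live-migrate each VM off, and only then reboot. The VM IDs, the target node name, and the plain apt-get upgrade step are assumptions for illustration; it echoes the commands rather than running them.

```shell
#!/bin/sh
# Print the update-then-migrate plan for a node.
# Arguments: target node to migrate to, then the VM IDs to move off.
plan_node_update() {
    target="$1"; shift
    echo "apt-get update"
    echo "apt-get -y dist-upgrade"              # update the source node first, no reboot
    for vmid in "$@"; do
        echo "qm migrate $vmid $target --online"  # live migration per VM
    done
    echo "reboot"                               # reboot only once the node is empty
}

# Example: plan emptying Node 2 toward the freshly updated Node 1
plan_node_update node1 100 101 102
```

Running it prints the four-step sequence in order, which is exactly the ordering that avoided the failed migrations for me.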

This is Proxmox VE 4.3, current as of this writing.
 
