VMs not up after update

Maurice Vosmeijer

New Member
May 8, 2018
Hello,

We are running three 16-core nodes, all on Proxmox VE 5.1-41, installed from the installation CD with the ZFS filesystem on a RAID of three 1 TB SSD disks. This morning I saw that no VM on the first node was running. The only messages in the log at the bottom of the Proxmox interface are 'Start all VMs and containers' and a couple of 'Update package database' entries from the past few days.

log:
waiting for quorum
get quorum
TASK ok

I have started all the VMs manually and didn't see any issue when I did. So I wonder why they were not started, even though the UI log states that they should have been.

I'd like to figure out why this happened and, more importantly, how I might prevent it from happening in the future, or at least get a heads-up when it does.

Does anyone have an idea?
 
Upon further investigation I found that there is an option with which you can specify whether a VM should be started on boot. All the VMs on that node had this option set to 'no'. So most likely the server rebooted and the VMs were not started because they were configured not to be.

I have changed the option for all VMs that should be started automatically. Hopefully this will prevent the issue from recurring.
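In case it helps anyone else: the setting is 'Start at boot' under the VM's Options tab, and it can also be checked and changed per VM with the qm CLI on the node. Below is a rough Python sketch of how I could have scripted it instead of clicking through the UI; the VM IDs in the list are placeholders for your own, and it assumes it runs directly on the node where qm is available.

#!/usr/bin/env python3
# Sketch: enable 'Start at boot' (onboot) for a list of VMs on a Proxmox node.
import subprocess

# Placeholder VM IDs that should come back up automatically after a reboot.
vmids = [100, 101, 102]

for vmid in vmids:
    # Show the current onboot line from the VM config (nothing printed means it is unset, i.e. 'no').
    cfg = subprocess.run(["qm", "config", str(vmid)],
                         capture_output=True, text=True, check=True).stdout
    onboot = [line for line in cfg.splitlines() if line.startswith("onboot")]
    print(f"VM {vmid} before:", onboot or ["onboot not set"])

    # Flip the 'Start at boot' option to yes.
    subprocess.run(["qm", "set", str(vmid), "--onboot", "1"], check=True)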
 
