Occasional reboot issue

tutomasz

Member
Dec 4, 2020
Hello,
I read a lot before deciding to post a request for help here...
I'll try to keep it simple and chronological.

SPEC: just 1 node with 2 VMs - Windows Server 2016 + Debian Buster 10.7
HP ProLiant DL380p Gen8
CPU - E5-2667 x2
KERNEL - 5.4.73-1 (16.11.2020)
PVE - 6.3-2
UPDATES - disabled

So it all started in December 2020: after two weeks of testing I decided to move the environment to the new machine, my first time on Proxmox (the homelab doesn't count).
(During the tests I checked the machine thoroughly, upgraded everything and changed some BIOS settings, e.g. the watchdogs, which like to reboot Proxmox in particular - see the quick check below.)
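(For reference, this is roughly how I would double-check that no hardware watchdog driver is still armed after the BIOS changes - just a sketch, hpwdt/ipmi_watchdog are the usual suspects on ProLiant boxes, your system may differ:)

# check whether a hardware watchdog driver is loaded (hpwdt is the HP iLO/ASR one)
lsmod | grep -E 'hpwdt|ipmi_watchdog|softdog'
# see what the kernel reported about watchdog devices at boot
dmesg | grep -i watchdog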

The first problem occurred 4 weeks after the START - about 30 days of uptime, running like a charm, with nothing in the logs except the hourly cron jobs.
The machine just rebooted by itself (11.01.2021 23:50).
I tried to understand what was going on, but the logs were "clean" - maybe RAM? I decided to do a weekly reboot after the backup... OK.

The machine was again running super smoothly and I had almost forgotten about the last event, but... yesterday (the day of the weekly backup/reboot)
the machine just rebooted by itself again (13.02.2021 19:46).

The logs look the same as last time, but it can't be a coincidence, so I'm asking for help.
I KNOW there are messages in the logs about communicating with the TPM (I need to disable Secure Boot), but that shouldn't cause anything like this...
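(Just to rule that out, a quick check of whether the OS even sees a TPM device - a sketch only:)

# is any TPM character device exposed by the kernel?
ls -l /dev/tpm* 2>/dev/null
# what the kernel says about TPM / IMA at boot
dmesg | grep -iE 'tpm|ima'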

It would be great if someone more experienced could look at the logs. I'm attaching everything important (I hope).
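(This is roughly how the entries around the reboots can be pulled out of the journal, assuming journald keeps logs across boots on this host:)

# list the boots the journal knows about (offset 0 = current, -1 = previous)
journalctl --list-boots
# jump to the end of the previous boot, right before the unexpected reboot
journalctl -b -1 -e
# only warnings and worse around the event
journalctl -b -1 -p warning --since "2021-02-13 19:00"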

Greetings
Tomasz


-- Logs begin at Mon 2021-01-11 23:49:21 CET, end at Sun 2021-02-14 09:58:00 CET. --
Feb 14 09:27:46 pve pveproxy[1638]: got inotify poll request in wrong process - disabling inotify
Feb 13 19:47:03 pve iscsid[1086]: iSCSI daemon with pid=1087 started!
Feb 13 19:47:01 pve smartd[974]: In the system's table of devices NO devices found to scan
Feb 13 19:47:00 pve kernel: ima: Error Communicating to TPM chip
Feb 13 19:47:00 pve kernel: ima: Error Communicating to TPM chip
Feb 13 19:47:00 pve kernel: ima: Error Communicating to TPM chip
Feb 13 19:47:00 pve kernel: ima: Error Communicating to TPM chip
Feb 13 19:47:00 pve kernel: ima: Error Communicating to TPM chip
Feb 13 19:47:00 pve kernel: ima: Error Communicating to TPM chip
Feb 13 19:47:00 pve kernel: ima: Error Communicating to TPM chip
Feb 13 19:47:00 pve kernel: ima: Error Communicating to TPM chip
Feb 13 19:46:59 pve kernel: [Firmware Bug]: the BIOS has corrupted hw-PMU resources (MSR 38d is 330)
Feb 13 19:46:59 pve kernel: ACPI: SPCR: Unexpected SPCR Access Width. Defaulting to byte size
-- Reboot --
Feb 13 14:09:14 pve pvedaemon[1324]: <root@pam> end task UPID:pve:00000806:00006A3A:6027CF7A:vncproxy:100:root@pam: Failed to run vncproxy.
Feb 13 14:09:14 pve pvedaemon[2054]: Failed to run vncproxy.
Feb 13 14:09:14 pve qm[2058]: VM 100 qmp command failed - VM 100 not running
Feb 13 14:04:50 pve iscsid[1076]: iSCSI daemon with pid=1077 started!
Feb 13 14:04:49 pve smartd[981]: In the system's table of devices NO devices found to scan
Feb 13 14:04:47 pve kernel: ima: Error Communicating to TPM chip
Feb 13 14:04:47 pve kernel: ima: Error Communicating to TPM chip
Feb 13 14:04:47 pve kernel: ima: Error Communicating to TPM chip
Feb 13 14:04:47 pve kernel: ima: Error Communicating to TPM chip
Feb 13 14:04:47 pve kernel: ima: Error Communicating to TPM chip
Feb 13 14:04:47 pve kernel: ima: Error Communicating to TPM chip
Feb 13 14:04:47 pve kernel: ima: Error Communicating to TPM chip
Feb 13 14:04:47 pve kernel: ima: Error Communicating to TPM chip
Feb 13 14:04:47 pve kernel: [Firmware Bug]: the BIOS has corrupted hw-PMU resources (MSR 38d is 330)
Feb 13 14:04:47 pve kernel: ACPI: SPCR: Unexpected SPCR Access Width. Defaulting to byte size
-- Reboot --
Feb 13 14:01:59 pve kernel: watchdog: watchdog0: watchdog did not stop!
Feb 13 12:43:54 pve pvestatd[1294]: VM 100 qmp command failed - VM 100 not running
Feb 13 12:14:33 pve pvedaemon[1316]: <root@pam> end task UPID:pve:000010F4:030682A6:6027B498:vncproxy:100:root@pam: Failed to run vncproxy.
Feb 13 12:14:33 pve pvedaemon[4340]: Failed to run vncproxy.
Feb 13 12:14:33 pve qm[4342]: VM 100 qmp command failed - VM 100 not running
Feb 13 12:14:32 pve pvedaemon[1316]: VM 100 qmp command failed - VM 100 not running
Feb 13 12:14:29 pve QEMU[3362]: kvm: terminating on signal 15 from pid 964 (/usr/sbin/qmeventd)
Feb 13 12:10:27 pve pvedaemon[1317]: <root@pam> end task UPID:pve:00000E42:030622B6:6027B3A2:vncproxy:111:root@pam: Failed to run vncproxy.
Feb 13 12:10:27 pve pvedaemon[3650]: Failed to run vncproxy.
Feb 13 12:10:27 pve qm[3657]: VM 111 qmp command failed - VM 111 not running
Feb 13 12:10:26 pve pvedaemon[1315]: VM 111 qmp command failed - unable to open monitor socket
Feb 13 12:10:25 pve QEMU[3445]: kvm: terminating on signal 15 from pid 964 (/usr/sbin/qmeventd)
Feb 07 15:14:57 pve iscsid[1069]: iSCSI daemon with pid=1070 started!
Feb 07 15:14:56 pve smartd[958]: In the system's table of devices NO devices found to scan
Feb 07 15:14:54 pve kernel: ima: Error Communicating to TPM chip
Feb 07 15:14:54 pve kernel: ima: Error Communicating to TPM chip
Feb 07 15:14:54 pve kernel: ima: Error Communicating to TPM chip
Feb 07 15:14:54 pve kernel: ima: Error Communicating to TPM chip
Feb 07 15:14:54 pve kernel: ima: Error Communicating to TPM chip
Feb 07 15:14:54 pve kernel: ima: Error Communicating to TPM chip
Feb 07 15:14:54 pve kernel: ima: Error Communicating to TPM chip
Feb 07 15:14:54 pve kernel: ima: Error Communicating to TPM chip
Feb 07 15:14:54 pve kernel: [Firmware Bug]: the BIOS has corrupted hw-PMU resources (MSR 38d is 330)
Feb 07 15:14:54 pve kernel: ACPI: SPCR: Unexpected SPCR Access Width. Defaulting to byte size
 
PVE - 6.3-2
UPDATE - disable

Please upgrade to the latest version. There have been some changes to qmeventd which I assume are relevant to your situation.
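Roughly like this (assuming the correct repository - enterprise or no-subscription - is already configured):

apt update
apt full-upgrade
# afterwards, verify the package versions (qmeventd is shipped by qemu-server)
pveversion -v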
 
