Hello,
I'm currently configuring a new Proxmox hypervisor with the following setup:
2x 1TB NVMe drives in RAID1 ZFS
8x 8TB NVMe drives in RAID10 ZFS
The 8TB NVMe drives we are using:
Intel DC P4510 Series
In dmesg I see the following lines:
Code:
[Wed Apr 19 16:42:44 2023] nvme nvme0: I/O 321 QID 21 timeout, completion polled
[Wed Apr 19 16:43:25 2023] nvme nvme3: I/O 64 QID 29 timeout, completion polled
[Wed Apr 19 16:44:08 2023] nvme nvme1: I/O 832 QID 19 timeout, completion polled
[Wed Apr 19 16:45:36 2023] nvme nvme8: I/O 577 QID 14 timeout, completion polled
[Wed Apr 19 16:47:16 2023] nvme nvme2: I/O 960 QID 28 timeout, completion polled
[Wed Apr 19 16:47:47 2023] nvme nvme0: I/O 576 QID 7 timeout, completion polled
[Wed Apr 19 16:48:32 2023] nvme nvme8: I/O 512 QID 23 timeout, completion polled
[Wed Apr 19 16:49:03 2023] nvme nvme6: I/O 576 QID 23 timeout, completion polled
[Wed Apr 19 16:49:42 2023] nvme nvme0: I/O 576 QID 17 timeout, completion polled
[Wed Apr 19 16:50:12 2023] nvme nvme1: I/O 320 QID 2 timeout, completion polled
[Wed Apr 19 16:50:12 2023] nvme nvme0: I/O 576 QID 16 timeout, completion polled
[Wed Apr 19 16:50:43 2023] nvme nvme2: I/O 449 QID 24 timeout, completion polled
[Wed Apr 19 17:00:37 2023] nvme nvme8: I/O 512 QID 6 timeout, completion polled
[Wed Apr 19 17:01:32 2023] nvme nvme9: I/O 896 QID 22 timeout, completion polled
[Wed Apr 19 17:03:04 2023] nvme nvme9: I/O 832 QID 6 timeout, completion polled
[Wed Apr 19 17:03:43 2023] nvme nvme0: I/O 577 QID 16 timeout, completion polled
[Wed Apr 19 17:04:24 2023] nvme nvme2: I/O 576 QID 1 timeout, completion polled
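To check whether the timeouts cluster on one drive or are spread evenly, the messages can be tallied per controller. A quick sketch (the dmesg excerpt is pasted inline here for illustration; on the host, piping `dmesg` straight into the same filter works too):

```shell
# Count "timeout, completion polled" messages per NVMe controller.
# $7 is the controller name (e.g. "nvme0:") in this message format.
grep 'timeout, completion polled' <<'EOF' | awk '{print $7}' | sort | uniq -c | sort -rn
[Wed Apr 19 16:42:44 2023] nvme nvme0: I/O 321 QID 21 timeout, completion polled
[Wed Apr 19 16:47:47 2023] nvme nvme0: I/O 576 QID 7 timeout, completion polled
[Wed Apr 19 16:43:25 2023] nvme nvme3: I/O 64 QID 29 timeout, completion polled
EOF
```

In the full log above, the hits look spread across nearly all controllers rather than concentrated on a single disk, which would point away from one failing drive.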
PVEversion:
Code:
pve-manager/7.4-3/9002ab8a (running kernel: 5.19.17-2-pve)
Kernel:
Code:
Linux 5.19.17-2-pve #1 SMP PREEMPT_DYNAMIC PVE 5.19.17-2 (Sat, 28 Jan 2023 16:40:25) x86_64 GNU/Linux
Has anyone experienced the same issue?