Moving a hard disk from storage A to B takes ages

jgswinkels

Hi all,

Proxmox 8.2.4 is up and running, including all updates.
Source: QNAP NAS
Destination: PCIe M.2 4 TB SSD
IO delay at idle: ~0.1 to 0.2%

I shut down the VM and started moving its hard disk from the QNAP NAS to the PCIe M.2 4 TB SSD.
Disk size: 64 GB
The copy started at ~1 GB/s, and IO delay rose to ~10%.
It is STILL not finished after ONE HOUR!
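A quick back-of-the-envelope check (a sketch using only the figures above; nothing here queries Proxmox) shows how far off the observed time is from the initial rate:

```shell
#!/bin/sh
# Rough throughput sanity check for the 64 GiB move described above.
# All numbers are taken from this post.

SIZE_GIB=64
SIZE_MIB=$((SIZE_GIB * 1024))          # 65536 MiB

# At the initial ~1 GB/s (~1024 MiB/s) the copy should take about a minute:
FAST_SECS=$((SIZE_MIB / 1024))
echo "Expected at ~1 GB/s: ${FAST_SECS} s"

# If it is still running after an hour, the effective average rate is only:
ELAPSED_SECS=3600
RATE_MIB_S=$((SIZE_MIB / ELAPSED_SECS))
echo "Implied rate after 1 h: ~${RATE_MIB_S} MiB/s"
```

So the effective throughput is roughly 18 MiB/s, nowhere near the ~1 GB/s the copy started at.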
create full clone of drive scsi0 (qnap-data:120/vm-120-disk-0.qcow2)
Logical volume "vm-120-disk-0" created.
transferred 0.0 B of 64.0 GiB (0.00%)
transferred 661.9 MiB of 64.0 GiB (1.01%)
transferred 1.3 GiB of 64.0 GiB (2.01%)
transferred 1.9 GiB of 64.0 GiB (3.01%)
transferred 2.6 GiB of 64.0 GiB (4.01%)
transferred 3.2 GiB of 64.0 GiB (5.02%)
transferred 3.9 GiB of 64.0 GiB (6.02%)
transferred 4.5 GiB of 64.0 GiB (7.02%)
transferred 5.1 GiB of 64.0 GiB (8.02%)
.
.
transferred 61.0 GiB of 64.0 GiB (95.36%)
transferred 61.7 GiB of 64.0 GiB (96.36%)
transferred 62.3 GiB of 64.0 GiB (97.37%)
The transfer seems to be stuck??
.
.
The copy started at ~1 GB/s; now, an hour later, it is still running, with disk I/O at ~8 to 10%.
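To see what the destination disk is actually writing during the move, one option is to sample /proc/diskstats directly (a generic Linux sketch, not Proxmox-specific; the default device name is an assumption and should be replaced with the actual target NVMe device):

```shell
#!/bin/sh
# Estimate write throughput to a block device by sampling /proc/diskstats.
# Field 10 of /proc/diskstats counts 512-byte sectors written.
# Pass the device name as $1; the fallback to the first listed device is
# only so the sketch runs anywhere -- on the host you would use e.g. nvme0n1.

DEV=${1:-$(awk 'NR==1 {print $3}' /proc/diskstats)}
INTERVAL=2

read_sectors_written() {
    awk -v d="$DEV" '$3 == d { print $10 }' /proc/diskstats
}

S1=$(read_sectors_written)
sleep "$INTERVAL"
S2=$(read_sectors_written)

echo "$(( (S2 - S1) * 512 / INTERVAL / 1024 / 1024 )) MiB/s written to $DEV"
```

If this shows only a handful of MiB/s on the SSD while the move runs, the bottleneck is upstream (the NAS or the network), not the destination disk.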

This is what I see in the system log:


Aug 29 11:17:01 pve-p-102 CRON[4099237]: pam_unix(cron:session): session closed for user root
Aug 29 11:35:06 pve-p-102 smartd[1196]: Device: /dev/sda [SAT], SMART Usage Attribute: 190 Airflow_Temperature_Cel changed from 66 to 65
Aug 29 11:35:06 pve-p-102 smartd[1196]: Device: /dev/sdb [SAT], SMART Usage Attribute: 190 Airflow_Temperature_Cel changed from 69 to 68
Aug 29 12:05:06 pve-p-102 smartd[1196]: Device: /dev/sda [SAT], SMART Usage Attribute: 190 Airflow_Temperature_Cel changed from 65 to 67
Aug 29 12:05:06 pve-p-102 smartd[1196]: Device: /dev/sdb [SAT], SMART Usage Attribute: 190 Airflow_Temperature_Cel changed from 68 to 70
Aug 29 12:05:07 pve-p-102 pmxcfs[1519]: [dcdb] notice: data verification successful
Aug 29 12:13:59 pve-p-102 systemd[1]: Starting apt-daily.service - Daily apt download activities...
Aug 29 12:14:00 pve-p-102 systemd[1]: apt-daily.service: Deactivated successfully.
Aug 29 12:14:00 pve-p-102 systemd[1]: Finished apt-daily.service - Daily apt download activities.
Aug 29 12:17:01 pve-p-102 CRON[4117025]: pam_unix(cron:session): session opened for user root(uid=0) by (uid=0)
Aug 29 12:17:01 pve-p-102 CRON[4117026]: (root) CMD (cd / && run-parts --report /etc/cron.hourly)
Aug 29 12:17:01 pve-p-102 CRON[4117025]: pam_unix(cron:session): session closed for user root
Aug 29 12:26:08 pve-p-102 pvedaemon[1732761]: <root@pam> successful auth for user 'root@pam'
Aug 29 12:31:49 pve-p-102 pvedaemon[1738405]: <root@pam> starting task UPID:pve-p-102:003EE416:0A233CAE:66D04E15:qmmove:120:root@pam:
Aug 29 12:31:49 pve-p-102 pvedaemon[4121622]: <root@pam> move disk VM 120: move --disk scsi0 --storage PCIe-1-0168
Aug 29 12:35:06 pve-p-102 smartd[1196]: Device: /dev/sda [SAT], SMART Usage Attribute: 190 Airflow_Temperature_Cel changed from 67 to 66
Aug 29 12:35:06 pve-p-102 smartd[1196]: Device: /dev/sdb [SAT], SMART Usage Attribute: 190 Airflow_Temperature_Cel changed from 70 to 69
Aug 29 12:40:09 pve-p-102 pveproxy[1783053]: worker exit
Aug 29 12:40:09 pve-p-102 pveproxy[1679]: worker 1783053 finished
Aug 29 12:40:09 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 12:40:09 pve-p-102 pveproxy[1679]: worker 4124151 started
Aug 29 12:40:33 pve-p-102 pvedaemon[1745449]: worker exit
Aug 29 12:40:33 pve-p-102 pvedaemon[1668]: worker 1745449 finished
Aug 29 12:40:33 pve-p-102 pvedaemon[1668]: starting 1 worker(s)
Aug 29 12:40:33 pve-p-102 pvedaemon[1668]: worker 4124270 started
Aug 29 12:40:54 pve-p-102 pvedaemon[1732761]: <root@pam> successful auth for user 'root@pam'
Aug 29 12:41:27 pve-p-102 pvestatd[1652]: status update time (5.806 seconds)
Aug 29 12:42:38 pve-p-102 pveproxy[1783052]: worker exit
Aug 29 12:42:38 pve-p-102 pveproxy[1679]: worker 1783052 finished
Aug 29 12:42:38 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 12:42:38 pve-p-102 pveproxy[1679]: worker 4124873 started
Aug 29 12:42:49 pve-p-102 pvedaemon[1732761]: worker exit
Aug 29 12:42:49 pve-p-102 pvedaemon[1668]: worker 1732761 finished
Aug 29 12:42:49 pve-p-102 pvedaemon[1668]: starting 1 worker(s)
Aug 29 12:42:49 pve-p-102 pvedaemon[1668]: worker 4124921 started
Aug 29 12:43:09 pve-p-102 pvedaemon[4124270]: <root@pam> successful auth for user 'root@pam'
Aug 29 12:43:37 pve-p-102 pvedaemon[1738405]: worker exit
Aug 29 12:43:37 pve-p-102 pvedaemon[1668]: worker 1738405 finished
Aug 29 12:43:37 pve-p-102 pvedaemon[1668]: starting 1 worker(s)
Aug 29 12:43:37 pve-p-102 pvedaemon[1668]: worker 4125170 started
Aug 29 12:44:36 pve-p-102 pveproxy[1783051]: worker exit
Aug 29 12:44:36 pve-p-102 pveproxy[1679]: worker 1783051 finished
Aug 29 12:44:36 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 12:44:36 pve-p-102 pveproxy[1679]: worker 4125476 started
Aug 29 12:52:24 pve-p-102 pveproxy[4124873]: worker exit
Aug 29 12:52:24 pve-p-102 pveproxy[1679]: worker 4124873 finished
Aug 29 12:52:24 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 12:52:24 pve-p-102 pveproxy[1679]: worker 4127744 started
Aug 29 12:55:55 pve-p-102 pvedaemon[4124270]: <root@pam> successful auth for user 'root@pam'
Aug 29 12:56:31 pve-p-102 pveproxy[4124151]: worker exit
Aug 29 12:56:31 pve-p-102 pveproxy[1679]: worker 4124151 finished
Aug 29 12:56:31 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 12:56:31 pve-p-102 pveproxy[1679]: worker 4129129 started
Aug 29 12:58:10 pve-p-102 pvedaemon[4124270]: <root@pam> successful auth for user 'root@pam'
Aug 29 13:00:09 pve-p-102 pveproxy[4125476]: worker exit
Aug 29 13:00:09 pve-p-102 pveproxy[1679]: worker 4125476 finished
Aug 29 13:00:09 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 13:00:09 pve-p-102 pveproxy[1679]: worker 4130582 started
Aug 29 13:03:31 pve-p-102 pvedaemon[4124270]: worker exit
Aug 29 13:03:31 pve-p-102 pvedaemon[1668]: worker 4124270 finished
Aug 29 13:03:31 pve-p-102 pvedaemon[1668]: starting 1 worker(s)
Aug 29 13:03:31 pve-p-102 pvedaemon[1668]: worker 4131968 started
Aug 29 13:04:06 pve-p-102 pvedaemon[4132124]: starting termproxy UPID:pve-p-102:003F0D1C:0A263141:66D055A6:vncshell::root@pam:
Aug 29 13:04:06 pve-p-102 pvedaemon[4125170]: <root@pam> starting task UPID:pve-p-102:003F0D1C:0A263141:66D055A6:vncshell::root@pam:
Aug 29 13:04:06 pve-p-102 pvedaemon[4131968]: <root@pam> successful auth for user 'root@pam'
Aug 29 13:04:12 pve-p-102 systemd[1]: Stopping systemd-binfmt.service - Set Up Additional Binary Formats...
Aug 29 13:04:12 pve-p-102 systemd[1]: systemd-binfmt.service: Deactivated successfully.
Aug 29 13:04:12 pve-p-102 systemd[1]: Stopped systemd-binfmt.service - Set Up Additional Binary Formats.
Aug 29 13:04:12 pve-p-102 systemd[1]: Starting systemd-binfmt.service - Set Up Additional Binary Formats...
Aug 29 13:04:12 pve-p-102 systemd[1]: Finished systemd-binfmt.service - Set Up Additional Binary Formats.
Aug 29 13:04:12 pve-p-102 pveupgrade[4132127]: update new package list: /var/lib/pve-manager/pkgupdates
Aug 29 13:04:19 pve-p-102 pvedaemon[4125170]: <root@pam> end task UPID:pve-p-102:003F0D1C:0A263141:66D055A6:vncshell::root@pam: OK
Aug 29 13:05:07 pve-p-102 pmxcfs[1519]: [dcdb] notice: data verification successful
Aug 29 13:06:38 pve-p-102 pveproxy[4129129]: worker exit
Aug 29 13:06:38 pve-p-102 pveproxy[1679]: worker 4129129 finished
Aug 29 13:06:38 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 13:06:38 pve-p-102 pveproxy[1679]: worker 4133059 started
Aug 29 13:06:48 pve-p-102 pveproxy[4127744]: worker exit
Aug 29 13:06:48 pve-p-102 pveproxy[1679]: worker 4127744 finished
Aug 29 13:06:48 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 13:06:48 pve-p-102 pveproxy[1679]: worker 4133090 started
Aug 29 13:07:21 pve-p-102 pvedaemon[4125170]: worker exit
Aug 29 13:07:21 pve-p-102 pvedaemon[1668]: worker 4125170 finished
Aug 29 13:07:21 pve-p-102 pvedaemon[1668]: starting 1 worker(s)
Aug 29 13:07:21 pve-p-102 pvedaemon[1668]: worker 4133254 started
Aug 29 13:08:31 pve-p-102 pvedaemon[4124921]: worker exit
Aug 29 13:08:31 pve-p-102 pvedaemon[1668]: worker 4124921 finished
Aug 29 13:08:31 pve-p-102 pvedaemon[1668]: starting 1 worker(s)
Aug 29 13:08:31 pve-p-102 pvedaemon[1668]: worker 4133646 started
Aug 29 13:11:04 pve-p-102 pvedaemon[4133254]: <root@pam> successful auth for user 'root@pam'
Aug 29 13:13:10 pve-p-102 pvedaemon[4131968]: <root@pam> successful auth for user 'root@pam'
Aug 29 13:13:54 pve-p-102 pveproxy[4130582]: worker exit
Aug 29 13:13:54 pve-p-102 pveproxy[1679]: worker 4130582 finished
Aug 29 13:13:54 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 13:13:54 pve-p-102 pveproxy[1679]: worker 4135214 started
Aug 29 13:17:01 pve-p-102 CRON[4136565]: pam_unix(cron:session): session opened for user root(uid=0) by (uid=0)
Aug 29 13:17:01 pve-p-102 CRON[4136566]: (root) CMD (cd / && run-parts --report /etc/cron.hourly)
Aug 29 13:17:01 pve-p-102 CRON[4136565]: pam_unix(cron:session): session closed for user root
Aug 29 13:23:00 pve-p-102 pveproxy[4133059]: worker exit
Aug 29 13:23:00 pve-p-102 pveproxy[1679]: worker 4133059 finished
Aug 29 13:23:00 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 13:23:00 pve-p-102 pveproxy[1679]: worker 4138663 started
Aug 29 13:23:00 pve-p-102 pveproxy[1679]: worker 4133090 finished
Aug 29 13:23:00 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 13:23:00 pve-p-102 pveproxy[1679]: worker 4138664 started
Aug 29 13:23:01 pve-p-102 pveproxy[4138662]: worker exit
Aug 29 13:26:04 pve-p-102 pvedaemon[4133254]: <root@pam> successful auth for user 'root@pam'
Aug 29 13:27:48 pve-p-102 pvestatd[1652]: status update time (5.558 seconds)
Aug 29 13:27:52 pve-p-102 pveproxy[1679]: worker 4135214 finished
Aug 29 13:27:52 pve-p-102 pveproxy[1679]: starting 1 worker(s)
Aug 29 13:27:52 pve-p-102 pveproxy[1679]: worker 4140545 started
Aug 29 13:27:53 pve-p-102 pveproxy[4140544]: worker exit
Aug 29 13:28:10 pve-p-102 pvedaemon[4131968]: <root@pam> successful auth for user 'root@pam'


Question: what could be wrong?
 