Migration Problem

Hi guys
Today I found out that I need to update my nodes and reboot them. In order to prevent downtime on my production servers, I was live-migrating VMs one by one to other nodes in my cluster.

A migration problem occurred while migrating two of my VMs: they got rebooted and we experienced 2 minutes of downtime.
This is what I found in the error log for the "migration problem":

2018-05-17 16:55:54 migration status: completed
2018-05-17 16:55:56 ERROR: tunnel replied 'ERR: resume failed - VM 130 not running' to command 'resume 130'
2018-05-17 16:56:24 ERROR: migration finished with problems (duration 00:02:42)
TASK ERROR: migration problems


Any idea?
 

It looks like the VM "died" during migration. That could be because you tried to migrate from an updated node to one running older versions; we, and the projects we use (e.g. QEMU/KVM), only guarantee that migrating from old to new works. The other direction can work, but sometimes you may run into problems; that's not really avoidable, I'm afraid. So please either migrate before starting the update (preferred), or update all nodes first, then migrate one node free, reboot it, and do the same with the next node.

Could you please post the full task log? It should be available through the WebUI.
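
For reference, a minimal sketch of that order of operations (VMID 130 taken from your error above; 'node2' stands in for whichever node you migrate to):

Code:
# move each running guest off the node first, e.g.:
qm migrate 130 node2 --online
# once the node is empty, update and reboot it:
apt update
apt dist-upgrade
reboot
# then repeat with the next node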
 


Thank you for your response.
I was migrating the VMs before I attempted to update, and all of my nodes were running the same version. I was going to free one node to upgrade it.
It happens completely randomly; VMs suddenly die at the end of migration.
There is another interesting thing in the log: the average migration speed was "32.25 MB/s", which is quite slow. All of my servers are connected to each other and to the storage via 10 Gbps fiber, and the migration speed is usually more than 100 MB/s. Could this be because of network load or something like that?
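(For what it's worth, I could test the raw throughput between two nodes with something like the following, assuming iperf3 is installed on both; 172.27.3.12 is the target node from the task log below.)

Code:
# on the target node, start a listener:
iperf3 -s
# on the source node, point it at the target's migration address:
iperf3 -c 172.27.3.12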
Here is the full task log:


task started by HA resource agent
2018-05-17 16:53:42 starting migration of VM 130 to node 'master3' (172.27.3.12)
2018-05-17 16:53:42 copying disk images
2018-05-17 16:53:42 starting VM 130 on remote node 'master3'
2018-05-17 16:53:46 start remote tunnel
2018-05-17 16:53:47 ssh tunnel ver 1
2018-05-17 16:53:47 starting online/live migration on unix:/run/qemu-server/130.migrate
2018-05-17 16:53:47 migrate_set_speed: 8589934592
2018-05-17 16:53:47 migrate_set_downtime: 0.1
2018-05-17 16:53:47 set migration_caps
2018-05-17 16:53:47 set cachesize: 536870912
2018-05-17 16:53:47 start migrate command to unix:/run/qemu-server/130.migrate
2018-05-17 16:53:48 migration status: active (transferred 70251553, remaining 4220518400), total 4312604672)
2018-05-17 16:53:48 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:49 migration status: active (transferred 130150487, remaining 4110000128), total 4312604672)
2018-05-17 16:53:49 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:50 migration status: active (transferred 195151431, remaining 3984736256), total 4312604672)
2018-05-17 16:53:50 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:51 migration status: active (transferred 285214946, remaining 3830980608), total 4312604672)
2018-05-17 16:53:51 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:52 migration status: active (transferred 327360025, remaining 3777282048), total 4312604672)
2018-05-17 16:53:52 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:53 migration status: active (transferred 383475298, remaining 3702120448), total 4312604672)
2018-05-17 16:53:53 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:54 migration status: active (transferred 462977098, remaining 3597139968), total 4312604672)
2018-05-17 16:53:54 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:55 migration status: active (transferred 546651622, remaining 3482877952), total 4312604672)
2018-05-17 16:53:55 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:56 migration status: active (transferred 586763781, remaining 3428134912), total 4312604672)
2018-05-17 16:53:56 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:57 migration status: active (transferred 612489946, remaining 3390275584), total 4312604672)
2018-05-17 16:53:57 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:58 migration status: active (transferred 627171258, remaining 3369476096), total 4312604672)
2018-05-17 16:53:58 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:53:59 migration status: active (transferred 641823908, remaining 3348664320), total 4312604672)
2018-05-17 16:53:59 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:00 migration status: active (transferred 658287701, remaining 3325460480), total 4312604672)
2018-05-17 16:54:00 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:01 migration status: active (transferred 667197056, remaining 3313057792), total 4312604672)
2018-05-17 16:54:01 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:02 migration status: active (transferred 673169155, remaining 3304894464), total 4312604672)
2018-05-17 16:54:02 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:03 migration status: active (transferred 677712019, remaining 3298627584), total 4312604672)
2018-05-17 16:54:03 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:04 migration status: active (transferred 683967384, remaining 3290140672), total 4312604672)
2018-05-17 16:54:04 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:05 migration status: active (transferred 687232555, remaining 3285757952), total 4312604672)
2018-05-17 16:54:05 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:06 migration status: active (transferred 708720511, remaining 3258998784), total 4312604672)
2018-05-17 16:54:06 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:07 migration status: active (transferred 712365923, remaining 3254886400), total 4312604672)
2018-05-17 16:54:07 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:08 migration status: active (transferred 723285027, remaining 3242876928), total 4312604672)
2018-05-17 16:54:08 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:09 migration status: active (transferred 725763977, remaining 3240345600), total 4312604672)
2018-05-17 16:54:09 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:10 migration status: active (transferred 732270536, remaining 3231215616), total 4312604672)
2018-05-17 16:54:10 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:11 migration status: active (transferred 743724150, remaining 3214495744), total 4312604672)
2018-05-17 16:54:11 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:12 migration status: active (transferred 877832963, remaining 2924929024), total 4312604672)
2018-05-17 16:54:12 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:13 migration status: active (transferred 1048388534, remaining 2677669888), total 4312604672)
2018-05-17 16:54:13 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:14 migration status: active (transferred 1259560438, remaining 2440622080), total 4312604672)
2018-05-17 16:54:14 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:15 migration status: active (transferred 1444577398, remaining 2194333696), total 4312604672)
2018-05-17 16:54:15 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:16 migration status: active (transferred 1486246530, remaining 2142896128), total 4312604672)
2018-05-17 16:54:16 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:17 migration status: active (transferred 1499816916, remaining 2126336000), total 4312604672)
2018-05-17 16:54:17 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:18 migration status: active (transferred 1512522448, remaining 2112004096), total 4312604672)
2018-05-17 16:54:18 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:19 migration status: active (transferred 1514690025, remaining 2109542400), total 4312604672)
2018-05-17 16:54:19 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:20 migration status: active (transferred 1521968323, remaining 2101420032), total 4312604672)
2018-05-17 16:54:20 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:21 migration status: active (transferred 1523807670, remaining 2099245056), total 4312604672)
2018-05-17 16:54:21 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:22 migration status: active (transferred 1561382106, remaining 2055122944), total 4312604672)
2018-05-17 16:54:22 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:23 migration status: active (transferred 1563094247, remaining 2053066752), total 4312604672)
2018-05-17 16:54:23 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:24 migration status: active (transferred 1566584022, remaining 2048966656), total 4312604672)
2018-05-17 16:54:24 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:25 migration status: active (transferred 1593810627, remaining 2017783808), total 4312604672)
2018-05-17 16:54:25 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:26 migration status: active (transferred 1630273851, remaining 1974329344), total 4312604672)
2018-05-17 16:54:26 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:27 migration status: active (transferred 1716014048, remaining 1835016192), total 4312604672)
2018-05-17 16:54:27 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:28 migration status: active (transferred 1765662749, remaining 1752616960), total 4312604672)
2018-05-17 16:54:28 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:29 migration status: active (transferred 1792265673, remaining 1714540544), total 4312604672)
2018-05-17 16:54:29 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:30 migration status: active (transferred 1820413149, remaining 1674268672), total 4312604672)
2018-05-17 16:54:30 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:31 migration status: active (transferred 1846775160, remaining 1630289920), total 4312604672)
2018-05-17 16:54:31 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:32 migration status: active (transferred 1882103753, remaining 1585127424), total 4312604672)
2018-05-17 16:54:32 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:33 migration status: active (transferred 1907472595, remaining 1551462400), total 4312604672)
2018-05-17 16:54:33 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:34 migration status: active (transferred 1925938576, remaining 1526505472), total 4312604672)
2018-05-17 16:54:34 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:35 migration status: active (transferred 1958884815, remaining 1484636160), total 4312604672)
2018-05-17 16:54:35 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:36 migration status: active (transferred 2091218856, remaining 1315086336), total 4312604672)
2018-05-17 16:54:36 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:37 migration status: active (transferred 2112500323, remaining 1287262208), total 4312604672)
2018-05-17 16:54:37 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:38 migration status: active (transferred 2130432980, remaining 1264619520), total 4312604672)
2018-05-17 16:54:38 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:39 migration status: active (transferred 2170953086, remaining 1214296064), total 4312604672)
2018-05-17 16:54:39 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:40 migration status: active (transferred 2212496238, remaining 1168027648), total 4312604672)
2018-05-17 16:54:40 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:41 migration status: active (transferred 2267206225, remaining 1096900608), total 4312604672)
2018-05-17 16:54:41 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:42 migration status: active (transferred 2310800982, remaining 1039450112), total 4312604672)
2018-05-17 16:54:42 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:43 migration status: active (transferred 2389254792, remaining 949207040), total 4312604672)
2018-05-17 16:54:43 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:44 migration status: active (transferred 2502446323, remaining 777039872), total 4312604672)
2018-05-17 16:54:44 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:45 migration status: active (transferred 2600391286, remaining 629514240), total 4312604672)
2018-05-17 16:54:45 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:46 migration status: active (transferred 2667984546, remaining 530231296), total 4312604672)
2018-05-17 16:54:46 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:47 migration status: active (transferred 2672379506, remaining 524177408), total 4312604672)
2018-05-17 16:54:47 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:48 migration status: active (transferred 2687765310, remaining 503279616), total 4312604672)
2018-05-17 16:54:48 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:49 migration status: active (transferred 2699879128, remaining 486146048), total 4312604672)
2018-05-17 16:54:49 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:50 migration status: active (transferred 2704340022, remaining 479903744), total 4312604672)
2018-05-17 16:54:50 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:51 migration status: active (transferred 2710259510, remaining 471445504), total 4312604672)
2018-05-17 16:54:51 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:52 migration status: active (transferred 2716039165, remaining 463261696), total 4312604672)
2018-05-17 16:54:52 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:53 migration status: active (transferred 2722196217, remaining 454778880), total 4312604672)
2018-05-17 16:54:53 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:54 migration status: active (transferred 2725272605, remaining 450580480), total 4312604672)
2018-05-17 16:54:54 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:55 migration status: active (transferred 2729906200, remaining 444014592), total 4312604672)
2018-05-17 16:54:55 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:56 migration status: active (transferred 2731372749, remaining 441909248), total 4312604672)
2018-05-17 16:54:56 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:57 migration status: active (transferred 2732728247, remaining 440025088), total 4312604672)
2018-05-17 16:54:57 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:58 migration status: active (transferred 2737201723, remaining 433647616), total 4312604672)
2018-05-17 16:54:58 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:54:59 migration status: active (transferred 2739945831, remaining 429715456), total 4312604672)
2018-05-17 16:54:59 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:00 migration status: active (transferred 2744419055, remaining 423452672), total 4312604672)
2018-05-17 16:55:00 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:01 migration status: active (transferred 2749179496, remaining 416931840), total 4312604672)
2018-05-17 16:55:01 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:02 migration status: active (transferred 2752170005, remaining 412684288), total 4312604672)
2018-05-17 16:55:02 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:03 migration status: active (transferred 2753755732, remaining 410386432), total 4312604672)
2018-05-17 16:55:03 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:04 migration status: active (transferred 2755086633, remaining 408514560), total 4312604672)
2018-05-17 16:55:04 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:05 migration status: active (transferred 2765585795, remaining 395673600), total 4312604672)
2018-05-17 16:55:05 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:06 migration status: active (transferred 2768350063, remaining 391884800), total 4312604672)
2018-05-17 16:55:06 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:07 migration status: active (transferred 2777633182, remaining 378961920), total 4312604672)
2018-05-17 16:55:07 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:08 migration status: active (transferred 2783347046, remaining 370905088), total 4312604672)
2018-05-17 16:55:08 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:09 migration status: active (transferred 2786682102, remaining 366399488), total 4312604672)
2018-05-17 16:55:09 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:10 migration status: active (transferred 2794148734, remaining 356429824), total 4312604672)
2018-05-17 16:55:10 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:11 migration status: active (transferred 2800129745, remaining 347930624), total 4312604672)
2018-05-17 16:55:11 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:12 migration status: active (transferred 2804878207, remaining 341270528), total 4312604672)
2018-05-17 16:55:12 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:13 migration status: active (transferred 2809437227, remaining 335101952), total 4312604672)
2018-05-17 16:55:13 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:14 migration status: active (transferred 2817162950, remaining 324632576), total 4312604672)
2018-05-17 16:55:14 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:15 migration status: active (transferred 2823037068, remaining 316325888), total 4312604672)
2018-05-17 16:55:15 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:16 migration status: active (transferred 2829206314, remaining 307888128), total 4312604672)
2018-05-17 16:55:16 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:17 migration status: active (transferred 2841134945, remaining 291180544), total 4312604672)
2018-05-17 16:55:17 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:19 migration status: active (transferred 2850442606, remaining 278274048), total 4312604672)
2018-05-17 16:55:19 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:20 migration status: active (transferred 2854878713, remaining 272134144), total 4312604672)
2018-05-17 16:55:20 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:21 migration status: active (transferred 2858008722, remaining 267763712), total 4312604672)
2018-05-17 16:55:21 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:22 migration status: active (transferred 2863767793, remaining 259633152), total 4312604672)
2018-05-17 16:55:22 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:23 migration status: active (transferred 2872640449, remaining 247148544), total 4312604672)
2018-05-17 16:55:23 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:24 migration status: active (transferred 2877248781, remaining 240898048), total 4312604672)
2018-05-17 16:55:24 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:25 migration status: active (transferred 2882021542, remaining 234364928), total 4312604672)
2018-05-17 16:55:25 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:26 migration status: active (transferred 2886285660, remaining 228220928), total 4312604672)
2018-05-17 16:55:26 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:27 migration status: active (transferred 2898246991, remaining 211529728), total 4312604672)
2018-05-17 16:55:27 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:28 migration status: active (transferred 2901319643, remaining 207171584), total 4312604672)
2018-05-17 16:55:28 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:29 migration status: active (transferred 2906108648, remaining 200704000), total 4312604672)
2018-05-17 16:55:29 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:30 migration status: active (transferred 2909045661, remaining 196575232), total 4312604672)
2018-05-17 16:55:30 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:31 migration status: active (transferred 2912146771, remaining 192311296), total 4312604672)
2018-05-17 16:55:31 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:32 migration status: active (transferred 2915190165, remaining 188219392), total 4312604672)
2018-05-17 16:55:32 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:33 migration status: active (transferred 2921347062, remaining 179814400), total 4312604672)
2018-05-17 16:55:33 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:34 migration status: active (transferred 2927413203, remaining 171712512), total 4312604672)
2018-05-17 16:55:34 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:35 migration status: active (transferred 2930641708, remaining 167243776), total 4312604672)
2018-05-17 16:55:35 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:36 migration status: active (transferred 2939567959, remaining 154595328), total 4312604672)
2018-05-17 16:55:36 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:37 migration status: active (transferred 2941227783, remaining 152121344), total 4312604672)
2018-05-17 16:55:37 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:38 migration status: active (transferred 2950079748, remaining 139735040), total 4312604672)
2018-05-17 16:55:38 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:39 migration status: active (transferred 2957456821, remaining 129523712), total 4312604672)
2018-05-17 16:55:39 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:40 migration status: active (transferred 2963753183, remaining 121008128), total 4312604672)
2018-05-17 16:55:40 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:41 migration status: active (transferred 2966727195, remaining 116813824), total 4312604672)
2018-05-17 16:55:41 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:42 migration status: active (transferred 2971196323, remaining 110555136), total 4312604672)
2018-05-17 16:55:42 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:43 migration status: active (transferred 2980130108, remaining 98201600), total 4312604672)
2018-05-17 16:55:43 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:44 migration status: active (transferred 2983244152, remaining 93638656), total 4312604672)
2018-05-17 16:55:44 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:45 migration status: active (transferred 2986164525, remaining 89624576), total 4312604672)
2018-05-17 16:55:45 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:46 migration status: active (transferred 2997924939, remaining 73056256), total 4312604672)
2018-05-17 16:55:46 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:47 migration status: active (transferred 3001079582, remaining 68653056), total 4312604672)
2018-05-17 16:55:47 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:48 migration status: active (transferred 3005388978, remaining 62406656), total 4312604672)
2018-05-17 16:55:48 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:49 migration status: active (transferred 3009874550, remaining 56115200), total 4312604672)
2018-05-17 16:55:49 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:50 migration status: active (transferred 3016002414, remaining 47874048), total 4312604672)
2018-05-17 16:55:50 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:51 migration status: active (transferred 3022237252, remaining 39403520), total 4312604672)
2018-05-17 16:55:51 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 0 overflow 0
2018-05-17 16:55:52 migration status: active (transferred 3110872417, remaining 370089984), total 4312604672)
2018-05-17 16:55:52 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 17356 overflow 0
2018-05-17 16:55:53 migration status: active (transferred 3316878676, remaining 141357056), total 4312604672)
2018-05-17 16:55:53 migration xbzrle cachesize: 536870912 transferred 0 pages 0 cachemiss 67540 overflow 0
2018-05-17 16:55:54 migration speed: 32.25 MB/s - downtime 31 ms
2018-05-17 16:55:54 migration status: completed
2018-05-17 16:55:56 ERROR: tunnel replied 'ERR: resume failed - VM 130 not running' to command 'resume 130'
2018-05-17 16:56:24 ERROR: migration finished with problems (duration 00:02:42)
TASK ERROR: migration problems
 
Thank you for your response.
I was migrating the VMs before I attempted to update, and all of my nodes were running the same version. I was going to free one node to upgrade it.
It happens completely randomly; VMs suddenly die at the end of migration.

OK, you have done everything correctly.

There is another interesting thing in the log: the average migration speed was "32.25 MB/s", which is quite slow. All of my servers are connected to each other and to the storage via 10 Gbps fiber, and the migration speed is usually more than 100 MB/s. Could this be because of network load or something like that?
Here is the full task log:

Hmm, looks weird... That the migration was so slow is strange (network congestion?), but it should really not lead to a dead VM on finish...
Can you also please post your pveversion -v output, or at least tell me which kernel and QEMU versions you used at the time this happened? Those are the usual culprits, and I could try to reproduce this...
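
For reference, those could be gathered like this (VMID 130 taken from your log; adjust as needed):

Code:
# kernel and QEMU package versions on the node:
uname -r
pveversion -v | grep -E 'pve-kernel|pve-qemu'
# QEMU version a still-running VM was started with:
qm monitor 130
# then, at the qm> prompt:
info version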
 
This is the output of pveversion -v now; I don't know what it was before the update.

proxmox-ve: 5.2-2 (running kernel: 4.15.17-1-pve)
pve-manager: 5.2-1 (running version: 5.2-1/0fcd7879)
pve-kernel-4.15: 5.2-1
pve-kernel-4.13: 5.1-44
pve-kernel-4.15.17-1-pve: 4.15.17-9
pve-kernel-4.13.16-2-pve: 4.13.16-48
pve-kernel-4.13.13-6-pve: 4.13.13-42
pve-kernel-4.13.13-5-pve: 4.13.13-38
pve-kernel-4.13.13-4-pve: 4.13.13-35
pve-kernel-4.13.4-1-pve: 4.13.4-26
corosync: 2.4.2-pve5
criu: 2.11.1-1~bpo90
glusterfs-client: 3.8.8-1
ksm-control-daemon: 1.2-2
libjs-extjs: 6.0.1-2
libpve-access-control: 5.0-8
libpve-apiclient-perl: 2.0-4
libpve-common-perl: 5.0-31
libpve-guest-common-perl: 2.0-16
libpve-http-server-perl: 2.0-8
libpve-storage-perl: 5.0-23
libqb0: 1.0.1-1
lvm2: 2.02.168-pve6
lxc-pve: 3.0.0-3
lxcfs: 3.0.0-1
novnc-pve: 0.6-4
proxmox-widget-toolkit: 1.0-18
pve-cluster: 5.0-27
pve-container: 2.0-23
pve-docs: 5.2-4
pve-firewall: 3.0-8
pve-firmware: 2.0-4
pve-ha-manager: 2.0-5
pve-i18n: 1.0-5
pve-libspice-server1: 0.12.8-3
pve-qemu-kvm: 2.11.1-5
pve-xtermjs: 1.0-5
qemu-server: 5.0-26
smartmontools: 6.5+svn4324-1
spiceterm: 3.0-5
vncterm: 1.5-3
zfsutils-linux: 0.7.8-pve1~bpo9
 
OK guys, today I was live-migrating the VMs again; out of every 10 VMs, 7 ended up dead with the following error.
Our live migration seems not to be usable anymore.

2018-06-19 17:29:04 migration status: active (transferred 5753698414, remaining 89804800), total 8607571968)
2018-06-19 17:29:04 migration xbzrle cachesize: 1073741824 transferred 0 pages 0 cachemiss 0 overflow 0
2018-06-19 17:29:05 migration status: active (transferred 5860837536, remaining 243789824), total 8607571968)
2018-06-19 17:29:05 migration xbzrle cachesize: 1073741824 transferred 0 pages 0 cachemiss 15929 overflow 0
2018-06-19 17:29:06 migration speed: 115.38 MB/s - downtime 336 ms
2018-06-19 17:29:06 migration status: completed
2018-06-19 17:29:06 ERROR: tunnel replied 'ERR: resume failed - VM 146 not running' to command 'resume 146'
2018-06-19 17:29:10 ERROR: migration finished with problems (duration 00:01:22)
TASK ERROR: migration problems
 
Hello,

We experience the same, but only from an older version to a newer version. We installed a new node which is up to date, and I cannot migrate to this node from the older nodes.

New node:
Code:
proxmox-ve: 5.2-2 (running kernel: 4.15.17-1-pve)
pve-manager: 5.2-3 (running version: 5.2-3/785ba980)
pve-kernel-4.15: 5.2-3
pve-kernel-4.15.17-3-pve: 4.15.17-13
pve-kernel-4.15.17-1-pve: 4.15.17-8
corosync: 2.4.2-pve5
criu: 2.11.1-1~bpo90
glusterfs-client: 3.8.8-1
ksm-control-daemon: not correctly installed
libjs-extjs: 6.0.1-2
libpve-access-control: 5.0-8
libpve-apiclient-perl: 2.0-4
libpve-common-perl: 5.0-34
libpve-guest-common-perl: 2.0-17
libpve-http-server-perl: 2.0-9
libpve-storage-perl: 5.0-23
libqb0: 1.0.1-1
lvm2: 2.02.168-pve6
lxc-pve: 3.0.0-3
lxcfs: 3.0.0-1
novnc-pve: 1.0.0-1
openvswitch-switch: 2.7.0-2
proxmox-widget-toolkit: 1.0-19
pve-cluster: 5.0-27
pve-container: 2.0-23
pve-docs: 5.2-4
pve-firewall: 3.0-12
pve-firmware: 2.0-4
pve-ha-manager: 2.0-5
pve-i18n: 1.0-6
pve-libspice-server1: 0.12.8-3
pve-qemu-kvm: 2.11.1-5
pve-xtermjs: 1.0-5
qemu-server: 5.0-29
smartmontools: 6.5+svn4324-1
spiceterm: 3.0-5
vncterm: 1.5-3
Older node:
Code:
proxmox-ve: 5.2-2 (running kernel: 4.15.17-1-pve)
pve-manager: 5.2-1 (running version: 5.2-1/0fcd7879)
pve-kernel-4.15: 5.2-1
pve-kernel-4.15.17-1-pve: 4.15.17-9
pve-kernel-4.15.15-1-pve: 4.15.15-6
corosync: 2.4.2-pve5
criu: 2.11.1-1~bpo90
glusterfs-client: 3.8.8-1
ksm-control-daemon: not correctly installed
libjs-extjs: 6.0.1-2
libpve-access-control: 5.0-8
libpve-apiclient-perl: 2.0-4
libpve-common-perl: 5.0-31
libpve-guest-common-perl: 2.0-16
libpve-http-server-perl: 2.0-8
libpve-storage-perl: 5.0-23
libqb0: 1.0.1-1
lvm2: 2.02.168-pve6
lxc-pve: 3.0.0-3
lxcfs: 3.0.0-1
novnc-pve: 0.6-4
openvswitch-switch: 2.7.0-2
proxmox-widget-toolkit: 1.0-18
pve-cluster: 5.0-27
pve-container: 2.0-23
pve-docs: 5.2-4
pve-firewall: 3.0-8
pve-firmware: 2.0-4
pve-ha-manager: 2.0-5
pve-i18n: 1.0-5
pve-libspice-server1: 0.12.8-3
pve-qemu-kvm: 2.11.1-5
pve-xtermjs: 1.0-5
qemu-server: 5.0-26
smartmontools: 6.5+svn4324-1
spiceterm: 3.0-5
vncterm: 1.5-3

The strange thing is: if I start the VM on the new node and migrate it back to the old one, afterwards I can migrate it to the new node without issue.
 
Hello everyone,

We have encountered the same issue. We have a cluster of 8 nodes, and several days ago a new node was installed and added to the cluster.

versions output from the older node, which was updated:


proxmox-ve: 5.2-2 (running kernel: 4.15.17-3-pve)
pve-manager: 5.2-5 (running version: 5.2-5/eb24855a)
pve-kernel-4.15: 5.2-3
pve-kernel-4.15.17-3-pve: 4.15.17-14
pve-kernel-4.13.13-5-pve: 4.13.13-38
pve-kernel-4.13.4-1-pve: 4.13.4-26
corosync: 2.4.2-pve5
criu: 2.11.1-1~bpo90
glusterfs-client: 3.8.8-1
ksm-control-daemon: 1.2-2
libjs-extjs: 6.0.1-2
libpve-access-control: 5.0-8
libpve-apiclient-perl: 2.0-5
libpve-common-perl: 5.0-35
libpve-guest-common-perl: 2.0-17
libpve-http-server-perl: 2.0-9
libpve-storage-perl: 5.0-23
libqb0: 1.0.1-1
lvm2: 2.02.168-pve6
lxc-pve: 3.0.0-3
lxcfs: 3.0.0-1
novnc-pve: 1.0.0-1
proxmox-widget-toolkit: 1.0-19
pve-cluster: 5.0-27
pve-container: 2.0-24
pve-docs: 5.2-4
pve-firewall: 3.0-12
pve-firmware: 2.0-4
pve-ha-manager: 2.0-5
pve-i18n: 1.0-6
pve-libspice-server1: 0.12.8-3
pve-qemu-kvm: 2.11.1-5
pve-xtermjs: 1.0-5
qemu-server: 5.0-29
smartmontools: 6.5+svn4324-1
spiceterm: 3.0-5
vncterm: 1.5-3
zfsutils-linux: 0.7.9-pve1~bpo9

versions output from the newly installed node:


proxmox-ve: 5.2-2 (running kernel: 4.15.17-3-pve)
pve-manager: 5.2-5 (running version: 5.2-5/eb24855a)
pve-kernel-4.15: 5.2-3
pve-kernel-4.15.17-3-pve: 4.15.17-14
pve-kernel-4.13.13-2-pve: 4.13.13-33
corosync: 2.4.2-pve5
criu: 2.11.1-1~bpo90
glusterfs-client: 3.8.8-1
ksm-control-daemon: 1.2-2
libjs-extjs: 6.0.1-2
libpve-access-control: 5.0-8
libpve-apiclient-perl: 2.0-5
libpve-common-perl: 5.0-35
libpve-guest-common-perl: 2.0-17
libpve-http-server-perl: 2.0-9
libpve-storage-perl: 5.0-23
libqb0: 1.0.1-1
lvm2: 2.02.168-pve6
lxc-pve: 3.0.0-3
lxcfs: 3.0.0-1
novnc-pve: 1.0.0-1
proxmox-widget-toolkit: 1.0-19
pve-cluster: 5.0-27
pve-container: 2.0-24
pve-docs: 5.2-4
pve-firewall: 3.0-12
pve-firmware: 2.0-4
pve-ha-manager: 2.0-5
pve-i18n: 1.0-6
pve-libspice-server1: 0.12.8-3
pve-qemu-kvm: 2.11.1-5
pve-xtermjs: 1.0-5
qemu-server: 5.0-29
smartmontools: 6.5+svn4324-1
spiceterm: 3.0-5
vncterm: 1.5-3
zfsutils-linux: 0.7.9-pve1~bpo9
 
Now I have updated one node to the latest version; I tried to migrate from an older node to the upgraded one, and I got the same error. But I don't see the reason, nor any solution or even a workaround.
 
@t.lamprecht
Are there any updates?
We have the same problem.

All nodes have the same versions - we do not plan to update.

Code:
proxmox-ve: 5.2-2 (running kernel: 4.15.18-7-pve)
pve-manager: 5.2-9 (running version: 5.2-9/4b30e8f9)
pve-kernel-4.15: 5.2-10
pve-kernel-4.15.18-7-pve: 4.15.18-27
pve-kernel-4.15.18-5-pve: 4.15.18-24
pve-kernel-4.15.18-3-pve: 4.15.18-22
pve-kernel-4.15.17-1-pve: 4.15.17-9
ceph: 12.2.8-pve1
corosync: 2.4.2-pve5
criu: 2.11.1-1~bpo90
glusterfs-client: 3.8.8-1
ksm-control-daemon: 1.2-2
libjs-extjs: 6.0.1-2
libpve-access-control: 5.0-8
libpve-apiclient-perl: 2.0-5
libpve-common-perl: 5.0-40
libpve-guest-common-perl: 2.0-18
libpve-http-server-perl: 2.0-11
libpve-storage-perl: 5.0-30
libqb0: 1.0.1-1
lvm2: 2.02.168-pve6
lxc-pve: 3.0.2+pve1-2
lxcfs: 3.0.2-2
novnc-pve: 1.0.0-2
proxmox-widget-toolkit: 1.0-20
pve-cluster: 5.0-30
pve-container: 2.0-28
pve-docs: 5.2-8
pve-firewall: 3.0-14
pve-firmware: 2.0-5
pve-ha-manager: 2.0-5
pve-i18n: 1.0-6
pve-libspice-server1: 0.12.8-3
pve-qemu-kvm: 2.11.2-1
pve-xtermjs: 1.0-5
qemu-server: 5.0-36
smartmontools: 6.5+svn4324-1
spiceterm: 3.0-5
vncterm: 1.5-3
zfsutils-linux: 0.7.11-pve1~bpo1
 
All nodes have the same versions - we do not plan to update

We only support the latest rolled-out versions and cannot inject fixes into older package versions, so if fixes and security updates are desired, updating is recommended.

We could not really reproduce this here yet, I'm afraid.
Can you post the VM configuration of an affected VM, and logs from journal/syslog from target and source around the time of the migrations?
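
For example (the VMID and the time window are placeholders; use the affected VM and the actual migration time from the task log):

Code:
# configuration of the affected VM:
qm config 100
# journal around the migration window, on both source and target node:
journalctl --since "2018-11-01 16:40" --until "2018-11-01 17:00"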
 
@t.lamprecht
Config:
Code:
boot: cn
bootdisk: scsi0
cores: 8
cpu: host
memory: 65536
name: mdarchive-simu
net0: virtio=E6:15:AA:FA:XX:XX,bridge=vmbr112
net1: virtio=C6:56:F1:35:XX:XX,bridge=vmbr10,queues=16
numa: 1
ostype: l26
scsi0: NVME:vm-8120-disk-0,size=200G
scsi1: HDD:vm-8120-disk-0,size=1200G
scsihw: virtio-scsi-pci
smbios1: uuid=2f0052da-33f2-4de0-b52e-82b15c5aa33c
sockets: 2
vmgenid: f8ddc0dc-8563-4367-a290-33edd8571e7d

Migration Error:
Code:
2019-02-04 16:43:27 use dedicated network address for sending migration traffic (192.168.24.26)
2019-02-04 16:43:27 starting migration of VM 8120 to node 'sv18006' (192.168.24.26)
2019-02-04 16:43:27 copying disk images
2019-02-04 16:43:27 starting VM 8120 on remote node 'sv18006'
2019-02-04 16:43:30 start remote tunnel
2019-02-04 16:43:31 ssh tunnel ver 1
2019-02-04 16:43:31 starting online/live migration on tcp:192.168.24.26:60000
2019-02-04 16:43:31 migrate_set_speed: 8589934592
2019-02-04 16:43:31 migrate_set_downtime: 0.1
2019-02-04 16:43:31 set migration_caps
2019-02-04 16:43:31 set cachesize: 8589934592
2019-02-04 16:43:31 start migrate command to tcp:192.168.24.26:60000
2019-02-04 16:43:32 migration status: active (transferred 147389540, remaining 66060132352), total 68737376256)
2019-02-04 16:43:32 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 0 overflow 0
2019-02-04 16:43:33 migration status: active (transferred 807991392, remaining 63748997120), total 68737376256)
2019-02-04 16:43:33 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 0 overflow 0
2019-02-04 16:43:34 migration status: active (transferred 1649702891, remaining 61616726016), total 68737376256)
2019-02-04 16:43:34 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 0 overflow 0
2019-02-04 16:43:35 migration status: active (transferred 2407340818, remaining 59428143104), total 68737376256)
2019-02-04 16:43:35 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 0 overflow 0
2019-02-04 16:43:36 migration status: active (transferred 3179591243, remaining 57203458048), total 68737376256)
2019-02-04 16:43:36 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 0 overflow 0
2019-02-04 16:43:37 migration status: active (transferred 3915683638, remaining 55034318848), total 68737376256)
2019-02-04 16:43:37 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 0 overflow 0
2019-02-04 16:43:38 migration status: active (transferred 4678600441, remaining 52832522240), total 68737376256)
2019-02-04 16:43:38 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 0 overflow 0
2019-02-04 16:43:39 migration status: active (transferred 5443008330, remaining 50626752512), total 68737376256)
XXXXXXXXXXXXXXXXXXXx
2019-02-04 16:44:14 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 283740 overflow 0
2019-02-04 16:44:15 migration status: active (transferred 36210839985, remaining 3715715072), total 68737376256)
2019-02-04 16:44:15 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 321284 overflow 0
2019-02-04 16:44:16 migration status: active (transferred 36385017820, remaining 1178808320), total 68737376256)
2019-02-04 16:44:16 migration xbzrle cachesize: 8589934592 transferred 0 pages 0 cachemiss 362457 overflow 0
2019-02-04 16:44:17 migration status: active (transferred 36993136489, remaining 11631550464), total 68737376256)
2019-02-04 16:44:17 migration xbzrle cachesize: 8589934592 transferred 308755 pages 448 cachemiss 510239 overflow 0
2019-02-04 16:44:18 migration status: active (transferred 37032055238, remaining 9091629056), total 68737376256)
XXXXXXXX
2019-02-04 16:44:28 migration xbzrle cachesize: 8589934592 transferred 98521538 pages 492452 cachemiss 1210038 overflow 5416
2019-02-04 16:44:29 migration status: active (transferred 40058290281, remaining 2955366400), total 68737376256)
2019-02-04 16:44:29 migration xbzrle cachesize: 8589934592 transferred 98695154 pages 493415 cachemiss 1215378 overflow 5424
2019-02-04 16:44:29 migrate_set_downtime: 0.2
2019-02-04 16:44:29 migration status: active (transferred 40061613610, remaining 2703360000), total 68737376256)
2019-02-04 16:44:29 migration xbzrle cachesize: 8589934592 transferred 99977751 pages 516104 cachemiss 1215747 overflow 5475
XXXXXXX
2019-02-04 16:44:34 migration status: active (transferred 41639746608, remaining 843853824), total 68737376256)
2019-02-04 16:44:34 migration xbzrle cachesize: 8589934592 transferred 154059526 pages 1104123 cachemiss 1582281 overflow 7249
2019-02-04 16:44:34 migrate_set_downtime: 0.4
2019-02-04 16:44:34 migration status: active (transferred 41712824659, remaining 641769472), total 68737376256)
2019-02-04 16:44:34 migration xbzrle cachesize: 8589934592 transferred 155919011 pages 1132028 cachemiss 1599627 overflow 7252
XXXXXXXX
2019-02-04 16:44:36 migration xbzrle cachesize: 8589934592 transferred 179653558 pages 1444310 cachemiss 1726102 overflow 7723
2019-02-04 16:44:36 migrate_set_downtime: 0.8
2019-02-04 16:44:36 migration status: active (transferred 42293663830, remaining 141410304), total 68737376256)
2019-02-04 16:44:36 migration xbzrle cachesize: 8589934592 transferred 181190902 pages 1460966 cachemiss 1733734 overflow 7734
2019-02-04 16:44:37 migration speed: 992.97 MB/s - downtime 137 ms
2019-02-04 16:44:37 migration status: completed
2019-02-04 16:44:37 ERROR: tunnel replied 'ERR: resume failed - VM 8120 not running' to command 'resume 8120'
2019-02-04 16:44:46 ERROR: migration finished with problems (duration 00:01:20)
TASK ERROR: migration problems
 
Destination:
Code:
Feb 04 16:05:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:05:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:05:01 sv18006 CRON[2553247]: pam_unix(cron:session): session opened for user root by (uid=0)
Feb 04 16:05:01 sv18006 CRON[2553248]: (root) CMD (command -v debian-sa1 > /dev/null && debian-sa1 1 1)
Feb 04 16:05:01 sv18006 CRON[2553247]: pam_unix(cron:session): session closed for user root
Feb 04 16:05:04 sv18006 sshd[2553280]: Connection closed by 10.15.70.16 port 38096 [preauth]
Feb 04 16:05:30 sv18006 pvedaemon[2543417]: <pistorit@pam> starting task UPID:sv18006:0026F613:2AC2DB6A:5C5854BA:qmigrate:4051:pistorit@pam:
Feb 04 16:05:32 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:05:34 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:05:59 sv18006 kernel: vmbr112: port 4(tap4051i0) entered disabled state
Feb 04 16:05:59 sv18006 kernel: vmbr197: port 2(tap4051i1) entered disabled state
Feb 04 16:06:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:06:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:06:01 sv18006 pvedaemon[2543417]: <pistorit@pam> end task UPID:sv18006:0026F613:2AC2DB6A:5C5854BA:qmigrate:4051:pistorit@pam: OK
Feb 04 16:06:04 sv18006 sshd[2553865]: Connection closed by 10.15.70.16 port 39588 [preauth]
Feb 04 16:06:26 sv18006 pvedaemon[2496742]: <pistorit@pam> starting task UPID:sv18006:0026F867:2AC2F178:5C5854F2:qmigrate:4053:pistorit@pam:
Feb 04 16:06:28 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:06:32 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:06:56 sv18006 kernel: vmbr112: port 7(tap4053i0) entered disabled state
Feb 04 16:06:56 sv18006 kernel: vmbr197: port 4(tap4053i1) entered disabled state
Feb 04 16:06:58 sv18006 pvedaemon[2496742]: <pistorit@pam> end task UPID:sv18006:0026F867:2AC2F178:5C5854F2:qmigrate:4053:pistorit@pam: OK
Feb 04 16:07:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:07:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:07:04 sv18006 sshd[2554478]: Connection closed by 10.15.70.16 port 41100 [preauth]
Feb 04 16:08:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:08:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:08:04 sv18006 sshd[2555052]: Connection closed by 10.15.70.16 port 42593 [preauth]
Feb 04 16:08:06 sv18006 pvedaemon[2496742]: worker exit
Feb 04 16:08:06 sv18006 pvedaemon[4434]: worker 2496742 finished
Feb 04 16:08:06 sv18006 pvedaemon[4434]: starting 1 worker(s)
Feb 04 16:08:06 sv18006 pvedaemon[4434]: worker 2555059 started
Feb 04 16:08:40 sv18006 sshd[2540207]: Received disconnect from 10.14.1.24 port 40074:11: disconnected by user
Feb 04 16:08:40 sv18006 sshd[2540207]: Disconnected from 10.14.1.24 port 40074
Feb 04 16:08:40 sv18006 sshd[2540207]: pam_unix(sshd:session): session closed for user root
Feb 04 16:08:40 sv18006 systemd-logind[1594]: Removed session 19459.
Feb 04 16:08:40 sv18006 systemd[1]: Stopping User Manager for UID 0...
Feb 04 16:08:40 sv18006 systemd[2540209]: Stopped target Default.
Feb 04 16:08:40 sv18006 systemd[2540209]: Stopped target Basic System.
Feb 04 16:08:40 sv18006 systemd[2540209]: Stopped target Sockets.
Feb 04 16:08:40 sv18006 systemd[2540209]: Closed GnuPG cryptographic agent (ssh-agent emulation).
Feb 04 16:08:40 sv18006 systemd[2540209]: Closed GnuPG cryptographic agent and passphrase cache.
Feb 04 16:08:40 sv18006 systemd[2540209]: Stopped target Paths.
Feb 04 16:08:40 sv18006 systemd[2540209]: Stopped target Timers.
Feb 04 16:08:40 sv18006 systemd[2540209]: Closed GnuPG cryptographic agent and passphrase cache (restricted).
Feb 04 16:08:40 sv18006 systemd[2540209]: Closed GnuPG cryptographic agent (access for web browsers).
Feb 04 16:08:40 sv18006 systemd[2540209]: Reached target Shutdown.
Feb 04 16:08:40 sv18006 systemd[2540209]: Starting Exit the Session...
Feb 04 16:08:40 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:08:40 sv18006 systemd[2540209]: Received SIGRTMIN+24 from PID 2555173 (kill).
Feb 04 16:08:40 sv18006 systemd[2540210]: pam_unix(systemd-user:session): session closed for user root
Feb 04 16:08:40 sv18006 systemd[1]: Stopped User Manager for UID 0.
Feb 04 16:08:40 sv18006 systemd[1]: Removed slice User Slice of root.
Feb 04 16:08:40 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:08:41 sv18006 sshd[2555194]: Accepted publickey for root from 10.14.1.24 port 42402 ssh2: RSA SHA256:GIzSn59VXXXXXXXX/fCeYc4
Feb 04 16:08:41 sv18006 sshd[2555194]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:08:41 sv18006 systemd[1]: Created slice User Slice of root.
Feb 04 16:08:41 sv18006 systemd[1]: Starting User Manager for UID 0...
Feb 04 16:08:41 sv18006 systemd-logind[1594]: New session 19464 of user root.
Feb 04 16:08:41 sv18006 systemd[1]: Started Session 19464 of user root.
Feb 04 16:08:41 sv18006 systemd[2555196]: pam_unix(systemd-user:session): session opened for user root by (uid=0)
Feb 04 16:08:41 sv18006 systemd[2555196]: Listening on GnuPG cryptographic agent and passphrase cache.
Feb 04 16:08:41 sv18006 systemd[2555196]: Listening on GnuPG cryptographic agent and passphrase cache (restricted).
Feb 04 16:08:41 sv18006 systemd[2555196]: Listening on GnuPG cryptographic agent (access for web browsers).
Feb 04 16:08:41 sv18006 systemd[2555196]: Listening on GnuPG cryptographic agent (ssh-agent emulation).
Feb 04 16:08:41 sv18006 systemd[2555196]: Reached target Sockets.
Feb 04 16:08:41 sv18006 systemd[2555196]: Reached target Paths.
Feb 04 16:08:41 sv18006 systemd[2555196]: Reached target Timers.
Feb 04 16:08:41 sv18006 systemd[2555196]: Reached target Basic System.
Feb 04 16:08:41 sv18006 systemd[2555196]: Reached target Default.
Feb 04 16:08:41 sv18006 systemd[2555196]: Startup finished in 24ms.
Feb 04 16:08:41 sv18006 systemd[1]: Started User Manager for UID 0.
Feb 04 16:09:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:09:00 sv18006 pvesr[2555487]: trying to acquire cfs lock 'file-replication_cfg' ...
Feb 04 16:09:01 sv18006 pvesr[2555487]: trying to acquire cfs lock 'file-replication_cfg' ...
Feb 04 16:09:02 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:09:04 sv18006 sshd[2555646]: Connection closed by 10.15.70.16 port 44088 [preauth]
Feb 04 16:10:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:10:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:10:04 sv18006 sshd[2556182]: Connection closed by 10.15.70.16 port 45585 [preauth]
Feb 04 16:11:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:11:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:11:04 sv18006 sshd[2556722]: Connection closed by 10.15.70.16 port 47081 [preauth]
Feb 04 16:12:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:12:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:12:04 sv18006 sshd[2557047]: Connection closed by 10.15.70.16 port 48585 [preauth]
Feb 04 16:13:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:13:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:13:04 sv18006 sshd[2557489]: Connection closed by 10.15.70.16 port 50085 [preauth]
Feb 04 16:13:11 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:14:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:14:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:14:04 sv18006 sshd[2558029]: Connection closed by 10.15.70.16 port 51581 [preauth]
Feb 04 16:14:42 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:15:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:15:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:15:01 sv18006 CRON[2558529]: pam_unix(cron:session): session opened for user root by (uid=0)
Feb 04 16:15:01 sv18006 CRON[2558530]: (root) CMD (command -v debian-sa1 > /dev/null && debian-sa1 1 1)
Feb 04 16:15:01 sv18006 CRON[2558529]: pam_unix(cron:session): session closed for user root
Feb 04 16:15:04 sv18006 sshd[2558562]: Connection closed by 10.15.70.16 port 53089 [preauth]
Feb 04 16:16:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:16:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:16:04 sv18006 sshd[2559097]: Connection closed by 10.15.70.16 port 54580 [preauth]
Feb 04 16:16:16 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:16:53 sv18006 rrdcached[3967]: flushing old values
Feb 04 16:16:53 sv18006 rrdcached[3967]: rotating journals
Feb 04 16:16:53 sv18006 rrdcached[3967]: started new journal /var/lib/rrdcached/journal/rrd.journal.1549293413.446348
Feb 04 16:16:53 sv18006 rrdcached[3967]: removing old journal /var/lib/rrdcached/journal/rrd.journal.1549286213.446381
Feb 04 16:17:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:17:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:17:01 sv18006 CRON[2560595]: pam_unix(cron:session): session opened for user root by (uid=0)
Feb 04 16:17:01 sv18006 CRON[2560596]: (root) CMD ( cd / && run-parts --report /etc/cron.hourly)
Feb 04 16:17:01 sv18006 CRON[2560595]: pam_unix(cron:session): session closed for user root
Feb 04 16:17:01 sv18006 puppet-agent[2559549]: xxx
Feb 04 16:17:01 sv18006 puppet-agent[2559549]: xxx '
Feb 04 16:17:04 sv18006 sshd[2560884]: Connection closed by 10.15.70.16 port 56082 [preauth]
Feb 04 16:17:04 sv18006 puppet-agent[2559549]: Finished catalog run in 3.77 seconds
Feb 04 16:18:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:18:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:18:04 sv18006 sshd[2561420]: Connection closed by 10.15.70.16 port 57594 [preauth]
Feb 04 16:19:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:19:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:19:04 sv18006 sshd[2561954]: Connection closed by 10.15.70.16 port 59088 [preauth]
Feb 04 16:19:08 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:20:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:20:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:20:04 sv18006 sshd[2562494]: Connection closed by 10.15.70.16 port 60590 [preauth]
Feb 04 16:21:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:21:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:21:04 sv18006 sshd[2563023]: Connection closed by 10.15.70.16 port 33857 [preauth]
Feb 04 16:22:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:22:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:22:04 sv18006 sshd[2563558]: Connection closed by 10.15.70.16 port 35348 [preauth]
Feb 04 16:23:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:23:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:23:04 sv18006 sshd[2564100]: Connection closed by 10.15.70.16 port 36854 [preauth]
Feb 04 16:23:42 sv18006 sshd[2555194]: Received disconnect from 10.14.1.24 port 42402:11: disconnected by user
Feb 04 16:23:42 sv18006 sshd[2555194]: Disconnected from 10.14.1.24 port 42402
Feb 04 16:23:42 sv18006 sshd[2555194]: pam_unix(sshd:session): session closed for user root
Feb 04 16:23:42 sv18006 systemd-logind[1594]: Removed session 19464.
Feb 04 16:23:42 sv18006 systemd[1]: Stopping User Manager for UID 0...
Feb 04 16:23:42 sv18006 systemd[2555196]: Stopped target Default.
Feb 04 16:23:42 sv18006 systemd[2555196]: Stopped target Basic System.
Feb 04 16:23:42 sv18006 systemd[2555196]: Stopped target Sockets.
Feb 04 16:23:42 sv18006 systemd[2555196]: Closed GnuPG cryptographic agent (ssh-agent emulation).
Feb 04 16:23:42 sv18006 systemd[2555196]: Closed GnuPG cryptographic agent and passphrase cache (restricted).
Feb 04 16:23:42 sv18006 systemd[2555196]: Closed GnuPG cryptographic agent (access for web browsers).
Feb 04 16:23:42 sv18006 systemd[2555196]: Closed GnuPG cryptographic agent and passphrase cache.
Feb 04 16:23:42 sv18006 systemd[2555196]: Reached target Shutdown.
Feb 04 16:23:42 sv18006 systemd[2555196]: Starting Exit the Session...
Feb 04 16:23:42 sv18006 systemd[2555196]: Stopped target Timers.
Feb 04 16:23:42 sv18006 systemd[2555196]: Stopped target Paths.
Feb 04 16:23:42 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:23:42 sv18006 systemd[2555196]: Received SIGRTMIN+24 from PID 2564513 (kill).
Feb 04 16:23:42 sv18006 systemd[2555197]: pam_unix(systemd-user:session): session closed for user root
Feb 04 16:23:42 sv18006 systemd[1]: Stopped User Manager for UID 0.
Feb 04 16:23:42 sv18006 systemd[1]: Removed slice User Slice of root.
Feb 04 16:23:42 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:23:43 sv18006 sshd[2564530]: Accepted publickey for root from 10.14.1.24 port 43824 ssh2: RSA SHA256:GIzSn59xxxxxxx
Feb 04 16:23:43 sv18006 sshd[2564530]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:23:43 sv18006 systemd[1]: Created slice User Slice of root.
Feb 04 16:23:43 sv18006 systemd[1]: Starting User Manager for UID 0...
Feb 04 16:23:43 sv18006 systemd-logind[1594]: New session 19468 of user root.
Feb 04 16:23:43 sv18006 systemd[1]: Started Session 19468 of user root.
Feb 04 16:23:43 sv18006 systemd[2564532]: pam_unix(systemd-user:session): session opened for user root by (uid=0)
Feb 04 16:23:43 sv18006 systemd[2564532]: Reached target Timers.
Feb 04 16:23:43 sv18006 systemd[2564532]: Listening on GnuPG cryptographic agent and passphrase cache (restricted).
Feb 04 16:23:43 sv18006 systemd[2564532]: Listening on GnuPG cryptographic agent and passphrase cache.
Feb 04 16:23:43 sv18006 systemd[2564532]: Reached target Paths.
Feb 04 16:23:43 sv18006 systemd[2564532]: Listening on GnuPG cryptographic agent (access for web browsers).
Feb 04 16:23:43 sv18006 systemd[2564532]: Listening on GnuPG cryptographic agent (ssh-agent emulation).
Feb 04 16:23:43 sv18006 systemd[2564532]: Reached target Sockets.
Feb 04 16:23:43 sv18006 systemd[2564532]: Reached target Basic System.
Feb 04 16:23:43 sv18006 systemd[2564532]: Reached target Default.
Feb 04 16:23:43 sv18006 systemd[2564532]: Startup finished in 25ms.
Feb 04 16:23:43 sv18006 systemd[1]: Started User Manager for UID 0.
Feb 04 16:24:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:24:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:24:04 sv18006 sshd[2564677]: Connection closed by 10.15.70.16 port 38349 [preauth]
Feb 04 16:25:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:25:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:25:01 sv18006 CRON[2565182]: pam_unix(cron:session): session opened for user root by (uid=0)
Feb 04 16:25:01 sv18006 CRON[2565183]: (root) CMD (command -v debian-sa1 > /dev/null && debian-sa1 1 1)
Feb 04 16:25:01 sv18006 CRON[2565182]: pam_unix(cron:session): session closed for user root
Feb 04 16:25:04 sv18006 sshd[2565215]: Connection closed by 10.15.70.16 port 39842 [preauth]
Feb 04 16:26:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:26:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:26:04 sv18006 sshd[2565753]: Connection closed by 10.15.70.16 port 41347 [preauth]
Feb 04 16:26:45 sv18006 pmxcfs[4003]: [dcdb] notice: data verification successful
Feb 04 16:27:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:27:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:27:04 sv18006 sshd[2566287]: Connection closed by 10.15.70.16 port 42861 [preauth]
Feb 04 16:28:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:28:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:28:04 sv18006 sshd[2566819]: Connection closed by 10.15.70.16 port 44356 [preauth]
Feb 04 16:28:11 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:29:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:29:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:29:04 sv18006 sshd[2567357]: Connection closed by 10.15.70.16 port 45870 [preauth]
Feb 04 16:29:43 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:30:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:30:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:30:04 sv18006 sshd[2567894]: Connection closed by 10.15.70.16 port 47380 [preauth]
Feb 04 16:30:08 sv18006 pveproxy[2504588]: worker exit
Feb 04 16:30:08 sv18006 pveproxy[6417]: worker 2504588 finished
Feb 04 16:30:08 sv18006 pveproxy[6417]: starting 1 worker(s)
Feb 04 16:30:08 sv18006 pveproxy[6417]: worker 2567896 started
Feb 04 16:31:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:31:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:31:04 sv18006 sshd[2568429]: Connection closed by 10.15.70.16 port 48859 [preauth]
Feb 04 16:31:17 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:32:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:32:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:32:04 sv18006 sshd[2568967]: Connection closed by 10.15.70.16 port 50357 [preauth]
Feb 04 16:33:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:33:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:33:04 sv18006 sshd[2569505]: Connection closed by 10.15.70.16 port 51852 [preauth]
Feb 04 16:34:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:34:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:34:04 sv18006 sshd[2570035]: Connection closed by 10.15.70.16 port 53353 [preauth]
Feb 04 16:34:08 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:35:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:35:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:35:01 sv18006 CRON[2570544]: pam_unix(cron:session): session opened for user root by (uid=0)
Feb 04 16:35:01 sv18006 CRON[2570545]: (root) CMD (command -v debian-sa1 > /dev/null && debian-sa1 1 1)
Feb 04 16:35:01 sv18006 CRON[2570544]: pam_unix(cron:session): session closed for user root
Feb 04 16:35:04 sv18006 sshd[2570582]: Connection closed by 10.15.70.16 port 54855 [preauth]
Feb 04 16:36:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:36:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:36:04 sv18006 sshd[2571117]: Connection closed by 10.15.70.16 port 56359 [preauth]
Feb 04 16:37:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:37:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:37:04 sv18006 sshd[2571649]: Connection closed by 10.15.70.16 port 57855 [preauth]
Feb 04 16:38:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:38:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:38:04 sv18006 sshd[2572189]: Connection closed by 10.15.70.16 port 59355 [preauth]
Feb 04 16:38:44 sv18006 sshd[2564530]: Received disconnect from 10.14.1.24 port 43824:11: disconnected by user
Feb 04 16:38:44 sv18006 sshd[2564530]: Disconnected from 10.14.1.24 port 43824
Feb 04 16:38:44 sv18006 sshd[2564530]: pam_unix(sshd:session): session closed for user root
Feb 04 16:38:44 sv18006 systemd-logind[1594]: Removed session 19468.
Feb 04 16:38:44 sv18006 systemd[1]: Stopping User Manager for UID 0...
Feb 04 16:38:44 sv18006 systemd[2564532]: Stopped target Default.
Feb 04 16:38:44 sv18006 systemd[2564532]: Stopped target Basic System.
Feb 04 16:38:44 sv18006 systemd[2564532]: Stopped target Paths.
Feb 04 16:38:44 sv18006 systemd[2564532]: Stopped target Timers.
Feb 04 16:38:44 sv18006 systemd[2564532]: Stopped target Sockets.
Feb 04 16:38:44 sv18006 systemd[2564532]: Closed GnuPG cryptographic agent (access for web browsers).
Feb 04 16:38:44 sv18006 systemd[2564532]: Closed GnuPG cryptographic agent and passphrase cache (restricted).
Feb 04 16:38:44 sv18006 systemd[2564532]: Closed GnuPG cryptographic agent and passphrase cache.
Feb 04 16:38:44 sv18006 systemd[2564532]: Closed GnuPG cryptographic agent (ssh-agent emulation).
Feb 04 16:38:44 sv18006 systemd[2564532]: Reached target Shutdown.
Feb 04 16:38:44 sv18006 systemd[2564532]: Starting Exit the Session...
Feb 04 16:38:44 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:38:44 sv18006 systemd[2564532]: Received SIGRTMIN+24 from PID 2572334 (kill).
Feb 04 16:38:44 sv18006 systemd[2564533]: pam_unix(systemd-user:session): session closed for user root
Feb 04 16:38:44 sv18006 systemd[1]: Stopped User Manager for UID 0.
Feb 04 16:38:44 sv18006 systemd[1]: Removed slice User Slice of root.
Feb 04 16:38:44 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:38:45 sv18006 sshd[2572361]: Accepted publickey for root from 10.14.1.24 port 45202 ssh2: RSA SHA256:GIzSn59V5JZURxxxxxxxx
Feb 04 16:38:45 sv18006 sshd[2572361]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:38:45 sv18006 systemd[1]: Created slice User Slice of root.
Feb 04 16:38:45 sv18006 systemd[1]: Starting User Manager for UID 0...
Feb 04 16:38:45 sv18006 systemd-logind[1594]: New session 19472 of user root.
Feb 04 16:38:45 sv18006 systemd[1]: Started Session 19472 of user root.
Feb 04 16:38:45 sv18006 systemd[2572363]: pam_unix(systemd-user:session): session opened for user root by (uid=0)
Feb 04 16:38:45 sv18006 systemd[2572363]: Listening on GnuPG cryptographic agent and passphrase cache (restricted).
Feb 04 16:38:45 sv18006 systemd[2572363]: Listening on GnuPG cryptographic agent (access for web browsers).
Feb 04 16:38:45 sv18006 systemd[2572363]: Reached target Paths.
Feb 04 16:38:45 sv18006 systemd[2572363]: Reached target Timers.
Feb 04 16:38:45 sv18006 systemd[2572363]: Listening on GnuPG cryptographic agent and passphrase cache.
Feb 04 16:38:45 sv18006 systemd[2572363]: Listening on GnuPG cryptographic agent (ssh-agent emulation).
Feb 04 16:38:45 sv18006 systemd[2572363]: Reached target Sockets.
Feb 04 16:38:45 sv18006 systemd[2572363]: Reached target Basic System.
Feb 04 16:38:45 sv18006 systemd[2572363]: Reached target Default.
Feb 04 16:38:45 sv18006 systemd[2572363]: Startup finished in 23ms.
Feb 04 16:38:45 sv18006 systemd[1]: Started User Manager for UID 0.
Feb 04 16:39:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:39:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:39:04 sv18006 sshd[2572780]: Connection closed by 10.15.70.16 port 60861 [preauth]
Feb 04 16:40:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:40:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:40:04 sv18006 sshd[2573310]: Connection closed by 10.15.70.16 port 34134 [preauth]
Feb 04 16:41:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:41:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:41:04 sv18006 sshd[2573850]: Connection closed by 10.15.70.16 port 35641 [preauth]
Feb 04 16:42:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:42:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:42:04 sv18006 sshd[2574387]: Connection closed by 10.15.70.16 port 37137 [preauth]
Feb 04 16:43:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:43:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:43:04 sv18006 sshd[2574921]: Connection closed by 10.15.70.16 port 38638 [preauth]
Feb 04 16:43:11 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:43:26 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:43:26 sv18006 sshd[2574997]: Accepted publickey for root from 10.14.1.28 port 37156 ssh2: RSA SHA256:4syiHux8xxxxxxx
Feb 04 16:43:26 sv18006 sshd[2574997]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:43:26 sv18006 systemd-logind[1594]: New session 19474 of user root.
Feb 04 16:43:26 sv18006 systemd[1]: Started Session 19474 of user root.
Feb 04 16:43:26 sv18006 sshd[2574997]: Received disconnect from 10.14.1.28 port 37156:11: disconnected by user
Feb 04 16:43:26 sv18006 sshd[2574997]: Disconnected from 10.14.1.28 port 37156
Feb 04 16:43:26 sv18006 sshd[2574997]: pam_unix(sshd:session): session closed for user root
Feb 04 16:43:26 sv18006 systemd-logind[1594]: Removed session 19474.
Feb 04 16:43:27 sv18006 sshd[2575018]: Accepted publickey for root from 192.168.24.28 port 59442 ssh2: RSA SHA256:4syiHux8uYb86xxxxxxxx
Feb 04 16:43:27 sv18006 sshd[2575018]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:43:27 sv18006 systemd-logind[1594]: New session 19475 of user root.
Feb 04 16:43:27 sv18006 systemd[1]: Started Session 19475 of user root.
Feb 04 16:43:27 sv18006 sshd[2575018]: Received disconnect from 192.168.24.28 port 59442:11: disconnected by user
Feb 04 16:43:27 sv18006 sshd[2575018]: Disconnected from 192.168.24.28 port 59442
Feb 04 16:43:27 sv18006 sshd[2575018]: pam_unix(sshd:session): session closed for user root
Feb 04 16:43:27 sv18006 systemd-logind[1594]: Removed session 19475.
Feb 04 16:43:27 sv18006 sshd[2575044]: Accepted publickey for root from 192.168.24.28 port 59444 ssh2: RSA SHA256:4syiHuxxxxxxxx
Feb 04 16:43:27 sv18006 sshd[2575044]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:43:27 sv18006 systemd-logind[1594]: New session 19476 of user root.
Feb 04 16:43:27 sv18006 systemd[1]: Started Session 19476 of user root.
Feb 04 16:43:27 sv18006 sshd[2575044]: Received disconnect from 192.168.24.28 port 59444:11: disconnected by user
Feb 04 16:43:27 sv18006 sshd[2575044]: Disconnected from 192.168.24.28 port 59444
Feb 04 16:43:27 sv18006 sshd[2575044]: pam_unix(sshd:session): session closed for user root
Feb 04 16:43:27 sv18006 systemd-logind[1594]: Removed session 19476.
Feb 04 16:43:27 sv18006 sshd[2575076]: Accepted publickey for root from 192.168.24.28 port 59446 ssh2: RSA SHA256:4syiHuxxxxxxx
Feb 04 16:43:27 sv18006 sshd[2575076]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:43:27 sv18006 systemd-logind[1594]: New session 19477 of user root.
Feb 04 16:43:27 sv18006 systemd[1]: Started Session 19477 of user root.
Feb 04 16:43:28 sv18006 qm[2575097]: <root@pam> starting task UPID:sv18006:00274AFE:2AC65547:5C585DA0:qmstart:8120:root@pam:
Feb 04 16:43:28 sv18006 qm[2575102]: start VM 8120: UPID:sv18006:00274AFE:2AC65547:5C585DA0:qmstart:8120:root@pam:
Feb 04 16:43:28 sv18006 systemd[1]: Started 8120.scope.
Feb 04 16:43:28 sv18006 systemd-udevd[2575118]: Could not generate persistent MAC address for tap8120i0: No such file or directory
Feb 04 16:43:28 sv18006 kernel: device tap8120i0 entered promiscuous mode
Feb 04 16:43:28 sv18006 kernel: vmbr112: port 4(tap8120i0) entered blocking state
Feb 04 16:43:28 sv18006 kernel: vmbr112: port 4(tap8120i0) entered disabled state
Feb 04 16:43:28 sv18006 kernel: vmbr112: port 4(tap8120i0) entered blocking state
Feb 04 16:43:28 sv18006 kernel: vmbr112: port 4(tap8120i0) entered forwarding state
Feb 04 16:43:28 sv18006 systemd-udevd[2575220]: Could not generate persistent MAC address for tap8120i1: No such file or directory
Feb 04 16:43:29 sv18006 kernel: device tap8120i1 entered promiscuous mode
Feb 04 16:43:29 sv18006 kernel: vmbr10: port 3(tap8120i1) entered blocking state
Feb 04 16:43:29 sv18006 kernel: vmbr10: port 3(tap8120i1) entered disabled state
Feb 04 16:43:29 sv18006 kernel: vmbr10: port 3(tap8120i1) entered blocking state
Feb 04 16:43:29 sv18006 kernel: vmbr10: port 3(tap8120i1) entered forwarding state
Feb 04 16:43:30 sv18006 qm[2575097]: <root@pam> end task UPID:sv18006:00274AFE:2AC65547:5C585DA0:qmstart:8120:root@pam: OK
Feb 04 16:43:30 sv18006 sshd[2575076]: Received disconnect from 192.168.24.28 port 59446:11: disconnected by user
Feb 04 16:43:30 sv18006 sshd[2575076]: Disconnected from 192.168.24.28 port 59446
Feb 04 16:43:30 sv18006 sshd[2575076]: pam_unix(sshd:session): session closed for user root
Feb 04 16:43:30 sv18006 systemd-logind[1594]: Removed session 19477.
Feb 04 16:43:30 sv18006 sshd[2575400]: Accepted publickey for root from 192.168.24.28 port 59448 ssh2: RSA SHA256:4syiHuxxxxxxxx
Feb 04 16:43:30 sv18006 sshd[2575400]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:43:30 sv18006 systemd-logind[1594]: New session 19478 of user root.
Feb 04 16:43:30 sv18006 systemd[1]: Started Session 19478 of user root.
Feb 04 16:44:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:44:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:44:04 sv18006 sshd[2575888]: Connection closed by 10.15.70.16 port 40139 [preauth]
Feb 04 16:44:36 sv18006 kernel: vmbr10: port 3(tap8120i1) entered disabled state
Feb 04 16:44:36 sv18006 kernel: vmbr10: port 3(tap8120i1) entered disabled state
Feb 04 16:44:37 sv18006 kernel: vmbr112: port 4(tap8120i0) entered disabled state
Feb 04 16:44:37 sv18006 kernel: vmbr112: port 4(tap8120i0) entered disabled state
Feb 04 16:44:37 sv18006 sshd[2575400]: Received disconnect from 192.168.24.28 port 59448:11: disconnected by user
Feb 04 16:44:37 sv18006 sshd[2575400]: Disconnected from 192.168.24.28 port 59448
Feb 04 16:44:37 sv18006 sshd[2575400]: pam_unix(sshd:session): session closed for user root
Feb 04 16:44:37 sv18006 systemd-logind[1594]: Removed session 19478.
Feb 04 16:44:43 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:44:45 sv18006 sshd[2576087]: Accepted publickey for root from 192.168.24.28 port 59572 ssh2: RSA SHA256:4syiHux8uYb8xxxxxxx
Feb 04 16:44:45 sv18006 sshd[2576087]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:44:45 sv18006 systemd-logind[1594]: New session 19479 of user root.
Feb 04 16:44:45 sv18006 systemd[1]: Started Session 19479 of user root.
Feb 04 16:44:46 sv18006 sshd[2576087]: Received disconnect from 192.168.24.28 port 59572:11: disconnected by user
Feb 04 16:44:46 sv18006 sshd[2576087]: Disconnected from 192.168.24.28 port 59572
Feb 04 16:44:46 sv18006 sshd[2576087]: pam_unix(sshd:session): session closed for user root
Feb 04 16:44:46 sv18006 systemd-logind[1594]: Removed session 19479.
Feb 04 16:44:46 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:45:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:45:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:45:01 sv18006 CRON[2576471]: pam_unix(cron:session): session opened for user root by (uid=0)
Feb 04 16:45:01 sv18006 CRON[2576472]: (root) CMD (command -v debian-sa1 > /dev/null && debian-sa1 1 1)
Feb 04 16:45:01 sv18006 CRON[2576471]: pam_unix(cron:session): session closed for user root
Feb 04 16:45:04 sv18006 sshd[2576509]: Connection closed by 10.15.70.16 port 41639 [preauth]
Feb 04 16:46:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:46:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:46:04 sv18006 sshd[2577044]: Connection closed by 10.15.70.16 port 43139 [preauth]
Feb 04 16:46:18 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:47:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:47:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:47:01 sv18006 puppet-agent[2577193]: xxxxx
Feb 04 16:47:01 sv18006 puppet-agent[2577193]: xxxxx
Feb 04 16:47:04 sv18006 puppet-agent[2577193]: Finished catalog run in 3.66 seconds
Feb 04 16:47:04 sv18006 sshd[2578830]: Connection closed by 10.15.70.16 port 44641 [preauth]
Feb 04 16:48:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:48:00 sv18006 systemd[1]: Started Proxmox VE replication runner.
Feb 04 16:48:04 sv18006 sshd[2579381]: Connection closed by 10.15.70.16 port 46139 [preauth]
Feb 04 16:48:33 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:48:33 sv18006 sshd[2579499]: Accepted publickey for root from 10.14.1.22 port 58998 ssh2: RSA SHA256:eb8EHxxxxxx
Feb 04 16:48:33 sv18006 sshd[2579499]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:48:33 sv18006 systemd-logind[1594]: New session 19481 of user root.
Feb 04 16:48:33 sv18006 systemd[1]: Started Session 19481 of user root.
Feb 04 16:48:34 sv18006 qm[2579513]: VM 8120 qmp command failed - VM 8120 not running
Feb 04 16:48:34 sv18006 sshd[2579499]: Received disconnect from 10.14.1.22 port 58998:11: disconnected by user
Feb 04 16:48:34 sv18006 sshd[2579499]: Disconnected from 10.14.1.22 port 58998
Feb 04 16:48:34 sv18006 sshd[2579499]: pam_unix(sshd:session): session closed for user root
Feb 04 16:48:34 sv18006 systemd-logind[1594]: Removed session 19481.
Feb 04 16:48:34 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:48:35 sv18006 pvedaemon[2514876]: <pistorit@pam> starting task UPID:sv18006:00275C49:2AC6CD3F:5C585ED3:qmstart:8120:pistorit@pam:
Feb 04 16:48:35 sv18006 pvedaemon[2579529]: start VM 8120: UPID:sv18006:00275C49:2AC6CD3F:5C585ED3:qmstart:8120:pistorit@pam:
Feb 04 16:48:35 sv18006 systemd[1]: Started 8120.scope.
Feb 04 16:48:35 sv18006 systemd-udevd[2579541]: Could not generate persistent MAC address for tap8120i0: No such file or directory
Feb 04 16:48:35 sv18006 kernel: device tap8120i0 entered promiscuous mode
Feb 04 16:48:35 sv18006 kernel: vmbr112: port 4(tap8120i0) entered blocking state
Feb 04 16:48:35 sv18006 kernel: vmbr112: port 4(tap8120i0) entered disabled state
Feb 04 16:48:35 sv18006 kernel: vmbr112: port 4(tap8120i0) entered blocking state
Feb 04 16:48:35 sv18006 kernel: vmbr112: port 4(tap8120i0) entered forwarding state
Feb 04 16:48:36 sv18006 systemd-udevd[2579642]: Could not generate persistent MAC address for tap8120i1: No such file or directory
Feb 04 16:48:36 sv18006 kernel: device tap8120i1 entered promiscuous mode
Feb 04 16:48:36 sv18006 kernel: vmbr10: port 3(tap8120i1) entered blocking state
Feb 04 16:48:36 sv18006 kernel: vmbr10: port 3(tap8120i1) entered disabled state
Feb 04 16:48:36 sv18006 kernel: vmbr10: port 3(tap8120i1) entered blocking state
Feb 04 16:48:36 sv18006 kernel: vmbr10: port 3(tap8120i1) entered forwarding state
Feb 04 16:48:37 sv18006 pvedaemon[2514876]: <pistorit@pam> end task UPID:sv18006:00275C49:2AC6CD3F:5C585ED3:qmstart:8120:pistorit@pam: OK
Feb 04 16:48:43 sv18006 pmxcfs[4003]: [status] notice: received log
Feb 04 16:48:43 sv18006 sshd[2579850]: Accepted publickey for root from 10.14.1.22 port 59028 ssh2: RSA SHA256:eb8EH3uxxxxxx
Feb 04 16:48:43 sv18006 sshd[2579850]: pam_unix(sshd:session): session opened for user root by (uid=0)
Feb 04 16:48:43 sv18006 systemd-logind[1594]: New session 19482 of user root.
Feb 04 16:48:43 sv18006 systemd[1]: Started Session 19482 of user root.
Feb 04 16:49:00 sv18006 systemd[1]: Starting Proxmox VE replication runner...
Feb 04 16:49:00 sv18006 systemd[1]: Started Proxmox VE replication runner.

We don't have any logs left.
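If the journal has merely rotated, the window around the failed migration can sometimes still be recovered; a minimal sketch, assuming persistent journald storage (the date is inferred from the timestamps above):

# List which boots/time ranges the on-disk journal still covers
journalctl --list-boots

# Pull everything around the failed start, if it is still retained
journalctl --since "2019-02-04 16:43:00" --until "2019-02-04 16:50:00"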
 
We have enabled nested virtualization, which seems to be a requirement for Proxmox and for migrating VMs.

No, it isn't. And nesting support is only now in the process of getting ready for migration at all, at least if nested guests are running, so I would be a bit cautious with that.
Did you change anything else?
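
For reference, whether nesting is actually active on a host can be checked like this (kvm_intel shown; on AMD hosts the module is kvm_amd):

# Prints "Y" (or "1" on older kernels) when nested virtualization is enabled
cat /sys/module/kvm_intel/parameters/nested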
 
This is still happening, and it actually defeats the whole purpose of having live migration.

2020-01-29 18:28:31 ERROR: tunnel replied 'ERR: resume failed - VM 180 qmp command 'query-status' failed - client closed connection' to command 'resume 180'
2020-01-29 18:28:40 ERROR: migration finished with problems (duration 00:00:17)
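
When resume fails like this, it is worth checking immediately on the target node whether the QEMU process survived at all; a quick sketch, using VMID 180 from the log above:

# Ask PVE for the VM state on the target node
qm status 180

# Check for a surviving KVM process (PVE starts VMs with "-id <vmid>")
ps aux | grep -- "-id 180"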
 
I've been experiencing this issue as well, on Virtual Environment 6.1-7.

I dropped the HA configuration and changed the NAS mapping from NFS to CIFS because of the NFS memory ballooning.

Migration appears to work okay now. However, it worked fine when we first configured the cluster with V6 and used NFS.
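
For comparison, the two kinds of storage definition in /etc/pve/storage.cfg look roughly like this (storage ID, server address, and share name are hypothetical):

nfs: nas-vmstore
        server 192.0.2.10
        export /export/vms
        content images

cifs: nas-vmstore
        server 192.0.2.10
        share vms
        username pveuser
        content images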
 