Hello guys
I have two Dell PE R620 servers, on which I have just set up a two-node cluster with DRBD, with an iSCSI quorum disk and fencing via iDRAC.
Before putting the cluster into production I tested a live migration of a KVM VM, but it failed with the following log:
Code:
Jul 03 10:23:21 starting migration of VM 100 to node 'pve10' (192.168.111.10)
Jul 03 10:23:28 copying disk images
Jul 03 10:23:28 starting VM 100 on remote node 'pve10'
Jul 03 10:23:30 starting ssh migration tunnel
Jul 03 10:23:31 starting online/live migration on localhost:60000
Jul 03 10:23:31 migrate_set_speed: 8589934592
Jul 03 10:23:31 migrate_set_downtime: 0.1
Jul 03 10:23:33 migration status: active (transferred 7879184, remaining 1072865280), total 1082855424)
Jul 03 10:23:35 migration status: active (transferred 11851920, remaining 1068900352), total 1082855424)
Jul 03 10:23:37 migration status: active (transferred 15824656, remaining 1065058304), total 1082855424)
Jul 03 10:23:39 migration status: active (transferred 21103029, remaining 1057812480), total 1082855424)
Jul 03 10:23:41 migration status: active (transferred 24490163, remaining 1053474816), total 1082855424)
Jul 03 10:23:43 migration status: active (transferred 26038061, remaining 1051836416), total 1082855424)
Jul 03 10:23:45 migration status: active (transferred 30008367, remaining 1045155840), total 1082855424)
Jul 03 10:23:47 migration status: active (transferred 34384671, remaining 1038438400), total 1082855424)
Jul 03 10:23:49 migration status: active (transferred 39216050, remaining 1029480448), total 1082855424)
Jul 03 10:23:51 migration status: active (transferred 40976905, remaining 1027629056), total 1082855424)
Jul 03 10:23:53 migration status: active (transferred 44884634, remaining 1021435904), total 1082855424)
Jul 03 10:23:55 migration status: active (transferred 48341383, remaining 1017475072), total 1082855424)
Jul 03 10:23:57 migration status: active (transferred 51974361, remaining 981680128), total 1082855424)
Jul 03 10:23:59 migration status: active (transferred 57621886, remaining 958672896), total 1082855424)
Jul 03 10:24:01 migration status: active (transferred 61761815, remaining 952012800), total 1082855424)
Jul 03 10:24:03 migration status: active (transferred 69037752, remaining 885493760), total 1082855424)
Jul 03 10:24:05 migration status: active (transferred 72187233, remaining 877916160), total 1082855424)
Jul 03 10:24:07 migration status: active (transferred 77411965, remaining 823758848), total 1082855424)
Jul 03 10:24:09 migration status: active (transferred 79164613, remaining 821932032), total 1082855424)
Jul 03 10:24:11 migration status: active (transferred 82744788, remaining 817840128), total 1082855424)
Jul 03 10:24:13 migration status: active (transferred 85150690, remaining 784629760), total 1082855424)
Jul 03 10:24:15 migration status: active (transferred 89179172, remaining 776237056), total 1082855424)
Jul 03 10:24:17 migration status: active (transferred 91932573, remaining 768311296), total 1082855424)
Jul 03 10:24:19 migration status: active (transferred 96193002, remaining 759926784), total 1082855424)
Jul 03 10:24:21 migration status: active (transferred 98569276, remaining 757678080), total 1082855424)
Jul 03 10:24:23 migration status: active (transferred 103236388, remaining 752791552), total 1082855424)
Jul 03 10:24:25 migration status: active (transferred 108898465, remaining 694710272), total 1082855424)
Jul 03 10:24:27 migration status: active (transferred 114244760, remaining 668282880), total 1082855424)
Jul 03 10:24:29 migration status: active (transferred 119805048, remaining 573186048), total 1082855424)
Jul 03 10:24:31 migration status: active (transferred 124545807, remaining 566886400), total 1082855424)
Jul 03 10:24:33 migration status: active (transferred 128381563, remaining 533319680), total 1082855424)
Jul 03 10:24:35 migration status: active (transferred 134476693, remaining 459948032), total 1082855424)
Jul 03 10:24:37 migration status: active (transferred 138308840, remaining 373964800), total 1082855424)
Jul 03 10:24:39 migration status: active (transferred 140431712, remaining 370663424), total 1082855424)
Jul 03 10:24:41 migration status: active (transferred 143649799, remaining 365162496), total 1082855424)
Jul 03 10:24:43 migration status: active (transferred 148773588, remaining 355794944), total 1082855424)
Jul 03 10:24:45 migration status: active (transferred 151687476, remaining 352755712), total 1082855424)
Jul 03 10:24:48 migration status: active (transferred 154601364, remaining 349847552), total 1082855424)
Jul 03 10:24:50 migration status: active (transferred 158812203, remaining 345485312), total 1082855424)
Jul 03 10:24:52 migration status: active (transferred 162329801, remaining 341917696), total 1082855424)
Jul 03 10:24:54 migration status: active (transferred 165465395, remaining 339009536), total 1082855424)
Jul 03 10:24:56 migration status: active (transferred 169252002, remaining 331796480), total 1082855424)
Write failed: Broken pipe
Jul 03 10:24:58 ERROR: online migrate failure - aborting
Jul 03 10:24:58 aborting phase 2 - cleanup resources
Jul 03 10:24:58 migrate_cancel
Jul 03 10:24:59 ERROR: migration finished with problems (duration 00:01:38)
TASK ERROR: migration problems
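What actually aborts the migration here is the "Write failed: Broken pipe" on the SSH migration tunnel. Before suspecting DRBD, it might be worth checking whether an SSH session between the nodes stays up under sustained load. A rough test could be (the IP is taken from the log above; adjust it to your migration network):

```shell
# Push ~2 GB of zeros through an SSH session to the target node and
# watch whether the same "Broken pipe" appears; keepalives make
# connection drops show up quickly.
dd if=/dev/zero bs=1M count=2048 | \
  ssh -o ServerAliveInterval=5 -o ServerAliveCountMax=3 \
      root@192.168.111.10 'cat > /dev/null'
# If this also aborts mid-transfer, the problem is likely in the
# network path (bonding, switch, MTU) rather than in qemu itself.
```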
Code:
root@pve10:/etc/pve# cat storage.cfg
dir: local
        path /var/lib/vz
        content images,iso,vztmpl,rootdir
        maxfiles 0

iscsi: quorum
        target iqn.2004-04.com.qnap:ts-469u:iscsi.qnap.e9a816
        portal 192.168.111.9
        content none

nfs: qnapNAS
        path /mnt/pve/qnapNAS
        server 192.168.111.7
        export /backupVM
        options vers=3
        content iso,backup
        maxfiles 2

lvm: DRBD
        vgname cup-drbd-vg
        content images
        shared
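Since the VM disk lives on a shared LVM storage, live migration also requires both nodes to see the same volume group and the VM's logical volume. A quick sanity check might be (hostname pve11 taken from the cluster.conf below):

```shell
# On pve10: the DRBD-backed VG and the VM's disk must be visible...
vgs cup-drbd-vg
lvs cup-drbd-vg | grep vm-100-disk-1
# ...and the same view from the other node:
ssh root@pve11 'vgs cup-drbd-vg; lvs cup-drbd-vg | grep vm-100-disk-1'
```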
pveversion output (the same on both nodes):
Code:
root@pve10:/etc/pve# pveversion -v
proxmox-ve-2.6.32: 3.4-157 (running kernel: 2.6.32-39-pve)
pve-manager: 3.4-6 (running version: 3.4-6/102d4547)
pve-kernel-2.6.32-39-pve: 2.6.32-157
pve-kernel-2.6.32-37-pve: 2.6.32-150
lvm2: 2.02.98-pve4
clvm: 2.02.98-pve4
corosync-pve: 1.4.7-1
openais-pve: 1.1.4-3
libqb0: 0.11.1-2
redhat-cluster-pve: 3.2.0-2
resource-agents-pve: 3.9.2-4
fence-agents-pve: 4.0.10-2
pve-cluster: 3.0-18
qemu-server: 3.4-6
pve-firmware: 1.1-4
libpve-common-perl: 3.0-24
libpve-access-control: 3.0-16
libpve-storage-perl: 3.0-33
pve-libspice-server1: 0.12.4-3
vncterm: 1.1-8
vzctl: 4.0-1pve6
vzprocps: 2.0.11-2
vzquota: 3.1-2
pve-qemu-kvm: 2.2-10
ksm-control-daemon: 1.1-1
glusterfs-client: 3.5.2-1
Code:
root@pve10:/etc/pve# qm config 100
bootdisk: ide0
cores: 1
ide0: DRBD:vm-100-disk-1,size=4G
ide2: none,media=cdrom
memory: 1024
name: corelinux
net0: e1000=6A:67:1D:58:C7:B3,bridge=vmbr0
numa: 0
ostype: l26
smbios1: uuid=67fc36a0-0c35-49c0-842e-6b9017e6a204
sockets: 1
Code:
root@pve10:/# service drbd status
drbd driver loaded OK; device status:
version: 8.3.13 (api:88/proto:86-96)
GIT-hash: 83ca112086600faacab2f157bc5a9324f7bd7f77 build by root@sighted, 2012-10-09 12:47:51
m:res cs ro ds p mounted fstype
1:r1 Connected Primary/Primary UpToDate/UpToDate C
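Live migration on DRBD needs the resource in dual-primary on both sides, so it may be worth confirming the state on each node right before migrating (resource name r1 from the status output above):

```shell
# Both sides must report Primary/Primary and UpToDate/UpToDate,
# otherwise the target node cannot open the disk read-write.
drbdadm role r1
drbdadm dstate r1
cat /proc/drbd
```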
Code:
<?xml version="1.0"?>
<cluster name="cupcluster" config_version="4">
  <cman keyfile="/var/lib/pve-cluster/corosync.authkey" expected_votes="3">
  </cman>
  <fencedevices>
    <fencedevice agent="fence_drac5" cmd_prompt="admin1->" ipaddr="192.168.111.20" login="fencing_user" name="pve10-drac" passwd="1q2w3e4r" secure="1"/>
    <fencedevice agent="fence_drac5" cmd_prompt="admin1->" ipaddr="192.168.111.21" login="fencing_user" name="pve11-drac" passwd="1q2w3e4r" secure="1"/>
  </fencedevices>
  <quorumd votes="1" allow_kill="0" interval="1" label="quorumcup1" tko="10">
    <heuristic interval="3" program="ping $GATEWAY -c1 -w1" score="1" tko="4"/>
    <heuristic interval="3" program="ip addr | grep bond0 | grep -q UP" score="2" tko="3"/>
  </quorumd>
  <totem token="30000"/>
  <clusternodes>
    <clusternode name="pve10" votes="1" nodeid="1">
      <fence>
        <method name="1">
          <device name="pve10-drac"/>
        </method>
      </fence>
    </clusternode>
    <clusternode name="pve11" votes="1" nodeid="2">
      <fence>
        <method name="1">
          <device name="pve11-drac"/>
        </method>
      </fence>
    </clusternode>
  </clusternodes>
</cluster>
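Not related to the migration failure itself, but since the cluster is about to go into production, the fence devices can be exercised without rebooting anything by asking the agent for a status (the arguments mirror the fencedevice entries above; run the same against 192.168.111.21 for pve11):

```shell
# Query the iDRAC through the fence agent, status action only
fence_drac5 -a 192.168.111.20 -l fencing_user -p 1q2w3e4r \
            -x -c 'admin1->' -o status
```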
I then tried an offline migration of the same VM, which completed without any problem.
Any ideas?