3PAR FC-connected cluster issues

mkiehl
Renowned Member
When adding a volume to our Proxmox cluster (5.1.41) we encountered this error:

[screenshot of the error: upload_2018-4-21_18-11-9.png]

The lvm.conf is as follows:

<code>
# Do not scan ZFS zvols (to avoid problems on ZFS zvols snapshots)
global_filter = [ "r|/dev/zd.*|", "r|/dev/mapper/pve-.*|" ]
#global_filter = [ "a|/dev/mapper/mpath.*|", "r|/dev/.*|", "a|/dev/mapper/pve-.*|" ]
</code>

We also set use_lvmetad = 1.
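The commented-out line is a stricter variant: accept only the multipath maps and the local pve volumes, then reject everything else. Note that LVM uses the first matching pattern, so the accept entries must come before the catch-all reject. Spelled out with the 3par alias defined in the multipath.conf below (a sketch, the patterns may need adjusting to your naming):

<code>
# lvm.conf sketch: accept the multipath alias and the local pve LVs,
# reject everything else (LVM stops at the first matching pattern)
global_filter = [ "a|/dev/mapper/3par|", "a|/dev/mapper/pve-.*|", "r|.*|" ]
</code>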

The multipath.conf is:

<code>
defaults {
    polling_interval    10
    user_friendly_names no
    find_multipaths     yes
}
devices {
    device {
        vendor               "3PARdata"
        product              "VV"
        path_grouping_policy multibus
        path_selector        "round-robin 0"
        path_checker         tur
        features             "0"
        hardware_handler     "0"
        failback             immediate
        rr_weight            uniform
        no_path_retry        18
        rr_min_io_rq         1
        fast_io_fail_tmo     10
        dev_loss_tmo         14
    }
}
blacklist {
    devnode "^(ram|raw|loop|fd|md|dm-|sr|scd|st)[0-9]*"
    devnode "^(td|hd)[a-z]"
    devnode "^dcssblk[0-9]*"
    devnode "^cciss!c[0-9]d[0-9]*"
    device {
        vendor  "DGC"
        product "LUNZ"
    }
    device {
        vendor  "EMC"
        product "LUNZ"
    }
    device {
        vendor  "IBM"
        product "Universal Xport"
    }
    device {
        vendor  "IBM"
        product "S/390.*"
    }
    device {
        vendor  "DELL"
        product "Universal Xport"
    }
    device {
        vendor  "SGI"
        product "Universal Xport"
    }
    device {
        vendor  "STK"
        product "Universal Xport"
    }
    device {
        vendor  "SUN"
        product "Universal Xport"
    }
    device {
        vendor  "(NETAPP|LSI|ENGENIO)"
        product "Universal Xport"
    }
}
blacklist_exceptions {
    wwid "360002ac00000000000000006000206c3"
}
multipaths {
    multipath {
        wwid  "360002ac00000000000000006000206c3"
        alias 3par
    }
}
</code>
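For completeness, this is how the configuration can be reloaded and verified afterwards (standard multipath-tools commands; depending on the version, restarting the service may be needed instead of reconfigure):

<code>
multipathd reconfigure      # or: systemctl restart multipathd
multipathd show config      # print the effective, merged configuration
multipath -ll               # list the resulting maps and their paths
</code>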

We continue to get the errors listed above and cannot seem to fix this. It is almost as if multipath_component_detection is not working. Has anyone seen this issue?
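For reference, multipath_component_detection is an option in the devices section of lvm.conf, and lvmconfig (shipped with LVM2) prints the effective value:

<code>
# print the effective value (1 = enabled)
lvmconfig devices/multipath_component_detection

# it can also be pinned explicitly in /etc/lvm/lvm.conf:
# devices {
#     multipath_component_detection = 1
# }
</code>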


Best regards

Attachments

  • upload_2018-4-21_18-9-17.png
  • upload_2018-4-21_18-10-13.png

Output of multipath -ll:
<code>
mpathb (360002ac00000000000000016000206c3) dm-2 3PARdata,VV
size=1.0T features='2 queue_if_no_path retain_attached_hw_handler' hwhandler='0' wp=rw
`-+- policy='round-robin 0' prio=50 status=active
|- 3:0:0:0 sdb 8:16 active ready running
|- 5:0:0:0 sdd 8:48 active ready running
|- 3:0:1:0 sdc 8:32 active ready running
`- 5:0:1:0 sde 8:64 active ready running
</code>
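A quick way to check whether LVM is still grabbing the component paths (sdb through sde) instead of the dm device is to list the physical volumes (plain LVM2, nothing Proxmox-specific):

<code>
# if filtering / component detection works, the PV should show up only on
# the multipath device (the 3par alias), never on sdb/sdc/sdd/sde
pvs -o pv_name,vg_name
# "Found duplicate PV ..." warnings mean the raw paths are still scanned
</code>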