SAN shared directory (Proxmox v2.0 rc1)

Jina Kumar

New Member
Feb 22, 2012
Hello,

I have some concerns about my installation.

I’ve installed 2 servers with Proxmox v2.0 rc1 and now I am trying to create a two-node cluster with high availability (I’ve read the documentation for that).
Each server has its own system partition and they both share a data partition located on the SAN (connected with fibre channel).
On that shared partition I’ve created a shared directory with the Central Web-based Management.
My goal is to set up fencing between the servers.

Is that correct or have you got a better idea?

Any ideas are welcome.
Thank you
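For the fencing part: in PVE 2.0 HA, fence devices are defined in /etc/pve/cluster.conf (cman/rgmanager). Below is a minimal sketch for a two-node cluster - the node names, IP addresses and the fence_ipmilan agent are only placeholders, the agent and its parameters depend entirely on your hardware, and changes are normally made in a copy (/etc/pve/cluster.conf.new) and then activated from the GUI, as described in the Proxmox HA documentation.

<?xml version="1.0"?>
<cluster name="mycluster" config_version="2">
  <!-- two_node/expected_votes let a 2-node cman cluster keep quorum when one node is down -->
  <cman keyfile="/var/lib/pve-cluster/corosync.authkey" two_node="1" expected_votes="1"/>
  <!-- example fence devices: fence_ipmilan is only one possible agent -->
  <fencedevices>
    <fencedevice agent="fence_ipmilan" name="fence-node1" ipaddr="192.168.0.101" login="admin" passwd="secret"/>
    <fencedevice agent="fence_ipmilan" name="fence-node2" ipaddr="192.168.0.102" login="admin" passwd="secret"/>
  </fencedevices>
  <clusternodes>
    <clusternode name="node1" nodeid="1" votes="1">
      <fence>
        <method name="1">
          <device name="fence-node1"/>
        </method>
      </fence>
    </clusternode>
    <clusternode name="node2" nodeid="2" votes="1">
      <fence>
        <method name="1">
          <device name="fence-node2"/>
        </method>
      </fence>
    </clusternode>
  </clusternodes>
  <rm>
    <!-- guests listed here are managed by the HA resource agent; vmid 100 is just an example -->
    <pvevm autostart="1" vmid="100"/>
  </rm>
</cluster>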
 
Hi,
Not really. Normally you present a disk (LUN) from the SAN (iSCSI/FC) to the nodes, create a volume group on that disk, and add it as LVM storage in the PVE GUI. Your VM disks are then created as logical volumes in that volume group.

If you put a plain filesystem on the shared disk (I don't mean NFS shares), you need a cluster filesystem to mount it on several nodes at the same time - otherwise you will get data loss!

Udo
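A rough sketch of the steps Udo describes, assuming the FC LUN shows up on both nodes as /dev/sdb (check your own device or multipath name first; all names here are only examples):

pvcreate /dev/sdb          # initialise the shared LUN as an LVM physical volume (run once, on one node)
vgcreate san_vg /dev/sdb   # create the volume group on it

Then add the volume group as LVM storage with the "shared" flag under Datacenter -> Storage -> Add -> LVM; the resulting entry in /etc/pve/storage.cfg looks roughly like this:

lvm: san-lvm
        vgname san_vg
        content images
        shared 1

VM disks on that storage are created as logical volumes (vm-<vmid>-disk-1) and should be visible with lvs on both nodes.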
 
How can I do this?

Why is a shared disk with LVM storage not OK?

And how to do this depends on your storage.

If you really want a "real" filesystem, use NFS, or play with a cluster filesystem (I don't have experience with cluster filesystems).
I prefer LVM storage; I have used it with FC and several nodes for years, ever since shared storage became available in PVE.

Udo
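If you go the NFS route instead, the share is also added once at datacenter level (Datacenter -> Storage -> Add -> NFS) and is then usable from every node. A sketch of the resulting /etc/pve/storage.cfg entry - server, export and storage name are placeholders:

nfs: nfs-store
        server 192.168.0.50
        export /export/pve
        path /mnt/pve/nfs-store
        content images,rootdir
        options vers=3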
 
It doesn't work with the HA cluster.

KVM:
task started by HA resource agent
TASK ERROR: volume 'SAN:116/vm-116-disk-1.raw' does not exist

OpenVZ:
task started by HA resource agent
Starting container ...
Container private area /var/lib/vz/private/111 does not exist
TASK ERROR: command 'vzctl start 111' failed: exit code 43
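Both errors mean the node where the HA resource agent tries to start the guest cannot see the guest's storage: the raw image on the 'SAN' directory storage in the KVM case, and /var/lib/vz/private/111 (the default local directory storage) in the OpenVZ case. HA can only start a guest on another node if that node sees the guest's volumes, so a first check, run on both nodes, could be:

pvesm status     # is the storage used by the guest listed and active on this node?
pvesm list SAN   # does vm-116-disk-1.raw show up here? ('SAN' is the storage ID from the error)
qm config 116    # which storage does the VM disk actually live on?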
 