System disk name changed when I add a new drive

Ting

Member
Oct 19, 2021
Hi, I am setting up a 3-node cluster, each node on the same machine model (IBM x3650 M4). Here is my setup:

1. I have one onboard RAID card with two disks in JBOD mode. I installed Proxmox 7 on ZFS (RAID 1) on these two disks. After install, these two disks were labeled /dev/sda and /dev/sdb.
2. I have another RAID card plugged into a PCIe slot, flashed to IT mode. I plan to use this card to handle all the disks for VMs and the Ceph disks.

Here is my question: for some reason, when I insert a new disk on my IT-mode card, the system assigns /dev/sda to this disk, and my two system disks (ZFS RAID 1) become /dev/sdb and /dev/sdc.

Will this cause any future issues with my VM disks or Ceph disks? Please let me know. Thanks.
 
Disks switching between sda, sdb, sdc and so on is normal and can cause problems. That's why you shouldn't use those names and should use UUIDs as the disk identifier instead. UUIDs won't change.
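
As a rough sketch of how to check this (the exact names on your system will differ), you can list the persistent identifiers instead of relying on /dev/sdX:

ls -l /dev/disk/by-id/    # stable names built from the drive model and serial number
ls -l /dev/disk/by-uuid/  # filesystem UUIDs of formatted partitions
blkid                     # print the UUID and type of each partition

For the ZFS system mirror, ZFS identifies pool members by metadata written on the disks themselves, so a change in the sdX ordering normally doesn't break the rpool, but it's still worth running "zpool status" to confirm which identifiers your pool members are shown under.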