[SOLVED] Failed to create new lvm correctly HAYELP HAYELP!!!

Jack Freeman · New Member · May 28, 2019
Hiya folks, I done goofed again. I followed instructions from a website that looked like it knew what it was saying, but ended up with an unusable volume. It shows under Node → Disks → LVM as 'newdrive sdc1'. I gave it sdc1, which was correct as far as the instructions went (the drive is listed as sdc in the node's disk list), but 'newdrive' looks to have come straight from the instructions I lazily copy-pasted into the terminal without fully considering the consequences (noob). Anyone fancy giving an old fella a hand crossing the road?? It looks like the LV just needs renaming, but I cannot find instructions on how to achieve this.
edit* Forgot to mention: 4 x 146GB SAS disks in an HPE DL380 G7, running in RAID 5 through a P410i RAID card.
ta devs.
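For anyone landing here with the same question: LVM volume groups and logical volumes can be renamed with `vgrename` and `lvrename`. A minimal sketch, assuming the stray VG really is called `newdrive` as in the post above (the target names here are illustrative; substitute your own):

```shell
# List volume groups and logical volumes to confirm the stray name
vgs
lvs

# Rename a volume group: vgrename <old-vg-name> <new-vg-name>
vgrename newdrive mydata

# Rename a logical volume within a VG:
# lvrename <vg-name> <old-lv-name> <new-lv-name>
lvrename mydata newdrive data
```

Both commands only change LVM metadata; the data on the volume is untouched. Any /etc/pve/storage.cfg entry that references the old name must be updated to match afterwards.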
 
output of pvesm status:
[ code ]
root@hpe380:~# pvesm status
Name       Type     Status  Total     Used      Available  %
local      dir      active  30316484  9356096   19397356   30.86%
local-lvm  lvmthin  active  67620864  44190234  23430629   65.35%
[ /code ]
no sign of the elusive 'newdrive' volume here??
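Worth noting: `pvesm status` only lists storages defined in /etc/pve/storage.cfg, so a freshly created volume group will not show up there until it is registered. The VG should still be visible to LVM directly, and can be added as storage afterwards — a sketch, with the storage and VG names taken from the post (adjust to taste):

```shell
# Show all volume groups known to LVM, regardless of Proxmox config
vgs

# Show physical volumes, to confirm /dev/sdc1 was initialised
pvs

# Register an existing VG as Proxmox LVM storage
pvesm add lvm newdrive --vgname newdrive --content images,rootdir
```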
 
Ok, all day bangin' my head against this and here's where I am: the LVM is renamed and successfully added to PVE as 'data', but now my new 'empty' drive is showing @ 100%. Any suggestions??
 

Attachments

  • lvm fullreport.txt
Hello,

FYI, your attachment is very hard to read; additionally, it might be a better idea to wrap the output inside [ code ][ /code ] tags.

Cheers
 
edited file:
[ code ]
lvm> fullreport

Fmt VG UUID VG Attr VPerms Extendable Exported Partial AllocPol Clustered VSize VFree SYS ID System ID LockType VLockArgs Ext #Ext Free MaxLV MaxPV #PV #PV Missing #LV #SN Seq VG Tags VProfile #VMda #VMdaUse VMdaFree VMdaSize #VMdaCps

lvm2 JY8HZT-6Y6b-0XQl-6fYn-bejS-K2Ib-EakrgI pve wz--n- writeable extendable normal 392.07g 288.09g 4.00m 100371 73750 0 0 2 0 5 0 16 2 2 506.00k 1020.00k unmanaged

Fmt PV UUID DevSize PV Maj Min PMdaFree PMdaSize PExtVsn 1st PE PSize PFree Used Attr Allocatable Exported Missing PE Alloc PV Tags #PMda #PMdaUse BA Start BA Size PInUse Duplicate

lvm2 gql35V-odZb-F15t-ZFXi-OlOe-epcb-KnyaWd 118.74g /dev/sda3 8 3 506.00k 1020.00k 2 1.00m 118.74g 14.75g 103.99g a-- allocatable 30397 26621 1 1 0 0 used

lvm2 C3B0bq-795h-Ey4O-ZTxZ-PAnv-W30m-LP7itr 273.34g /dev/sdc1 8 33 506.00k 1020.00k 2 1.00m 273.34g 273.34g 0 a-- allocatable 69974 0 1 1 0 0 used

LV UUID LV LV Path DMPath Parent Attr Layout Role InitImgSyn ImgSynced Merging Converting AllocPol AllocLock FixMin MergeFailed SnapInvalid SkipAct WhenFull Active ActLocal ActRemote ActExcl Maj Min Rahead LSize MSize #Seg Origin Origin UUID OSize Ancestors FAncestors Descendants FDescendants Data% Snap% Meta% Cpy%Sync Cpy%Sync Mismatches SyncAction WBehind MinSync MaxSync Move Move UUID Convert Convert UUID Log Log UUID Data Data UUID Meta Meta UUID Pool Pool UUID LV Tags LProfile LLockArgs CTime RTime Host Modules Historical KMaj KMin KRahead LPerms Suspended LiveTable InactiveTable DevOpen CacheTotalBlocks CacheUsedBlocks CacheDirtyBlocks CacheReadHits CacheReadMisses CacheWriteHits CacheWriteMisses KCacheSettings KCachePolicy Health KDiscards CheckNeeded

kO8FKG-QHrQ-7GKn-Ht5B-jW8S-C1J7-IQdUpx data pve/data /dev/mapper/pve-data twi-aotz-- thin,pool private inherit unknown unknown queue active active locally active exclusively -1 -1 auto 64.49g 1.00g 1 0.00 0.00 0.03 [data_tdata] 3jDXIm-41ZT-ADvS-7sNa-sEOb-voND-2y1Qvq [data_tmeta] S7lgry-TYok-ewVH-E9Ld-VNCE-6QwW-T3tc8t 2019-05-27 17:46:18 +0100 proxmox thin-pool 253 4 128.00k writeable live table present open passdown

KgcgY4-PyRj-u27r-cj6C-I5v0-mQcS-kes5Yu root pve/root /dev/pve/root /dev/mapper/pve-root -wi-ao---- linear public inherit unknown unknown active active locally active exclusively -1 -1 auto 29.50g 1 2019-05-27 17:46:17 +0100 proxmox 253 1 128.00k writeable live table present open unknown

vODzM6-JQ4l-UCPM-j5SF-shW1-ENmz-zHd4ey swap pve/swap /dev/pve/swap /dev/mapper/pve-swap -wi-ao---- linear public inherit unknown unknown active active locally active exclusively -1 -1 auto 8.00g 1 2019-05-27 17:46:17 +0100 proxmox 253 0 128.00k writeable live table present open unknown

C3Yscr-EW0u-7ldu-MYVI-fMkj-YzPt-9Xk4fH vm-103-disk-0 pve/vm-103-disk-0 /dev/pve/vm-103-disk-0 /dev/mapper/pve-vm--103--disk--0 Vwi-a-tz-- thin,sparse public inherit unknown unknown active active locally active exclusively -1 -1 auto 60.00g 1 0.00 0.00 data kO8FKG-QHrQ-7GKn-Ht5B-jW8S-C1J7-IQdUpx 2019-05-29 23:17:56 +0100 hpe380 thin,thin-pool 253 6 128.00k writeable live table present unknown

Zyfod3-PRY5-8Nwm-d7rV-Dj1o-7ceu-LugzmN vm-120-disk-0 pve/vm-120-disk-0 /dev/pve/vm-120-disk-0 /dev/mapper/pve-vm--120--disk--0 Vwi-a-tz-- thin,sparse public inherit unknown unknown active active locally active exclusively -1 -1 auto 100.00g 1 0.00 0.00 data kO8FKG-QHrQ-7GKn-Ht5B-jW8S-C1J7-IQdUpx 2019-05-29 23:56:07 +0100 hpe380 thin,thin-pool 253 7 128.00k writeable live table present unknown

Start SSize PV UUID LV UUID

0 69974 C3B0bq-795h-Ey4O-ZTxZ-PAnv-W30m-LP7itr

0 2048 gql35V-odZb-F15t-ZFXi-OlOe-epcb-KnyaWd vODzM6-JQ4l-UCPM-j5SF-shW1-ENmz-zHd4ey

2048 7552 gql35V-odZb-F15t-ZFXi-OlOe-epcb-KnyaWd KgcgY4-PyRj-u27r-cj6C-I5v0-mQcS-kes5Yu

9600 16509 gql35V-odZb-F15t-ZFXi-OlOe-epcb-KnyaWd 3jDXIm-41ZT-ADvS-7sNa-sEOb-voND-2y1Qvq

26109 256 gql35V-odZb-F15t-ZFXi-OlOe-epcb-KnyaWd S7lgry-TYok-ewVH-E9Ld-VNCE-6QwW-T3tc8t

26365 256 gql35V-odZb-F15t-ZFXi-OlOe-epcb-KnyaWd tZzBf7-NlTu-Rap2-H3dp-cMAp-lpp8-cmtgnB

26621 3776 gql35V-odZb-F15t-ZFXi-OlOe-epcb-KnyaWd

Type #Str Stripe Region Chunk #Thins Discards CacheMode Zero TransId ThId Start Start SSize SSize Seg Tags PE Ranges LE Ranges Metadata LE Ranges Devices Metadata Devs Monitor CachePolicy CacheSettings LV UUID

thin 0 0 0 0 passdown zero 2 1 0 0 60.00g 15360 C3Yscr-EW0u-7ldu-MYVI-fMkj-YzPt-9Xk4fH

linear 1 0 0 0 unknown 0 0 29.50g 7552 /dev/sda3:2048-9599 /dev/sda3:2048-9599 /dev/sda3(2048) KgcgY4-PyRj-u27r-cj6C-I5v0-mQcS-kes5Yu

thin 0 0 0 0 passdown zero 3 2 0 0 100.00g 25600 Zyfod3-PRY5-8Nwm-d7rV-Dj1o-7ceu-LugzmN

thin-pool 1 0 0 64.00k 2 passdown zero 4 0 0 64.49g 16509 data_tdata:0-16508 [data_tdata]:0-16508 data_tdata(0) monitored kO8FKG-QHrQ-7GKn-Ht5B-jW8S-C1J7-IQdUpx

linear 1 0 0 0 unknown 0 0 8.00g 2048 /dev/sda3:0-2047 /dev/sda3:0-2047 /dev/sda3(0) vODzM6-JQ4l-UCPM-j5SF-shW1-ENmz-zHd4ey

[ /code ]
like so?
 
Thanks for your sage guidance, lhorace. I realise you may not know wtf you did to guide me, but it was your advice on adopting ZFS over basic LVM. I did so; it took a fat minute or ten, but I now have a 250G four-drive ZFS pool with a Win 10 VM running happily so far. Thanks fella, appreciated.
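For reference, a four-disk ZFS pool like the one described can be created with a single `zpool` command; raidz1 gives roughly the same capacity/redundancy trade-off as the original RAID 5. A sketch — the device names and pool name are illustrative, and on a P410i (which has no true HBA mode) each disk usually has to be exposed as its own single-drive logical volume first:

```shell
# Create a single-parity (raidz1) pool from four disks
zpool create -o ashift=12 tank raidz1 /dev/sda /dev/sdb /dev/sdc /dev/sdd

# Confirm pool health and capacity
zpool status tank
zpool list tank
```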
 