You're welcome.
Unfortunately I didn't manage to get it working on my old mobo, so if you have the same kind of mobo and the same error, you may just be unlucky like me.
By now I'm completely sure it's a bug in PVE.
@rearend-dev
Old mobo? Then also check one little thing:
There was a transition period when HDDs exceeded the capabilities of MBR (over 2 TB) but newer things like UEFI+GPT weren't common yet.
Gigabyte produced transitional mobos that could handle 2 TB+ HDDs using the so-called Hybrid EFI. It...
Hi @rearend-dev
I semi-fixed this problem by changing my hardware, and I'm still pretty sure it is caused by some UEFI boot issue.
Even with a new MB+CPU+RAM I can reproduce it with these steps:
1. Write the USB key with Rufus using either of two methods: GPT+DD or MBR+DD
2. Choose "UEFI...
Hi damon!
Thank you for the answer, I had already found info about the chooseleaf osd mode before. It will definitely make my single-node Ceph run properly.
But what to do about the MON's store.db? After an SSD failure it will be lost, and I can't add the old OSDs back without the old store.db. I tried to search about...
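(For the store.db question: the Ceph docs describe rebuilding a monitor's store from the surviving OSDs; a rough sketch of that flow, assuming default OSD paths and the usual admin keyring location:)

    # collect cluster maps from every local OSD into a fresh mon store
    ms=/tmp/monstore
    mkdir -p "$ms"
    for osd in /var/lib/ceph/osd/ceph-*; do
        ceph-objectstore-tool --data-path "$osd" \
            --op update-mon-db --mon-store-path "$ms"
    done
    # rebuild the store, pulling auth entries from the admin keyring
    ceph-monstore-tool "$ms" rebuild -- --keyring /etc/ceph/ceph.client.admin.keyring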
Hi damon!
Yeah, I thought about that too, but a NAS isn't interesting at all. I could spend a few hundred dollars on a new MB that can boot UEFI and use the first option.
The integrated RAID on my MB works terribly, but automatic backups are a good idea, I'll think about it.
Isn't there any option to...
Hello!
I'm going to set up my home lab to play with VMs and store various files on it.
General idea:
1. Single node built from spare old hardware
2. Four different HDDs + two SSDs
3. Ceph storage with journal on SSD or SSD cache-tier
4. The maximum storage fault tolerance I can reach with...
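For the single-node part, the usual trick is the chooseleaf setting (also mentioned in my other posts); a minimal ceph.conf sketch, assuming it is set before the cluster is created so the default CRUSH rule spreads replicas across OSDs instead of hosts:

    # /etc/ceph/ceph.conf fragment (sketch)
    [global]
        # 0 = chooseleaf at the "osd" level, so replicas may share one host
        osd_crush_chooseleaf_type = 0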
Hello again!
So, here's what I've figured out over these days:
1. My problem is definitely caused by my motherboard. It can't boot UEFI devices; my BIOS just doesn't support it.
2. I also can't install PVE on ZFS using the old BIOS boot method.
3. The problem is not in the USB device or in the SSD disks.
Ok...
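A quick way to confirm point 1, i.e. which mode the machine actually booted in (works from the installer's debug shell too):

    # the efi directory exists only when the kernel was started via UEFI
    if [ -d /sys/firmware/efi ]; then echo "UEFI boot"; else echo "legacy BIOS boot"; fi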
Okay, I tested my USB stick in my VMware home lab, here is the info I got:
1. VMware can't boot from USB using the BIOS boot method, so I tested only UEFI boot.
2. When I started the installation I noticed that PVE knows it booted from UEFI:
When I tried to install PVE yesterday on my bare metal...
They differ by about 8 GB, but I don't think that's a problem. I can't init RAID 0 even on one disk, and I'm sure ZFS can do this.
My mobo supports 2 SATA3 ports, but does so via an additional Intel chip. So I can say that SATA3 on my board (where the SSD disks are) is internal and external at...
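What I mean by ZFS handling it: a single-disk pool is effectively "RAID 0 on one disk". A sketch, with the pool and device names as placeholders:

    # create a single-vdev (striped) pool on one disk
    zpool create -f testpool /dev/sdX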
Nope, still unlucky, still can't get ZFS working. I thought it was a broken SSD, but I was wrong :(
I tried other disks, only one disk connected, different combinations of BIOS SATA/RAID/EFI settings, etc.
Tried to set up ZFS as intended by the devs, i.e. installing on 2 similar disks, same error every...
Haha, my small old Intel SSD just finally died :)
Somehow that SSD was the problem, even when I tried to install PVE on other disks...
Maybe the SATA controller was failing because of it, or something...
I will reconfigure my test server, install PVE on healthy disks and get right back to you with an answer!
Anyway...
So if the first node's OS disk fails, I just need the standard steps to recover it:
1. Remove it from the cluster
2. Replace the OS disk, reinstall the system, configure it
3. Add it back to the cluster and recreate the MON
4. The MON will be synced from the 2nd node (MON only) and I can run the old OSDs
Am I right?
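Roughly this command flow for steps 1-4, as far as I understand it (a sketch; the node name and IP are placeholders):

    # step 1: on a surviving cluster node, drop the dead member
    pvecm delnode node1
    # step 2: replace the OS disk, reinstall PVE and configure; then on node1:
    pvecm add 192.0.2.10           # IP of an existing cluster node
    # step 3: recreate the monitor; it syncs its store from the peer MON
    pveceph mon create
    # step 4: bring the untouched LVM-based OSDs back online
    ceph-volume lvm activate --all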
Sorry for necroposting, but what happens if we extend ccloyd's question a bit:
The first node is Ceph storage with 4 HDDs + an OS disk.
The second node is a monitor-only node with an OS disk and no storage.
Question: can we somehow recover the first node's OSDs if the first node's OS disk fails?
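One detail that matters here: each OSD keeps its data and metadata on its own disk, so after an OS reinstall they can usually be re-detected; a sketch, assuming ceph-volume/LVM-based OSDs:

    # list the OSDs that ceph-volume can find on the local disks
    ceph-volume lvm list
    # once the node is back in the cluster, reactivate them all
    ceph-volume lvm activate --all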
Of course :)
I know I look like a newbie, but I'm familiar with Linux and IT. I've just been trying to fix this approach for about a whole day, so I'm feeling a bit confused and writing my answers like a newbie.
No offence :)
UPD:
I just tried another physical disk for installation, same error :(
UPD2...
Tried making a new stick using Rufus this time; I can successfully make it bootable using the method from this post: https://forum.proxmox.com/threads/usb-install-better-options-needed.34279/post-169744
Tried to install in debug mode, I got this output:
PVE installed successfully and I still have...
Yes, I used ZFS RAID0 on the one (smallest) SSD first, installed Proxmox and tried to boot from that single SSD.
1. The install seems OK, but I cannot boot afterwards from the single SSD.
2. Looks like it's systemd-boot, because I didn't see any blue screens and "cat /proc/cmdline" also points at the EFI loader
3. Paste from...
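For point 2, the check I used (a sketch; needs a UEFI-booted system):

    # systemd-boot reports itself here when it is the active loader
    bootctl status
    # the kernel command line also shows which loader/entry was used
    cat /proc/cmdline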