Search results

  1. C

    Upgrade from 5.x to 6.x LXC containers will not start

    This is still a bug and not fixed. I have an entire node offline since it cannot start any LXC containers. Can someone look into this? It seems to be because I use ZFS and it's not mounted properly, I think.
  2. C

    Upgrade from 5.x to 6.x LXC containers will not start

    I don't know how to enter the container's directory. I just saw them mounted in df -h.
  3. C

    Upgrade from 5.x to 6.x LXC containers will not start

    Tried a few things and the results are below. Maybe the ZFS setup is broken? (See the mount-check sketch after this list.)
    root@prox2:~# pct mount 102
    mounting container failed
    cannot open directory //rpool/data/subvol-102-disk-1: No such file or directory
    root@prox2:~# pct mount 102^C
    root@prox2:~# ^C
    root@prox2:~# zfs list
    NAME...
  4. C

    LXC Container will not backup

    The Proxmox host does write log files there.
    root@prox1:~# cat /etc/pve/storage.cfg
    dir: local
            path /var/lib/vz
            content iso,images,rootdir,backup,vztmpl
    lvmthin: local-lvm
            thinpool data
            vgname pve
            content images,rootdir
    nfs: Synology
            export...
  5. C

    LXC Container will not backup

    I get the error message below when trying to back up an LXC container. It's a tiny one. I have 2 other LXC containers that back up fine, and 3 other KVM VMs that back up fine. Some guidance is appreciated. The permission denied error makes no sense, as it writes the log files there fine (see the NFS write-test sketch after this list). Task viewer...
  6. C

    Wiped 4.x installing 5.1 errors out

    Playing around, I was able to wipe and reinstall ZFS RAID 1 fine on the SSDs, but no matter what I did to the 4x4TB drives, be it zpool labelclear etc., or even formatting them on Windows... they won't work (see the disk-wipe sketch after this list). I was able to create a ZFS RAID 10 manually once I booted off the RAID 1 SSDs. I am...
  7. C

    [SOLVED] Replication with different target storage name

    Any update on this? I was just trying to do this and found this post saying it's not possible. :(
  8. C

    Wiped 4.x installing 5.1 errors out

    I'm trying a ZFS RAID 1 and it's working. I put it on two SSDs and will have the ZFS RAID 10 as a datastore only. From what I've seen on this forum, it's something to do with RAID 10 not wanting to install.
  9. C

    Wiped 4.x installing 5.1 errors out

    I am running the 5.1-3 installer, and it installs fine if I choose ext4.
  10. C

    Wiped 4.x installing 5.1 errors out

    Finally got back to it; 4.4 now fails to install for the same reason.
  11. C

    Wiped 4.x installing 5.1 errors out

    This server ran 4.4 fine. I killed it, and 10 minutes later tried to install 5.1 and got these errors. I'll try the 4.4 install tomorrow to verify it still works.
  12. C

    Wiped 4.x installing 5.1 errors out

    It doesn't look like it does; screenshot attached.
  13. C

    Wiped 4.x installing 5.1 errors out

    Rescue doesn't work; screenshot attached.
  14. C

    Wiped 4.x installing 5.1 errors out

    Figured it out: Debug mode as you boot the installer (2nd choice). Attached is install.log.
  15. C

    Wiped 4.x installing 5.1 errors out

    How do I get to the debug shell? It's still sitting on the same screen as in the screenshot. Ctrl+C etc. does nothing.
  16. C

    Wiped 4.x installing 5.1 errors out

    I set up a ZFS RAID 6 on 4x1TB drives on Proxmox 4.x; 2 of them were later replaced with 2TB drives. I have now wiped this server and am trying to install with 4x2TB drives (2 of those from the previous install). I am having issues: it errors out at the end with "unable to unmount zfs". I talked on the IRC...
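
A minimal mount-check sketch for the error quoted in result 3, assuming the container's root filesystem is the rpool/data/subvol-102-disk-1 dataset named there (the dataset name and VMID 102 are taken from that post; adjust for other containers):

    # Is the dataset present, and is it actually mounted?
    zfs list rpool/data/subvol-102-disk-1
    zfs get mounted,mountpoint rpool/data/subvol-102-disk-1
    # If "mounted" reports "no", mount the dataset (or everything in the pool):
    zfs mount rpool/data/subvol-102-disk-1
    zfs mount -a
    # Then retry the container:
    pct mount 102
    pct unmount 102
    pct start 102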
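An NFS write-test sketch for the permission denied backup error in result 5. The path below is an assumption: that the NFS storage shown in result 4 is named Synology and is mounted where Proxmox normally mounts NFS storages (/mnt/pve/<storage-id>), with vzdump writing into its dump/ subdirectory:

    # Assumed mount point for the "Synology" NFS storage; adjust to match the actual setup.
    ls -ld /mnt/pve/Synology/dump
    # Can root on the host actually create a file where vzdump writes backups?
    touch /mnt/pve/Synology/dump/permtest && rm /mnt/pve/Synology/dump/permtest
    # If the touch fails, check the export and root-squash options on the NFS server side.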
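A rough disk-wipe sketch for the reused drives in results 6 and 16, assuming leftover ZFS labels from the previous pool are what makes the installer refuse the disks. /dev/sdX is a placeholder, and every command below is destructive to that disk:

    # Clear ZFS labels from the whole disk and from any old partition that held a vdev:
    zpool labelclear -f /dev/sdX
    zpool labelclear -f /dev/sdX1
    # Wipe remaining filesystem/RAID signatures and the partition table:
    wipefs -a /dev/sdX
    sgdisk --zap-all /dev/sdX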