[SOLVED] Unable to Create or Restore LXC Containers in Proxmox with Ceph Storage: TASK ERROR with mkfs.ext4

w453y

New Member
I recently migrated to a new Proxmox cluster that uses Ceph as the storage backend. However, I'm hitting errors both when restoring containers from backups and when creating new containers directly on the new cluster.

These are the logs while restoring the container:

Code:
recovering backed-up configuration from 'pbs:backup/ct/119/2024-11-22T19:49:30Z'
/dev/rbd0
The file /dev/rbd-pve/88a04bc8-8bc0-4122-9e8a-45b986d3f5a4/IRSI-Ceph-Pool/vm-105-disk-0 does not exist and no size was specified.
Removing image: 1% complete...
Removing image: 2% complete...
[...]
Removing image: 99% complete...
Removing image: 100% complete...done.
TASK ERROR: unable to restore CT 105 - command 'mkfs.ext4 -O mmp -E 'root_owner=100000:100000' /dev/rbd-pve/88a04bc8-8bc0-4122-9e8a-45b986d3f5a4/IRSI-Ceph-Pool/vm-105-disk-0' failed: exit code 1

These are the logs while creating the container:

Code:
/dev/rbd0
The file /dev/rbd-pve/88a04bc8-8bc0-4122-9e8a-45b986d3f5a4/IRSI-Ceph-Pool/vm-108-disk-0 does not exist and no size was specified.
Removing image: 1% complete...
Removing image: 2% complete...
[...]
Removing image: 99% complete...
Removing image: 100% complete...done.
TASK ERROR: unable to create CT 108 - command 'mkfs.ext4 -O mmp -E 'root_owner=100000:100000' /dev/rbd-pve/88a04bc8-8bc0-4122-9e8a-45b986d3f5a4/IRSI-Ceph-Pool/vm-108-disk-0' failed: exit code 1

But Ceph itself seems healthy; I don't see any errors there.

[Screenshot_20241217_144931.png: Ceph status dashboard, no errors shown]


Why is this error occurring, and how can I fix it?
 
I don't use CEPH, but is the KRBD Enable unchecked in the storage configuration?
If it is, try enabling it & starting again.
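
If you want to check from the shell first: the storage definitions live in /etc/pve/storage.cfg on any node, and if I remember right an RBD entry without a `krbd 1` line has it disabled.

Code:
# show the RBD storage definitions; look for a "krbd 1" line
grep -A 6 '^rbd:' /etc/pve/storage.cfg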

Good luck.
 
Hey, thanks for the reply! I just noticed I can restore backups and create containers on all nodes except node1 and node4 (I have a 5-node cluster). BTW, where can I find the `KRBD Enable` option?
 
In the GUI go to Datacenter, Storage & choose the (correct) rbd & press Edit.
While you are there you can also check which Nodes have access.

[Untitled.png: RBD storage Edit dialog showing the KRBD checkbox and node access list]
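
If you prefer the CLI, the same flag can be toggled with pvesm (the storage ID below is only a guess from your logs; use whatever `pvesm status` lists for your RBD storage):

Code:
# enable KRBD on the RBD storage (storage ID assumed; check "pvesm status")
pvesm set IRSI-Ceph-Pool --krbd 1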
 
Thanks!!! That solved it. None of the nodes were restricted, but KRBD was unchecked; enabling it fixed the issue, and now I can create/restore containers on all nodes.

Btw, I'm curious to know: what exactly went wrong to cause the issue?
 
For a better understanding, you will have to research the topic of krbd vs librbd in Ceph clients. (I'm not going down that rabbit-hole!)
If you want a Proxmox perspective on the subject, maybe start here.
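
The extremely short version, as far as I understand it: with librbd, QEMU talks to the Ceph cluster from userspace and no block device ever appears on the host, which is fine for VMs; with krbd, the kernel maps the image to a real /dev/rbdX device. Container setup runs host-side tools like mkfs.ext4 against that device path (as in your failing task log), so it needs the kernel mapping. Roughly (a sketch, with the image name taken from your logs):

Code:
# krbd: the kernel maps the image to a host block device ...
rbd map IRSI-Ceph-Pool/vm-105-disk-0    # -> /dev/rbd0
# ... which host-side tools such as mkfs can then use
mkfs.ext4 /dev/rbd0
# librbd has no such device; QEMU opens the image from userspace instead, e.g.
#   -drive file=rbd:IRSI-Ceph-Pool/vm-105-disk-0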

Since it appears you have solved your issue, maybe mark this thread as solved. At the top of the thread, choose the Edit thread button, then from the (no prefix) dropdown choose Solved.
 
