Hey all,
First of all, I'm aware that backing up to NFS shares isn't the recommended approach. This setup is experimental, but I'd still like to share the issue.
We're running a PBS instance whose datastore sits on an NFS share exported by a Dell DD6400. Since PBS doesn't support the DD Boost protocol, we're using the appliance's NFS feature to store the backups.
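For context, the datastore directory lives directly on that NFS mount. Here's a quick sketch (the path is a placeholder, not our actual config) of how one can double-check that the datastore really resolves to the DD6400 export rather than to a local filesystem underneath:

# Sketch: confirm which mount the PBS datastore directory actually lives on.
# DATASTORE_PATH is a placeholder, adjust to the real datastore path.
import os

DATASTORE_PATH = "/mnt/dd6400/pbs"

def find_mount(path):
    # Walk up from the path until we hit a mount point.
    path = os.path.realpath(path)
    while not os.path.ismount(path):
        path = os.path.dirname(path)
    return path

def mount_line(mount_point):
    # Return the /proc/mounts entry for that mount point, if any.
    with open("/proc/mounts") as f:
        for line in f:
            fields = line.split()
            if len(fields) >= 2 and fields[1] == mount_point:
                return line.strip()
    return "(not found in /proc/mounts)"

mp = find_mount(DATASTORE_PATH)
print(DATASTORE_PATH, "is on mount point", mp)
print(mount_line(mp))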
We had a previous issue where the verify job failed, but we weren't able to determine what had broken the backups.
Today, after a week of successful backups that all verified OK, I launched garbage collection on the datastore.
As a result, all the backups on that datastore are now failing verification.
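One thing I want to rule out (this is my own understanding of how GC works, so take it with a grain of salt): as far as I know, GC marks still-referenced chunks by updating their atime and later deletes chunks whose atime is older than the cutoff, so a storage backend that silently ignores atime updates over NFS could produce exactly this kind of damage. A rough sketch of the check I have in mind (the mount path is a placeholder, and a re-mount or a second client would give a more trustworthy answer than the local attribute cache):

# Rough sketch, based on my assumption that PBS GC relies on atime updates
# to mark in-use chunks: check whether explicit atime updates stick on the NFS mount.
import os
import time

MOUNT_PATH = "/mnt/dd6400"  # placeholder for the DD6400 NFS mount
test_file = os.path.join(MOUNT_PATH, "atime_test.tmp")

with open(test_file, "w") as f:
    f.write("atime test")

# Pretend the file is a week-old chunk...
week_ago = time.time() - 7 * 24 * 3600
os.utime(test_file, (week_ago, week_ago))

# ...then "mark" it the way GC would, by bumping atime to now.
now = time.time()
mtime = os.stat(test_file).st_mtime
os.utime(test_file, (now, mtime))

atime_after = os.stat(test_file).st_atime
os.remove(test_file)

if atime_after >= now - 1:
    print("explicit atime update was honored")
else:
    print("atime update NOT honored, atime is still", time.ctime(atime_after))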
We're running this version of PBS:
proxmox-backup-server 2.3.3-1 running version: 2.3.3
Here's the start of the failed verify log:
2023-03-16T12:00:00+01:00: Starting datastore verify job 'DD6400:v-77eb83f2-d34f'
2023-03-16T12:00:00+01:00: task triggered by schedule '*:0/30'
2023-03-16T12:00:00+01:00: verify datastore DD6400
2023-03-16T12:00:00+01:00: found 2 groups
2023-03-16T12:00:00+01:00: verify group DD6400:vm/20403 (14 snapshots)
2023-03-16T12:00:00+01:00: verify DD6400:vm/20403/2023-03-15T17:30:01Z
2023-03-16T12:00:00+01:00: check qemu-server.conf.blob
2023-03-16T12:00:00+01:00: check drive-scsi0.img.fidx
2023-03-16T12:00:08+01:00: can't verify chunk, load failed - store 'DD6400', unable to load chunk 'da0cd4e5c9756da3f46a9a14b016e37db227f34623107d2a42fa067cb5ee1eb2' - No such file or directory (os error 2)
2023-03-16T12:00:08+01:00: can't verify chunk, load failed - store 'DD6400', unable to load chunk '756bfb77821e7f535767bb45d83454efaa9ac769f30e65004693a14c039b0fe3' - No such file or directory (os error 2)
2023-03-16T12:00:08+01:00: can't verify chunk, load failed - store 'DD6400', unable to load chunk '02755e0ab8b3b43fe3c51a9f9a5f74d3b6d654a2abc8180e56b8a47edf6bc482' - No such file or directory (os error 2)
2023-03-16T12:00:08+01:00: can't verify chunk, load failed - store 'DD6400', unable to load chunk '5ec9b9c7411deb2de1f2a6bf102105c748163f9fb78c9f156fd18e47f55227f1' - No such file or directory (os error 2)
2023-03-16T12:00:08+01:00: can't verify chunk, load failed - store 'DD6400', unable to load chunk '4e8c0e02aa173841cad1044836ee2138158fd242260295c318312ed0e16f9d02' - No such file or directory (os error 2)
2023-03-16T12:00:08+01:00: can't verify chunk, load failed - store 'DD6400', unable to load chunk '0dd918e52743b70536de78889e7e414174574dfb58e343b9844536bde2bb662e' - No such file or directory (os error 2)
Somehow, either the GC run or the DD6400 itself removed chunk files that these backups still reference.
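To confirm it's really the chunk files that are gone (and not a transient permission or mount hiccup at verify time), I put together a quick check that pulls the digests out of a saved copy of the verify log and looks for the matching files under the datastore's .chunks directory. Paths are placeholders, and the .chunks/<first 4 hex chars>/<digest> layout is my understanding of how PBS lays out its chunk store:

# Sketch: map the digests from the verify log to on-disk chunk paths and
# report which files are actually missing. Layout assumption:
# <datastore>/.chunks/<first 4 hex chars of digest>/<full digest>
import os
import re

DATASTORE_PATH = "/mnt/dd6400/pbs"  # placeholder datastore path
VERIFY_LOG = "verify.log"           # saved copy of the failed verify task log

digest_re = re.compile(r"unable to load chunk '([0-9a-f]{64})'")

missing, present = [], []
with open(VERIFY_LOG) as f:
    for line in f:
        m = digest_re.search(line)
        if not m:
            continue
        digest = m.group(1)
        chunk_path = os.path.join(DATASTORE_PATH, ".chunks", digest[:4], digest)
        (present if os.path.exists(chunk_path) else missing).append(chunk_path)

print(len(present), "chunks still present,", len(missing), "chunks missing")
for path in missing[:20]:
    print("missing:", path)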
As always, I can provide additional information if needed.
Cheers,
Taledo