Calculate needed cache (s3-tech-preview)

ugf

New Member
Dec 8, 2023
Hi,

I am wondering if there is a way to calculate the needed cache size for proper S3 operation.
Is there also a way to first write into the cache and sync to S3 afterwards?
Pushing into S3 from local storage (similar to a sync job) would also be nice. I think this can be done by creating a sync job using the PBS itself as a target?

Right now, if S3 is slow, the backup is slow. And if S3 is unavailable, the backup just stops/fails. (PBS is close, S3 is far. Not ideal, but well.)
Also: from my investigation, the cache is using 100GB, or maybe 10% of the backup storage used?

This is already a nice feature and runs pretty stably! My home connection is just garbage, but PBS is not bothered and all backups are intact =)
 
Hi,

I am wondering if there is a way to calculate the needed cache size for proper S3 operation.
Is there also a way to first write into the cache and sync to S3 afterwards?
Pushing into S3 from local storage (similar to a sync job) would also be nice. I think this can be done by creating a sync job using the PBS itself as a target?
That is already possible: you can use a local sync job to sync your snapshots from the filesystem-backed datastore to the S3-backed one.
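A minimal sketch of such a local sync job via the CLI, assuming the exact flags and the datastore names `local-ds` and `s3-ds` are placeholders to adapt; leaving out a `--remote` should make the source a local datastore:

```shell
# Sketch: create a sync job that pulls snapshots from the local,
# filesystem-backed datastore "local-ds" into the S3-backed "s3-ds".
# Names and schedule are placeholders, not from the thread.
proxmox-backup-manager sync-job create local-to-s3 \
    --store s3-ds \
    --remote-store local-ds \
    --schedule "hourly"

# Inspect the configured job
proxmox-backup-manager sync-job list
```

The same job can also be created in the web UI under Datastore → Sync Jobs.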

Right now, if S3 is slow, the backup is slow. And if S3 is unavailable, the backup just stops/fails. (PBS is close, S3 is far. Not ideal, but well.)
Also: from my investigation, the cache is using 100GB, or maybe 10% of the backup storage used?
The cache will grow to whatever size it has at its disposal. Therefore, the recommendation is to use either a dedicated disk, a partition, or a ZFS dataset with a quota, see https://pbs.proxmox.com/docs/storage.html#datastores-with-s3-backend. 100GB should already be plenty.
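For the ZFS-dataset-with-quota variant, a short sketch; the pool name `rpool` and dataset path are placeholders:

```shell
# Sketch: dedicate a quota-limited ZFS dataset to the S3 datastore cache,
# so the cache cannot grow beyond 100GB. Names are placeholders.
zfs create -o quota=100G -o mountpoint=/mnt/pbs-s3-cache rpool/pbs-s3-cache

# Verify the quota and current usage
zfs get quota,used rpool/pbs-s3-cache
```

The datastore's local cache path can then be pointed at that mountpoint when creating the S3-backed datastore.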
 