Using an Amazon AWS S3 Bucket as backup storage

There is an open issue for this on our bug tracker [1], but no, currently this isn't supported. You may be able to use something like s3fs-fuse to add this functionality [2]. However, be aware that there have been reports of issues with this approach, and we do not officially support it, so there are no guarantees about the stability of such a setup.

[1]: https://bugzilla.proxmox.com/show_bug.cgi?id=2943
[2]: https://github.com/s3fs-fuse/s3fs-fuse
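
If you want to experiment anyway, the rough shape of such a setup would be something like the following (bucket name, mount point and datastore name are placeholders; again, no guarantees):

# Credentials for s3fs, format ACCESS_KEY:SECRET_KEY
echo 'YOUR_ACCESS_KEY:YOUR_SECRET_KEY' > /etc/passwd-s3fs
chmod 600 /etc/passwd-s3fs

# Mount the bucket via FUSE
mkdir -p /mnt/s3-backup
s3fs my-backup-bucket /mnt/s3-backup \
    -o passwd_file=/etc/passwd-s3fs \
    -o url=https://s3.amazonaws.com

# Point a PBS datastore at the mount. This is the unsupported part:
# PBS expects a POSIX filesystem, and S3 semantics differ.
proxmox-backup-manager datastore create s3store /mnt/s3-backup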
 
I manage IT for a small company (Proxmox PVE with 1TB NVMe disks + PBS with 2TB HDDs) and 2TB on iDrive S2 cloud storage.
Is StarWind VTL free to use? If it isn't, how much does it cost? There is no price online!

I can see the "VTL free" option and comparasion to commercial product but cannot understand the differences: local storage has a minus and cache has a 4 tape limit. Can you elaborate about this?

I'd like to understand how the VTL offloads to S3: can I do a local backup using the VTL and have it upload to S3-compatible storage later during the day?
Or is that the LOCAL STORAGE feature (not present in VTL free), so all backups would have to go directly to the cloud (and be slooooow)?

Thanks

Jáder
 
Hey @jader,

StarWind VTL has a free version. It is limited to 4 local tapes (this is what "cache" refers to), so you can have 4 local tapes at a time for backups, which can then be offloaded to the cloud. So yes, you can do a local backup and upload it to the cloud later.
The free version is managed via PowerShell (script samples can be found here: C:\Program Files\StarWind Software\StarWind\StarWindX\Samples\powershell) and can be monitored via the GUI.
This should help: https://www.starwindsoftware.com/re...tual-tape-library-deploy-using-cloud-storage/

Best regards,
Alex
 
There is a new project that attempts to act as a proxy to S3:

https://github.com/tizbac/pmoxs3backuproxy
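
If the proxy implements enough of the PBS API, you should in principle be able to attach it to PVE like a regular PBS storage. A rough sketch; the listen address, datastore name and credentials below are assumptions that depend on how you run the proxy (check the project README for the real options):

# Assumed: the proxy listens locally on the usual PBS port (8007)
# and exposes your bucket as a datastore named "mybucket".
pvesm add pbs s3proxy \
    --server 127.0.0.1 \
    --datastore mybucket \
    --username backup@pbs \
    --password 'secret' \
    --fingerprint 'AA:BB:...'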
 
I'm sorry if I misunderstood your project, but I'd like to know whether it allows us to use an S3 bucket as a target for Proxmox backups.
I'd like to use iDrive (an AWS competitor, much cheaper, also with S3 protocol access) to store replicas of backups done locally.
So the primary backup is done on-site and later exported/synced to iDrive using S3.
I'd like to understand how to do it ... and then I can create very detailed documentation for all the other users.
 
You could always move your backups from a local drive to AWS using rclone and run it as a cron job, if you are comfortable working from the command line. Rclone installs from a simple script: https://rclone.org/install/
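
For example, something along these lines (remote name, bucket and keys are placeholders):

# One-time setup: define an S3 remote
rclone config create s3remote s3 \
    provider AWS \
    access_key_id YOUR_KEY \
    secret_access_key YOUR_SECRET \
    region us-east-1

# /etc/cron.d/backup-sync: push the PVE dump directory every night at 02:00
0 2 * * * root rclone sync /var/lib/vz/dump s3remote:my-backup-bucket --log-file /var/log/rclone.log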
 
How could I copy the backups... should I upload the entire tree to iDrive?
I'm using rclone on small installations without PBS, but that way it does a FULL backup every day, which costs me the full backup size in iDrive storage every time.
If we could use PBS saving to S3, it would do incremental backups, so the first one would be 250GB (one client) and all the following ones would be less than 200MB (mostly doc and xls files!).
It would be nice to have a professional solution like PBS use the standard S3 protocol to create a powerful setup.
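
In the meantime, would something like this work? Keep PBS local (so backups stay deduplicated and incremental) and just sync the datastore directory to S3 afterwards. Paths and remote name are placeholders:

# PBS stores data as content-addressed chunks under .chunks/,
# so after the first seed a sync should only upload new chunks.
# Probably best run after the backup window, not during GC/verify.
rclone sync /mnt/datastore/pbs s3remote:pbs-replica --log-file /var/log/rclone-pbs.log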
 
I have used regular PVE backups and rcloned them to iDrive (a cheaper S3 provider) for years; it works fine and it's nice.
BUT only if your backup size is small, because this way all backups are FULL... so 250GB a day is a long upload EACH day. I'd prefer to use PBS and differential backups (a lot less to upload!)