hi,
how do you guys handle backup space at standard hosting providers?
Typically you only get an FTP share to back up to, and there is not enough space on the running host itself to buffer the backups locally.
I wanted to write the vzdump output directly to the FTP share.
My approach was to use curlftpfs, but it buffers everything in memory, so that is not an option.
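One workaround instead of mounting the share: if your vzdump version supports `--stdout`, you can pipe the dump straight into curl, so nothing is staged on the local disk at all. A minimal sketch, assuming `--stdout` works for your setup (VMID 101 and the FTP user, password, and host are placeholders):

```shell
# Stream the dump directly to the FTP share; no local staging file.
# VMID 101 and the FTP credentials/host are placeholders.
vzdump 101 --stdout --compress gzip \
  | curl -sS -T - "ftp://USER:PASS@HOST/backup/vzdump-101.tar.gz"
```

curl reads from stdin when given `-T -`, so the upload is streamed as vzdump produces the data.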
Do you have any suggestions?
Edit: maybe you could add a --max-size parameter that uses split to get e.g. 50MB files that can be copied separately?
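The split idea can already be approximated in a pipeline today, without a new vzdump parameter: GNU split's `--filter` option runs a command once per chunk, feeding the chunk on stdin and exporting the chunk's name as `$FILE`. A sketch under the same assumptions as above (`--stdout` support; VMID and FTP credentials are placeholders):

```shell
# Upload the dump as separate 50MB parts instead of one large file.
# GNU split invokes the --filter command once per chunk, with the chunk
# on stdin and its name in $FILE.
# VMID 101 and the FTP credentials/host are placeholders.
vzdump 101 --stdout --compress gzip \
  | split -b 50M \
      --filter='curl -sS -T - "ftp://USER:PASS@HOST/backup/$FILE"' \
      - vzdump-101.tar.gz.part.
```

For a restore you would download the parts and reassemble them with `cat vzdump-101.tar.gz.part.* > vzdump-101.tar.gz`.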