Are there plans to make the chunk size configurable? Due to the small chunk size, remote sync is really slow.
On a 300 Mbps line, the sync reached a maximum of 28 Mbps (latency: 23 ms).
PBS client/server: 0.8.13
Backup to local PBS -> 14s:
Code:
2020-09-03T16:15:55+02:00: starting new backup on datastore 'HOME-PVE': "ct/666252/2020-09-03T14:15:55Z"
2020-09-03T16:15:55+02:00: GET /previous: 400 Bad Request: no previous backup
2020-09-03T16:15:55+02:00: add blob "/mnt/pbs-backup/local/home-pve/ct/666252/2020-09-03T14:15:55Z/pct.conf.blob" (391 bytes, comp: 391)
2020-09-03T16:15:55+02:00: created new dynamic index 1 ("ct/666252/2020-09-03T14:15:55Z/catalog.pcat1.didx")
2020-09-03T16:15:55+02:00: created new dynamic index 2 ("ct/666252/2020-09-03T14:15:55Z/root.pxar.didx")
2020-09-03T16:16:09+02:00: Upload statistics for 'root.pxar.didx'
2020-09-03T16:16:09+02:00: UUID: 3c1b06de027a4708822903bfd989439d
2020-09-03T16:16:09+02:00: Checksum: 94e122b816f18fb0496046794937eb1534d7b3fd1c4ab435b8507982ad9c32db
2020-09-03T16:16:09+02:00: Size: 1433917065
2020-09-03T16:16:09+02:00: Chunk count: 372
2020-09-03T16:16:09+02:00: Upload size: 1402519050 (97%)
2020-09-03T16:16:09+02:00: Duplicates: 12+0 (3%)
2020-09-03T16:16:09+02:00: Compression: 46%
2020-09-03T16:16:09+02:00: successfully closed dynamic index 2
2020-09-03T16:16:09+02:00: Upload statistics for 'catalog.pcat1.didx'
2020-09-03T16:16:09+02:00: UUID: ea97fa3f6af84f98bc6691d4751809b3
2020-09-03T16:16:09+02:00: Checksum: 66593bbfea7831b6a113f7768b49205a661aac20b76565f70167cb8537dfe3a6
2020-09-03T16:16:09+02:00: Size: 681797
2020-09-03T16:16:09+02:00: Chunk count: 2
2020-09-03T16:16:09+02:00: Upload size: 681797 (100%)
2020-09-03T16:16:09+02:00: Compression: 42%
2020-09-03T16:16:09+02:00: successfully closed dynamic index 1
2020-09-03T16:16:09+02:00: add blob "/mnt/pbs-backup/local/home-pve/ct/666252/2020-09-03T14:15:55Z/index.json.blob" (412 bytes, comp: 412)
2020-09-03T16:16:09+02:00: successfully finished backup
2020-09-03T16:16:09+02:00: backup finished successfully
2020-09-03T16:16:09+02:00: TASK OK
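For reference, a quick back-of-the-envelope calculation in Python (values copied from the task log above; the chunk count includes the 12 duplicates, so the average is approximate) gives the average chunk size and the effective local backup rate:
Code:
# Rough figures taken from the backup task log above.
archive_size_bytes = 1_433_917_065   # "Size" of root.pxar.didx (uncompressed)
chunk_count = 372                    # "Chunk count" (includes 12 duplicates)
backup_seconds = 14                  # 16:15:55 -> 16:16:09

avg_chunk_mb = archive_size_bytes / chunk_count / 1_000_000
local_rate_mbps = archive_size_bytes * 8 / backup_seconds / 1_000_000

print(f"average chunk size: {avg_chunk_mb:.2f} MB")       # ~3.85 MB
print(f"local backup rate : {local_rate_mbps:.0f} Mbps")  # ~820 Mbps of source data
So the dynamic chunks here average a bit under 4 MB, and the local backup moves that data very quickly.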
Sync PBS to remote -> almost 3m39s (roughly one to two chunks per second):
Code:
2020-09-03T16:18:53+02:00: Starting datastore sync job 'pbs1-home-pve'
2020-09-03T16:18:53+02:00: Sync datastore 'PBS1-HOME-PVE' from 'vm-pbs1/HOME-PVE'
2020-09-03T16:18:53+02:00: sync snapshot "ct/666252/2020-09-03T14:15:55Z"
2020-09-03T16:18:53+02:00: sync archive pct.conf.blob
2020-09-03T16:18:53+02:00: sync archive root.pxar.didx
2020-09-03T16:22:32+02:00: sync archive catalog.pcat1.didx
2020-09-03T16:22:32+02:00: got backup log file "client.log.blob"
2020-09-03T16:22:32+02:00: sync snapshot "ct/666252/2020-09-03T14:15:55Z" done
2020-09-03T16:22:32+02:00: delete vanished group 'ct/666251'
2020-09-03T16:22:32+02:00: sync job 'pbs1-home-pve' end
2020-09-03T16:22:32+02:00: TASK OK
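The same rough arithmetic for the sync (again Python, with timestamps and sizes taken from the log above for root.pxar.didx) shows how slow it is:
Code:
# Rough figures taken from the sync task log above.
archive_size_bytes = 1_433_917_065   # size of root.pxar.didx (uncompressed)
chunk_count = 372                    # chunks referenced by the archive
sync_seconds = 3 * 60 + 39           # root.pxar.didx: 16:18:53 -> 16:22:32

chunks_per_second = chunk_count / sync_seconds
effective_mbps = archive_size_bytes * 8 / sync_seconds / 1_000_000

print(f"chunks per second  : {chunks_per_second:.1f}")    # ~1.7
print(f"effective sync rate: {effective_mbps:.0f} Mbps")  # ~52 Mbps of source data
That is only about 52 Mbps of uncompressed source data; since the chunks go over the wire compressed, this is roughly consistent with the ~28 Mbps I measured on the link, well below what the 300 Mbps line can do.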
At this sync speed, I'll be syncing my initial backup for a few days.
Thanks