Hi,
I am running some tests on syncing a NAS onto PBS via PBC. I already ran with --verbose, but the stream always ends abruptly on
...
append chunks list len (64)
append chunks list len (64)
append chunks list len (64)
Killed
PVE 6.4.13
PBS 1.1-12
PBC 1.1.10
Apt tells me everything is up to date.
I could successfully sync one share, but another one is failing.
My LXC with Debian buster has 1 core and 512MB RAM.
Could it be that my LXC buffers the file before writing, so the process runs out of RAM and gets killed? But that would mean I would need more than 100 GB of RAM to transfer big files.
Any suggestions?
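To check whether the kernel OOM killer is what prints "Killed", the host kernel log can be grepped. This is a minimal sketch: it counts OOM entries in a sample log function so it is self-contained; on the real PVE host you would pipe `dmesg -T` (or `journalctl -k`) into the same grep instead. The sample log lines are illustrative, not from my system.

```shell
#!/bin/sh
# Sketch: count OOM-killer entries in kernel log output.
# On the real PVE host, replace the sample function with:  dmesg -T   (or: journalctl -k)
sample_kernel_log() {
  cat <<'EOF'
[Thu Aug 19 13:54:29 2021] Memory cgroup out of memory: Killed process 1234 (proxmox-backup-) total-vm:900000kB
[Thu Aug 19 13:54:29 2021] oom_reaper: reaped process 1234 (proxmox-backup-)
EOF
}

# A non-zero count means the kernel killed the backup client for exceeding the cgroup memory limit
sample_kernel_log | grep -c -i 'out of memory'
```

If OOM kills show up there, raising the container's memory from the host (e.g. `pct set <CTID> --memory 2048 --swap 512`) would be the obvious next thing to try.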
PBS says:
2021-08-19T13:54:30+02:00: POST /dynamic_chunk
2021-08-19T13:54:30+02:00: upload_chunk done: 1444507 bytes, c6aa9b12632c19f8d329e206793e805bc5618e20fd3385d367bb7153f27e81bf
2021-08-19T13:54:30+02:00: upload_chunk done: 6053413 bytes, f92101454d7cfb4dbcee80d63acc4685448087ff58bcf70ae79cfd35142092ef
2021-08-19T13:54:30+02:00: upload_chunk done: 3293893 bytes, bbf873cdb895699a238b5db321e373134087546731190c83d85fbd8054fb3f83
2021-08-19T13:54:30+02:00: backup failed: connection error: Connection reset by peer (os error 104)
2021-08-19T13:54:30+02:00: removing failed backup
2021-08-19T13:54:30+02:00: TASK ERROR: connection error: Connection reset by peer (os error 104)
2021-08-19T13:54:30+02:00: POST /dynamic_chunk: 400 Bad Request: error reading a body from connection: broken pipe
2021-08-19T13:54:30+02:00: POST /dynamic_chunk: 400 Bad Request: error reading a body from connection: broken pipe
2021-08-19T13:54:30+02:00: POST /dynamic_chunk: 400 Bad Request: backup already marked as finished.
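To see whether the container's 512 MB is actually being exhausted during the transfer, memory can be sampled from /proc/meminfo inside the container while the backup runs. A sketch (run it repeatedly, e.g. under `watch -n1`; the MB conversion is mine):

```shell
#!/bin/sh
# Sketch: print total and currently available memory in MB from /proc/meminfo.
# Run inside the container while proxmox-backup-client is transferring.
awk '/^MemTotal:|^MemAvailable:/ {printf "%s %d MB\n", $1, $2/1024}' /proc/meminfo
```

If MemAvailable drops toward zero just before "Killed" appears, that would support the buffering/OOM theory.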