Calculate it yourself. Let's say this gives you 400 IOPS and you are using 8x 20TB HDDs. 80TB of backups means roughly 40 million chunk files spread in random order across the disks, and during a GC or full verify every single chunk file has to be randomly read and written. Just imagine how long it would take to sha512sum + touch 80TB of larger JPGs or smaller MP3s... that's the performance you can expect.
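A quick back-of-envelope sketch of that calculation. The figures are the ones assumed above (400 IOPS for the whole array, 80TB of data, ~2 MiB average chunk size, which is roughly what PBS produces); the 2 IOs per chunk (one read, one atime update) is my simplifying assumption:

```python
# Rough GC duration estimate for a chunk store on spinning disks.
# All numbers are the hypothetical figures from the post above.

TOTAL_DATA_TB = 80     # total backup data
AVG_CHUNK_MIB = 2      # assumed average chunk size (~2 MiB for PBS)
ARRAY_IOPS = 400       # random IOPS the 8x HDD array can sustain

# Number of chunk files: 80 TB / 2 MiB each -> ~40 million
chunks = TOTAL_DATA_TB * 1024 * 1024 // AVG_CHUNK_MIB

# Assumption: GC costs ~2 random IOs per chunk (read + atime touch)
ios_per_chunk = 2
total_ios = chunks * ios_per_chunk

seconds = total_ios / ARRAY_IOPS
print(f"{chunks:,} chunks -> ~{seconds / 3600:.0f} hours for one pass")
```

With these inputs it lands in the multi-day range, which matches the "think how slow that would be" intuition above; a full verify that actually hashes every chunk's contents would take even longer, since it also reads the full 80TB of data.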