Yeah, I know, "how long is a piece of string". But given fixed hardware and a relatively fixed dataset to back up, let's say the initial backup takes a time period of "1" to complete. If I backed up again later and nothing had changed, how long (relative to 1) would the first incremental backup take?
I'm aware that PBS does clever things with reusing chunks to keep the overall size down, so I'm guessing there's a degree of re-evaluation to do even if the data is almost the same. But I'm banking on each incremental coming in comfortably below 0.5 - is that realistic?
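For what it's worth, my rough mental model of the dedup (and this is just an illustration of the general idea, not the actual PBS code - the ~4 MiB chunk size is my assumption for VM-style backups) is something like this:

```python
import hashlib
import os

CHUNK_SIZE = 4 * 1024 * 1024  # assumed chunk size, roughly what PBS uses for block backups

def backup(source_path: str, chunk_store: str) -> None:
    """Read the source in chunks; only write chunks the store hasn't seen before."""
    read_bytes = 0
    written_bytes = 0
    with open(source_path, "rb") as src:
        while True:
            chunk = src.read(CHUNK_SIZE)
            if not chunk:
                break
            read_bytes += len(chunk)
            digest = hashlib.sha256(chunk).hexdigest()
            target = os.path.join(chunk_store, digest)
            if not os.path.exists(target):
                # new chunk -> pay the write cost
                with open(target, "wb") as out:
                    out.write(chunk)
                written_bytes += len(chunk)
            # existing chunk -> only the read + hash cost was paid
    print(f"read {read_bytes} bytes, wrote {written_bytes} bytes")
```

If that's roughly right, then a second run over unchanged data still pays the full read-and-hash cost, just not the write cost - which is what makes me wonder how much below 1 the incremental actually lands.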
I'm running my first backup now and it's taking a very long time - probably over 36 hours. I'm not too bothered about that, so long as the increments are quicker. It's a relatively static dataset, so subsequent runs should mostly be re-evaluation rather than writing. But if each run is going to take more than 12 hours, I'll need to rethink my strategy.