Re: Duplicity Backup - check files uploaded to a backend against corruption using a checksum ?
<edgar.soldin <at> web.de>
2014-02-01 17:34:49 GMT
Let's place this topic on the mailing list for others to find, shall we? ;)
We already use checksums. "Un"fortunately they are encrypted (in the manifest, I think), so a corrupted gpg file will cause a hiccup during gpg decryption even before duplicity can detect any corruption.
Wrt. your approach: I'd rather have a more universal one (working with all backends), like the par2 backend.
I am not sure what its status is, though. Germar?
On 31.01.2014 23:51, Kostas Papadopoulos wrote:
> Hi folks,
> A feature which seems interesting, but I'm not quite sure if/how it could fit into duplicity, would be to
*check the files uploaded to the backend* *against corruption using a checksum* (in case of GoogleDrive a
MD5 checksum offered via the API). I realise that it doesn't totally fit into duplicity's "any dumb
backend" design, but a simple filesize+md5 would catch most errors ...
> I actually do this sort of check on any files I put on GoogleDrive via the Drive v2 API using the OAuth 2.0 Playground
> "originalFilename": "backup",
> "fileExtension": "",
> "md5Checksum": "502e74a09ff18efa312a70427e613f97",
> "fileSize": "67108864",
> "quotaBytesUsed": "67108864",