Storing large backup files on Amazon S3
Hello,
I'm currently using the latest Virtualmin GPL (3.80) to host a number of sites. All of the sites are backed up using the built-in scheduled backup feature, and I then use the s3cmd program to sync these backups with Amazon S3. This has been working fine for a while, but one of the backup files has now grown larger than the 5GB limit imposed by Amazon, so it is no longer accepted. Is there a way to get Virtualmin to split backup files into multiple chunks? Or is this a feature in Virtualmin Pro?
Thanks in advance, Pete
Hi Pete,
Unfortunately, even the Pro version doesn't currently have a mechanism for breaking the backup files into chunks. However, someone brought that up not too long ago, and I do believe it's on Jamie's todo list.
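In the meantime, one workaround is to split any archive that's over the limit into pieces after Virtualmin finishes the backup and before your s3cmd sync runs, then join the pieces again when you need to restore. Here's a minimal sketch of the idea (the 4GB chunk size and the .partNNN naming are just examples, not anything Virtualmin produces itself):

import sys

CHUNK_SIZE = 4 * 1024 ** 3    # 4GB per piece, safely under Amazon's 5GB cap
BUF_SIZE = 64 * 1024 * 1024   # copy in 64MB buffers to keep memory use low

def split_file(path):
    """Write path.part000, path.part001, ... each at most CHUNK_SIZE bytes."""
    part = 0
    with open(path, 'rb') as src:
        buf = src.read(min(BUF_SIZE, CHUNK_SIZE))
        while buf:
            written = 0
            with open('%s.part%03d' % (path, part), 'wb') as dst:
                while buf:
                    dst.write(buf)
                    written += len(buf)
                    if written >= CHUNK_SIZE:
                        break
                    buf = src.read(min(BUF_SIZE, CHUNK_SIZE - written))
            part += 1
            buf = src.read(min(BUF_SIZE, CHUNK_SIZE))

if __name__ == '__main__':
    split_file(sys.argv[1])

On the restore side, concatenating the .part files back together in order rebuilds the original archive before you hand it to Virtualmin's restore.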
-Eric
Maybe you could use duplicity. I am using it together with the DUPLY helper:
http://blog.damontimm.com/how-to-automated-secure-encrypted-incremental-...
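Duplicity splits its uploads into fixed-size volumes anyway, so no single object comes anywhere near the 5GB limit, and it can write to an S3 bucket directly. A rough sketch of the kind of call involved (the bucket name, paths, credentials and the 250MB volume size are only placeholders; check the duplicity man page for the exact options in your version):

import os
import subprocess

# Placeholder credentials for duplicity's S3 backend
os.environ['AWS_ACCESS_KEY_ID'] = 'your-access-key'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'your-secret-key'

subprocess.check_call([
    'duplicity',
    '--volsize', '250',                   # upload in 250MB volumes instead of one huge file
    '/home/backups/virtualmin',           # example path to the Virtualmin backup directory
    's3+http://your-bucket/virtualmin',   # example S3 target
])

DUPLY is just a wrapper that keeps these options in a profile, so you don't have to remember them each time.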
Kind regards, Piotr