Submitted by petelawrence on Fri, 07/27/2012 - 03:17
Hello,
I have recently updated Virtualmin to 3.93 on 3 Ubuntu 10.04 servers. I use the built-in backup feature to back up virtual servers overnight to both the local disk and an Amazon S3 bucket, and to delete old backup files after a couple of days.
Since the upgrade, however, I've discovered that old backups are not being removed from either the local disk or the S3 bucket, and the backups aren't being logged on the Backup Logs page. The backups are still running, though, as the backup files are being created successfully.
This problem is occurring on all 3 servers.
Pete
Status:
Closed (fixed)
Comments
Submitted by andreychek on Fri, 07/27/2012 - 09:43 Comment #1
Howdy -- I just wanted to verify -- are you saying that the backups were correctly being deleted prior to the upgrade to Virtualmin 3.93?
Submitted by JamieCameron on Fri, 07/27/2012 - 10:46 Comment #2
It sounds like the backup may not be fully completing. If you run a backup from the web UI, does it complete OK? If not, at what point does it stop?
Submitted by petelawrence on Thu, 08/16/2012 - 16:45 Comment #3
Yes, these backups were all running fine prior to the upgrade to 3.93.
Here are the last few lines of output from running the backup manually (with domains and bucket name obfuscated):
Uploading archive to Amazon's S3 service .. .. upload failed! HTTP connection to .s3.amazonaws.com:443 for /20120816-0656/.tar.gz?uploads failed : Failed to lookup IP address for .s3.amazonaws.com
Backup failed : Failed to open /etc/webmin/virtual-server/backuplogs/1345103853-17991-1 for writing : Too many open files
This error occurs quite a way (around an hour) into the process of backing up to Amazon.
Each of the servers has 70-100 domains on it.
Backing up an individual domain works fine.
If I back up only to local disk, the backup completes successfully.
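For reference, a rough way to watch how many files the backup process has open while it runs would be something like the following (Python, run as root; the PID here is only a placeholder for the actual backup process ID):

import os

# Count the open descriptors of a running process and show its limit.
# 12345 is a placeholder -- substitute the PID of the backup process.
pid = 12345
open_fds = len(os.listdir("/proc/%d/fd" % pid))
print("open descriptors: %d" % open_fds)
with open("/proc/%d/limits" % pid) as f:
    for line in f:
        if line.startswith("Max open files"):
            print(line.rstrip())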
Submitted by JamieCameron on Fri, 08/17/2012 - 00:22 Comment #4
Someone else reported a similar issue with Amazon DNS lookups - it looks like the cause is the backup process running out of file descriptors.
You might want to try increasing your system's file descriptor limit - see http://www.cyberciti.biz/faq/linux-increase-the-maximum-number-of-open-f... for documentation on how this can be done.
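As a rough illustration of the connection (deliberately lowering the limit just to reproduce the symptom, and using a placeholder hostname): once a process runs out of descriptors, the resolver can no longer open a socket to query DNS, so even the hostname lookup for the S3 endpoint fails.

import resource
import socket

# Lower this process's open-file limit so exhaustion is quick to reproduce.
resource.setrlimit(resource.RLIMIT_NOFILE, (32, 32))

# Use up the remaining descriptors with dummy sockets.
held = []
try:
    while True:
        held.append(socket.socket())
except socket.error:
    pass  # EMFILE: "Too many open files"

# With no descriptors left, name resolution itself fails, because the
# resolver cannot open a socket to reach the DNS server.
try:
    socket.getaddrinfo("bucket.s3.amazonaws.com", 443)  # placeholder hostname
except socket.error as e:
    print("lookup failed: %s" % e)
finally:
    for s in held:
        s.close()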
Submitted by petelawrence on Sat, 08/18/2012 - 00:46 Comment #5
Thanks for the assistance; increasing the per-user open file limit from 1024 to 4096 fixed the problem.
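For anyone else who hits this, a sketch of how the limit can be checked and raised for a single process is below; a persistent per-user change is normally made in /etc/security/limits.conf with a line like "username soft nofile 4096" (username being a placeholder) and takes effect at the next login.

import resource

# Show the current soft/hard limits on open files for this process.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft=%d hard=%d" % (soft, hard))

# Raise the soft limit as far as the hard limit allows, for this process only.
# A persistent per-user change belongs in /etc/security/limits.conf instead.
resource.setrlimit(resource.RLIMIT_NOFILE, (min(4096, hard), hard))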
Submitted by JamieCameron on Sat, 08/18/2012 - 14:51 Comment #6
Great!
Submitted by Issues on Sat, 09/01/2012 - 15:08 Comment #7
Automatically closed -- issue fixed for 2 weeks with no activity.