Large file stops FTP working

#1 Wed, 07/30/2014 - 16:36
applejack

I uploaded a large file (over 1 GB), and since then FTP has stopped uploading any new files across the entire server. I can still change and re-upload existing files, and there is plenty of space left for that particular owner's virtual server. I ran ulimit -a and fsize is set to unlimited. This is running on CentOS 5. Any ideas why this is happening and how to rectify it?

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 128362
max locked memory       (kbytes, -l) 32
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 128362
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
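
Note that the limits an interactive shell reports can differ from those the FTP daemon itself runs under. A minimal way to compare them (assuming proftpd is the daemon; "ftpuser" is a placeholder for the actual account):

# Limits of the running daemon; /proc/<pid>/limits may be
# absent on a stock CentOS 5 kernel
for pid in $(pgrep proftpd); do
    echo "=== PID $pid ==="
    cat /proc/$pid/limits
done

# Limits a non-interactive login for the FTP user would get
su -s /bin/bash - ftpuser -c 'ulimit -a'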
#2 Wed, 07/30/2014 - 17:37
andreychek

Howdy,

Hmm, I hadn't heard of that happening before. Do you see any errors or related messages in either /var/log/messages or in any of the logs in /var/log/proftpd?

-Eric
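
If it helps, a quick way to scan those logs for transfer errors (the search patterns below are only a starting point):

# Recent write/permission errors in the system log
grep -iE 'write failed|permission denied' /var/log/messages | tail -20

# Anything error-like in the proftpd logs
grep -ri error /var/log/proftpd/ | tail -20

# Or watch both live while retrying the upload
tail -f /var/log/messages /var/log/proftpd/*.log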

#3 Wed, 07/30/2014 - 20:30 (Reply to #2)
applejack

sftp-server[17993]: error: process_write: write failed

It writes an empty file. I found that if I change the group owner of the large file, FTP starts working again. The FTP user was the same as the file's owner.
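
For anyone hitting the same thing: that symptom looks consistent with a group disk quota being exhausted rather than a file-size limit, since changing the file's group moves its blocks into a different group's quota accounting. A way to check (the group name, mount point, and file path below are placeholders):

# Which user and group is the file charged to?
ls -l /path/to/largefile

# Group quota usage ("examplegroup" and /home are placeholders)
quota -g examplegroup
repquota -g /home

# Reassigning the group, as described above, frees the old
# group's quota
chgrp othergroup /path/to/largefile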
