Virtualmin: Quota: Home on special Mountpoint: Wrong Quota edited


I think Virtualmin's set-user-quota function in the User Editor (not Webmin's) sets the quota on the wrong filesystem if the users' home directories are not on /.

This is the output after setting a quota for user du@hoi:

Disk quotas for user du@hoi (uid 1059):
  Filesystem                   blocks       soft       hard     inodes     soft     hard
  /dev/vda1                         0       2048       2048          0        0        0
  /dev/vdc                         80          0          0         21        0        0

In this example my Virtualmin home base is on /dev/vdc, mounted at /media/storage, while /dev/vda1 is mounted at /. As you can see, the limit was written to /dev/vda1 instead of /dev/vdc.
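The mismatch is easy to pick out of that report mechanically. Below is a small helper (hypothetical, not part of Virtualmin or the quota tools) that reads "quota -u" output and prints whichever filesystem carries a nonzero block limit:

```shell
# Hypothetical helper: read "quota -u <user>" output on stdin and print
# every filesystem that carries a nonzero soft or hard block limit.
quota_target() {
  awk 'NR > 2 && ($3 > 0 || $4 > 0) { print $1 }'
}

# Fed the report above, it shows the limit landed on the root disk:
quota_target <<'EOF'
Disk quotas for user du@hoi (uid 1059):
  Filesystem                   blocks       soft       hard     inodes     soft     hard
  /dev/vda1                         0       2048       2048          0        0        0
  /dev/vdc                         80          0          0         21        0        0
EOF
# -> /dev/vda1   (expected: /dev/vdc, which holds /media/storage)
```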

Doing it manually via the Webmin module works.
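For reference, the manual route boils down to running setquota against the filesystem that actually holds the home directories. A sketch (user name, limits and mount point are taken from this thread; the commands need root on a quota-enabled filesystem, so they are wrapped in a function and not executed here):

```shell
# Sketch of the manual fix; wrapped in a function so nothing runs by accident.
# setquota arguments: -u <user> <block-soft> <block-hard> <inode-soft> <inode-hard> <filesystem>
set_home_quota() {
  setquota -u 'du@hoi' 2048 2048 0 0 /media/storage
  quota -u 'du@hoi'   # verify where the limit ended up
}
```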

Virtualmin knows that the Maildir is at /media/storage/vm_userbase/hoi/homes/du/Maildir, as shown in the user edit mailbox panel. This is the case even though the Unix user home base is also set to /media/storage/vm_userbase.
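For completeness, the storage filesystem is mounted with quota options enabled; the relevant /etc/fstab line looks roughly like this (device and mount point are the ones from this thread):

```
/dev/vdc   /media/storage   ext4   defaults,usrquota,grpquota   0   2
```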

Webmin version 1.910
Usermin version 1.751
Virtualmin version 6.06-2

Best regards Manu

Closed (fixed)


If you SSH in and run "virtualmin check-config", what does it say about which filesystem has quotas enabled?

It doesn't say anything about specific filesystems, only:
"Both user and group quotas are enabled for home and email directories."

My ext4 "drive" has no partition table: it is an attached virtio image that was formatted directly. It seems that in the "Disk and Network Filesystems" pane Webmin has problems identifying the partition ID of that drive. Maybe that failing check is getting in the way?

I think comment #5 identified the problem. I created a new virtio image, partitioned it with GPT, formatted it as ext4, mounted it with the same fstab options (usrquota,grpquota), rsynced everything over, swapped the mount points and rebooted. After that I ran "virtualmin fix-domain-quota --all-domains".
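Roughly, that migration looks like the following. This is a sketch, not the exact commands used: the new device name /dev/vdd and the temporary mount point are hypothetical, everything needs root, and the destructive steps are left commented out.

```shell
#!/bin/sh
set -eu

NEW=/dev/vdd            # hypothetical: the freshly attached virtio image
TMP_MNT=/mnt/newstorage # hypothetical temporary mount point

# 1. GPT partition table plus an ext4 filesystem on the new image:
# parted -s "$NEW" mklabel gpt mkpart primary ext4 1MiB 100%
# mkfs.ext4 "${NEW}1"

# 2. Copy the data over, preserving owners, permissions, ACLs and xattrs:
# mkdir -p "$TMP_MNT" && mount "${NEW}1" "$TMP_MNT"
# rsync -aHAX /media/storage/ "$TMP_MNT"/

# 3. Swap the mount points in /etc/fstab, keeping usrquota,grpquota,
#    then reboot.

# 4. Re-apply the quotas Virtualmin has on record:
# virtualmin fix-domain-quota --all-domains
```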

All quota usage is now visible in the Virtualmin info panel, and the quotas are set on both / and /media/storage in parallel. Dovecot also reports the right sizes.

I think it's working now. Thanks. Greetings, m

Status: Fixed » Closed (fixed)

Automatically closed - issue fixed for 2 weeks with no activity.