
/usr/local/psa/handlers/spool full of e-mails

josanl

New Pleskian
Hi.

Last week we updated to Plesk 12, and two days later we got an error while rendering the Plesk login page.


ERROR: Zend_Db_Statement_Exception: SQLSTATE[HY000]: General error: 1030 Got error 28 from storage engine (Pdo.php:234)


We found that the error occurred because one partition had run out of space. A folder on that partition (/usr/local/psa/handlers/spool/) had accumulated thousands of emails, even though the mail queue was practically empty.
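(For anyone else hitting this: "error 28 from storage engine" generally means the filesystem is out of free space. A quick check along these lines, with the path adjusted to your own partition layout, shows which mount is full and whether inodes have run out as well.)

# df -h /usr/local/psa
# df -i /usr/local/psa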

If I check the status of the queue at this time:
# /var/qmail/bin/qmail-qstat
messages in queue: 2
messages in queue but not yet preprocessed: 0

In contrast, the folder contains over 3,000 emails:
# ls -la /usr/local/psa/handlers/spool |wc -l
3612

It is almost the same case as this one: http://forum.parallels.com/showthread.php?93655-usr-local-psa-handlers-spool-running-full


We think it is due to the upgrade to Plesk 12, because we have the same problem on 3 upgraded servers, while it does not occur on 2 servers that have not been updated yet.

OS: CentOS 6.4 (Final)
Plesk: version 12.0.18
Mailhandler: Qmail

Thanks in advance!
Jose
 
We have exactly the same problem.
We updated to version 12.0.18, and now every Plesk server has problems with disk space.
The /usr/local/psa/handlers/spool folder is full.
 
Such an issue with the /usr/local/psa/handlers/spool/ directory filling up was reported to our Service Team before; a request with ID PPPM-1766 was created. It was found that the root cause is the greylisting spam protection (Tools and Settings -> Spam Filter Settings). There is no hotfix for this issue right now; the investigation will take some time on our developers' side. As a temporary workaround, you can disable greylisting under Tools and Settings -> Spam Filter Settings (I have already performed this task for you), or you can create a daily cron task that clears the /usr/local/psa/handlers/spool/ folder.
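For example, a daily cron script along the lines of the sketch below would keep the directory in check. This is only an illustration: the script name, the 'message*' filename pattern and the one-day age threshold are assumptions, so adjust them to your own setup.

#!/bin/sh
# /etc/cron.daily/clean-handlers-spool (example name, not an official Plesk script)
# Remove leftover handler spool files older than one day; very recent files may
# still be in processing, so the age threshold keeps them untouched.
find /usr/local/psa/handlers/spool -type f -name 'message*' -mtime +1 -delete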
 
We have disabled greylisting on every server under Tools and Settings -> Spam Filter Settings.
So, can we clear /usr/local/psa/handlers/spool/ with a daily cron job without problems?
Is /usr/local/psa/handlers/spool/ only a temp folder?
 
Yes, it contains temporary files of messages, which are not removed from the '/usr/local/psa/handlers/spool/' directory when greylisting spam protection is enabled.
 
I have the same problem. I hope it will be fixed soon.

# /var/qmail/bin/qmail-qstat
messages in queue: 3
messages in queue but not yet preprocessed: 0

# ls -la /usr/local/psa/handlers/spool |wc -l
2155

# du -h --max-depth=1 /usr/local/psa/handlers/spool/
46G /usr/local/psa/handlers/spool/

Plesk 12.0.18
OS: CentOS 6.5 (Final)
Qmail
 
Same problem here after upgrading one server to Plesk 12.

Greylisting is disabled.

/usr/local/psa/handlers/spool is full of messages, several GB of them.

I found that the files there are messages that cannot be delivered because they are too big for the destination server, so each file is between 10 and 50 MB.
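To check whether the same applies on your server, listing the directory sorted by size (largest first) makes the oversized messages easy to spot, for example:

# ls -lhS /usr/local/psa/handlers/spool | head -n 20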
 
Hi Igor,

Unfortunately, the issue still exists. We have updated to the latest version, but the spool folder still fills up until the partition is full. Please advise.
 
Could you please provide a listing of this directory? I would also like to know which mail services you have enabled there: qmail, postfix, greylisting, spamassassin, dovecot, courier, etc.
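By the way, a rough way to see which of those services are actually running on the box is something like this (just a generic process check, not a Plesk-specific tool):

# ps ax | egrep 'qmail|postfix|spamd|dovecot|courier' | grep -v grep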
 
# ls -la /usr/local/psa/handlers/spool |wc -l
574

# ls -al | more
total 281052
drwxrwx--- 2 popuser popuser 835584 Jul 16 12:39 .
drwxr-xr-x 8 root root 4096 Jun 6 15:31 ..
-rw------- 1 qmaild popuser 8491 Jul 16 12:37 message0eUZ0r
-rw------- 1 qmaild popuser 461139 Jul 16 12:27 message0GaSf1
-rw------- 1 qmaild popuser 12044 Jul 16 12:34 message0HM86d
-rw------- 1 qmaild popuser 600995 Jul 16 12:15 message0lGpzR
-rw------- 1 qmaild popuser 60799 Jul 16 12:27 message0lutDu
-rw------- 1 qmaild popuser 1636 Jul 16 12:27 message0miuYx
-rw------- 1 qmaild popuser 10155 Jul 16 12:17 message0OI00E
-rw------- 1 qmaild popuser 2133 Jul 16 12:09 message0TCT5x
-rw------- 1 qmaild popuser 8802 Jul 16 12:01 message13Q3fO
-rw------- 1 qmaild popuser 384655 Jul 16 12:28 message1BrWh2
-rw------- 1 qmaild popuser 11124 Jul 16 12:37 message1fpu52
-rw------- 1 qmaild popuser 22765 Jul 16 12:04 message1mgWQT
-rw------- 1 qmaild popuser 611485 Jul 16 12:17 message1nY8h5
-rw------- 1 qmaild popuser 1120457 Jul 16 12:04 message1TiK5s
-rw------- 1 qmaild popuser 384655 Jul 16 12:32 message1yQyZ2
-rw------- 1 qmaild popuser 496329 Jul 16 12:03 message22pEUr
-rw------- 1 qmaild popuser 76659 Jul 16 12:38 message2dWHiC
-rw------- 1 qmaild popuser 496329 Jul 16 12:17 message2FX4jy
-rw------- 1 qmaild popuser 2654 Jul 16 12:38 message2GI3Ut
-rw------- 1 qmaild popuser 22765 Jul 16 12:36 message2H2iYe
-rw------- 1 qmails popuser 554755 Jul 16 12:26 message2HXyhh
-rw------- 1 qmaild popuser 384655 Jul 16 12:29 message2iLXqh
-rw------- 1 qmails popuser 554755 Jul 16 12:31 message2qph12
-rw------- 1 qmaild popuser 384655 Jul 16 12:37 message2TJ3Rb
-rw------- 1 qmaild popuser 11115 Jul 16 12:30 message2ZSngw
-rw------- 1 qmaild popuser 22064 Jul 16 12:11 message34VRdY
-rw------- 1 qmaild popuser 602588 Jul 16 12:33 message3cnyBP
-rw------- 1 qmaild popuser 17723 Jul 16 12:11 message3k08tC
-rw------- 1 qmaild popuser 18076 Jul 16 12:24 message3o0MIc
-rw------- 1 qmaild popuser 33387 Jul 16 12:06 message3PO3RB
-rw------- 1 qmaild popuser 632934 Jul 16 12:05 message3POqJc
-rw------- 1 qmaild popuser 496329 Jul 16 12:19 message3QcK6j
-rw------- 1 qmails popuser 554755 Jul 16 12:39 message3QzyqM
-rw------- 1 qmaild popuser 5541 Jul 16 12:02 message3ZvuUg
-rw------- 1 qmaild popuser 496329 Jul 16 12:26 message44kHcV
-rw------- 1 popuser popuser 31112 Jul 16 12:15 message45Ejfo
-rw------- 1 qmaild popuser 10155 Jul 16 12:00 message4DQoeE
-rw------- 1 qmaild popuser 35787 Jul 16 12:35 message4EF77N
-rw------- 1 qmaild popuser 11887 Jul 16 12:11 message4fmu3p
-rw------- 1 qmaild popuser 7044 Jul 16 12:27 message4KDRk4
-rw------- 1 qmaild popuser 134316 Jul 16 12:21 message4Nn7k5
-rw------- 1 qmaild popuser 3589 Jul 16 12:19 message4P2Ja9
-rw------- 1 qmaild popuser 480580 Jul 16 12:13 message4pxePT
-rw------- 1 qmaild popuser 22765 Jul 16 12:02 message4qyiYa
-rw------- 1 qmaild popuser 111315 Jul 16 12:13 message4TdsfZ
-rw------- 1 qmaild popuser 8491 Jul 16 12:20 message4u7TTx
-rw------- 1 qmaild popuser 10155 Jul 16 12:36 message4Wkzc9
-rw------- 1 qmaild popuser 384655 Jul 16 12:35 message4zm5bg
-rw------- 1 qmails popuser 554755 Jul 16 12:00 message5fiJ2k
-rw------- 1 qmaild popuser 496329 Jul 16 12:08 message5i0MpU
-rw------- 1 qmaild popuser 461139 Jul 16 12:30 message5LMqSk
-rw------- 1 qmaild popuser 40771 Jul 16 12:11 message5PMKPm
-rw------- 1 qmaild popuser 82746 Jul 16 12:20 message5Q8tlB
-rw------- 1 qmaild popuser 10155 Jul 16 12:15 message5sLwMc

We're using QMail.

Thank you for your response.
 