
Question "nginx -t" outputs "too many open files" when called by cron script, but not when called on console

Bitpalast

Plesk 12.5.30, CentOS 7.2

Currently open file handles on system: 152,000
Max number of open file handles allowed for Nginx and system: 500,000

"nginx -t" tests fine, no errors, when run from the console, but the same command called from a script run by a cron job installed in Plesk responds:
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: [emerg] open() "/var/www/vhosts/system/[some domain]/logs/proxy_access_ssl_log" failed (24: Too many open files)
nginx: configuration file /etc/nginx/nginx.conf test failed

What is causing this behavior and can it be avoided?
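
For reference, a minimal diagnostic sketch in PHP (the log path and the nginx binary path are just placeholders) that would show whether the shell spawned by shell_exec() under the cron job sees the same open-files limits as a console session:

<?php
// Diagnostic sketch only - run it from the same Plesk cron job as the real script.
// It compares the open-files limits of the shell that shell_exec() spawns with
// what you see on the console ("ulimit -Sn" / "ulimit -Hn").
$soft = trim(shell_exec('ulimit -Sn'));
$hard = trim(shell_exec('ulimit -Hn'));
$test = shell_exec('/usr/sbin/nginx -t 2>&1');
// Log path is an example; write wherever is convenient.
file_put_contents('/tmp/nginx-t-limits.log',
    "soft nofile: $soft\nhard nofile: $hard\n---\n$test\n", FILE_APPEND);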
 
Hi Peter Debik,

Setting a specific "worker_rlimit_nofile XXXXXX;" definition in your global "nginx.conf" (in the main part, before the "http" section) should answer your question. We use "worker_rlimit_nofile 64000;" and don't experience the described issue. Basically, this global option is not set by default and there is no default value that could be inherited when it is missing from your "nginx.conf". It also overrides the server-wide limits (and the user-defined limits) in your "limits.conf".
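
For illustration only, a minimal excerpt of where the directive sits in "nginx.conf" (the values are examples, not recommendations):

# main (top-level) context of /etc/nginx/nginx.conf
worker_processes      auto;
worker_rlimit_nofile  64000;   # limit on open file descriptors for worker processes

events {
    worker_connections  8192;
}

http {
    # http-level configuration follows here
}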

You might also be interested in reading the official NGINX documentation at:

=> http://nginx.org/en/docs/ngx_core_module.html#worker_rlimit_nofile
 
Thank you. We already use several individual settings, including the one you suggest, to let Nginx handle 500,000 files. In the normal environment, and when run from the console, this works like a charm. But it does NOT work when the "/usr/sbin/nginx -t 2>&1" command is called by a PHP shell_exec() via crontab.

It seems that either the "normal" Nginx settings do not apply when the command is run via a PHP shell_exec() from a crontab job (e.g. because it is run from a shell that is unaware of the specific rlimit settings, or because running nginx -t that way might require a specific instruction telling Nginx where to look for its own configuration files), or that the STDERR>STDOUT redirect in the command creates so many additional file handles that the error is thrown because of the stream output redirect. We've not yet figured it out and will continue to research this issue. We will next try to write the "nginx -t" output to a temporary file instead of STDERR or STDOUT, then read that file instead of reading the output stream. We'll report the solution once we have it.
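
A sketch of what that temporary-file approach could look like (the file path is just an example; the success check relies on the "test is successful" wording Nginx normally prints):

<?php
// Sketch of the temp-file approach described above.
$tmp = '/tmp/nginx-config-test.out';
// Send both STDOUT and STDERR of "nginx -t" into the file instead of a stream.
shell_exec('/usr/sbin/nginx -t > ' . escapeshellarg($tmp) . ' 2>&1');
$result = (string) file_get_contents($tmp);
if (strpos($result, 'test is successful') !== false) {
    // configuration test passed
} else {
    // configuration test failed; $result contains the Nginx messages
}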
 
System and Nginx open file descriptor values are all good. No issues during operations.
Tried to increase the maximum open file descriptor value of the /usr/bin/php installation, because the "nginx -t" command is called from within a PHP script - no result, the error remains.
Tried to run /usr/sbin/nginx -t -g "worker_rlimit_nofile 500000;" (that's more than enough) to explicitly tell Nginx to accept this high number of open file handles, just in case it does not read its default configuration when called from a PHP shell_exec() - no result, the error remains.
Tried not to read a stream result but to redirect the output into a file, so that issues with the STDERR>STDOUT redirect can be ruled out - no result, the error remains.

It does work on a test host system identical to our production machines, but the test host only has a few open file handles compared to the production machines.

It is obviously impossible to read the "nginx -t" status through a shell_exec() in a PHP script on a highly frequented server, because Nginx always returns a "too many open files" message, while at the same time, when it is run from the console, everything is fine and the web server is up.

So this might have something to do with limits on the shell that PHP opens to execute commands from within a PHP script. I cannot think of any other explanation. Suggestions welcome.
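
If it really is a limit on that shell, one way to test the hypothesis might be to raise the soft limit inside the very command string passed to shell_exec(), assuming the hard limit of the cron/PHP environment allows it:

<?php
// Hypothesis test only: raise the soft open-files limit in the same shell
// that runs "nginx -t". This works only if the hard limit of the cron/PHP
// environment is at least this high.
$output = shell_exec('ulimit -n 500000 2>&1; /usr/sbin/nginx -t 2>&1');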
 