
Scheduled Task: executing php -f ... does not work anymore

Fritz MichaelG

Basic Pleskian
Under a particular domain, I had created the following scheduled task:
Code:
php -f /var/www/vhosts/somedomain/httpdocs/foo.php
This worked fine up until exactly 2013-10-29 11:22 UTC+1. Then I constantly got the following error:
Code:
Could not open input file: /var/www/vhosts/somedomain/httpdocs/foo.php
Back then I logged in and ran the command directly from the shell to check whether it worked... and it did. The scheduled task, however, kept failing with that error message. Now, over a month later, the error message has changed to:
Code:
-: php: command not found
Still, the command is working fine directly from the shell.

I don't know what's going on. How can I execute PHP scripts from a scheduled task now? What has changed?
 
use the full path to the php bin if that can help...you can get it with which php
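Concretely, the idea would look something like this; the `/usr/bin/php` path is just what `which php` typically returns, so verify it on your own server first:

```shell
# First find the full path to the PHP binary:
which php
# e.g. /usr/bin/php

# Then use that full path in the scheduled task command
# instead of the bare "php":
/usr/bin/php -f /var/www/vhosts/somedomain/httpdocs/foo.php
```

Cron jobs usually run with a much shorter PATH than an interactive shell, which is why the bare `php` can work at the prompt but not in the scheduler.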
 
use the full path to the php bin if that can help...you can get it with which php
Good point, but no luck with that. According to
Code:
which php
the path is /usr/bin/php. However, when using that command in the scheduler, I just get:
Code:
-: /usr/bin/php: No such file or directory
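One thing worth checking when an absolute path "does not exist" for cron but works interactively: which login shell the subscription's system user actually has. A sketch using standard tools; replace `root` with your subscription's FTP/system user name:

```shell
# Show the login shell assigned to a system user (7th field of the
# passwd entry). Replace "root" with the subscription's system user.
getent passwd root | cut -d: -f7
# A "Forbidden" login (/bin/false), a chrooted shell, or /bin/rbash
# cannot reach /usr/bin/php when cron runs the task, even though the
# binary exists on the host.
```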
 
OK, now I get it, I think. Under Home > Subscriptions > domain > Websites & Domains > FTP Access, the system user must have access to an appropriate shell; i.e. Forbidden, chrooted, and /bin/rbash won't work, as those shells have no access to /usr/bin/php (not sure about /bin/bash and /bin/dash). Now I am back to my old error message:
Code:
Could not open input file: /var/www/vhosts/somedomain/httpdocs/foo.php
, but I think I can explain that as well. The script I want to run is ownCloud's cron.php. However, my ownCloud installation is not running in CGI or FastCGI mode (since that didn't work); instead, PHP is running as an Apache module. That in turn means I have set the owner of all files to www-data:www-data (the Apache user), which in turn means the system user for that account cannot access these files, depending on the file permissions.

I am not sure how to circumvent this problem, other than setting the file permissions to 777 or trying to get ownCloud to run under FastCGI again. I don't really understand why it was working before either - the file ownership was already set to www-data:www-data while the scheduled task was still working.
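A middle ground between 777 and re-owning everything would be group access: grant the group read permission and put the system user into the owning group (the chgrp/usermod names are site-specific, so this is only a sketch). A minimal local demo of why owner-only permissions produce the error, using a throwaway temp file:

```shell
# Demo: a file readable only by its owner cannot be opened by any other
# (non-root) account -- exactly the "Could not open input file" situation.
tmp=$(mktemp)
printf '<?php echo "ok";\n' > "$tmp"
chmod 600 "$tmp"        # owner-only, like the www-data-owned files
ls -l "$tmp"            # -rw------- ...

# A less drastic fix than 777: allow group read, then make sure the
# subscription's system user is a member of the owning group:
chmod 640 "$tmp"
ls -l "$tmp"            # -rw-r----- ...
rm -f "$tmp"
```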
 
I do not like enabling any sort of shell for a hosting user, even if it is "me".

Does this script not allow you to wget or curl foo.php instead of running it directly? I so dislike scripts that want you to run things directly, because they are not suited to a shared hosting environment.
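For an ownCloud-style cron.php that is reachable over HTTP, the scheduled task could simply fetch the URL instead of invoking the PHP binary at all; the URL and interval below are placeholders:

```shell
# Crontab line: trigger the script over HTTP instead of running it
# directly. -sS keeps curl quiet except for errors.
*/15 * * * * curl -sS http://somedomain/cron.php >/dev/null

# or the same with wget:
# */15 * * * * wget -q -O /dev/null http://somedomain/cron.php
```

This sidesteps the shell and file-permission issues entirely, since the request is handled by Apache as www-data, just like a normal page view.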

Alternatively, how about adding the cron task to the root crontab via the scheduled tasks on the admin side of things (Tools & Settings > Scheduled Tasks > select root, enter your task)? Actually, now that I think about it, is there an apache option in the user drop-down? That might be better and a lot safer.

Be aware, however, that there are dangers doing things as root!
 
I do not like enabling any sort of shell for a hosting user, even if it is "me".
I get what you mean. And in this case, even a chrooted shell is not sufficient, since it doesn't provide access to the php executable.


Does this script not allow you to wget or curl foo.php instead of running it directly? I so dislike scripts that want you to run things directly, because they are not suited to a shared hosting environment.
Sure, it's perfectly possible to trigger it with wget or whatever instead. I just prefer it the other way, for no particular reason in this case.


Alternatively, how about adding the cron task to the root crontab via the scheduled tasks on the admin side of things (Tools & Settings > Scheduled Tasks > select root, enter your task)? Actually, now that I think about it, is there an apache option in the user drop-down? That might be better and a lot safer.

Be aware, however, that there are dangers doing things as root!
Yeah, I don't really like to run it as root though, for security reasons.
 