Issue (Urgent) Can't download files >1GB served by PHP

massimo

New Pleskian
Hi everyone,
I am facing quite an urgent issue here: every download of a file larger than 1 GB served by PHP over HTTP fails at exactly 1 GB.
I've searched online, but nothing I have found solves the issue.

Can anybody help?

Massimo.
 
This will probably need some extra web server configuration. I suggest starting with an Nginx directive in "Additional Apache and Nginx directives":

proxy_max_temp_file_size <desired size>

for example

proxy_max_temp_file_size 4096m

to allow files up to 4 GB to be delivered by Nginx. Note that the default for this directive is 1024m, which would explain why your downloads break at exactly 1 GB: Nginx buffers the proxied PHP response into a temporary file, and that file is capped at 1 GB by default.
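An alternative, if you would rather not raise the temp file limit globally, is to have the PHP script itself tell Nginx not to buffer that particular response: Nginx honors an X-Accel-Buffering header sent by the upstream. A minimal sketch (the file path is just an example):

<?php
// Example download script: ask Nginx not to buffer this response.
$file = '/var/www/vhosts/example.com/files/big.zip'; // example path

header('X-Accel-Buffering: no'); // disables Nginx proxy/fastcgi buffering for this response
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
readfile($file); // streams the file to output without loading it all into RAM

With buffering disabled, Nginx passes data to the client as PHP produces it, so no temporary file is written at all.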
 
Thanks, I did try that earlier, but it didn't solve the issue. Should "Serve static files directly by nginx" be enabled as well?
 
Before you spend hours upon hours figuring out what file this is, what server it is actually served from, and what configuration is needed to make this work: can't you simply download it via FTP? Just asking, because web servers are simply not made to deliver "pages" or other resources as extremely large as this.

Otherwise, if you absolutely want your web server to deliver very large files, you will need to research all the directives this requires. There can be issues in PHP, in Apache, and in Nginx: the maximum allowed block size, the maximum allowed file size, the RAM the operation needs, the timeout values in PHP (e.g. the script execution time must not expire before the script has finished delivering the content), and so on. Simply walk through all the parts involved to find a solution; I doubt it is possible to cover all these different aspects in a forum.
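As a rough sketch of what such a PHP delivery script typically has to take care of (purely illustrative; the path and chunk size are just examples):

<?php
// Illustrative only: the usual PHP-side pitfalls for very large downloads.
set_time_limit(0); // the script must not hit max_execution_time mid-download

while (ob_get_level()) {
    ob_end_clean(); // drop PHP output buffers so data is not held back in RAM
}

$file = '/path/to/large-file.bin'; // example path
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');

$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 1024 * 1024); // 1 MB chunks keep memory usage flat
    flush();                      // push each chunk out to the web server immediately
}
fclose($fp);

And that only covers PHP; Apache and Nginx have their own buffer, size, and timeout limits on top of this.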
 
Thanks Peter.
Nginx's client_max_body_size is already set to 2000m (since my site handles large file uploads).

Anyway, it seems to be solved after I set this PHP directive in php.ini:
output_buffering = 0 (instead of the default 4096)
Does that make sense to you?

My PHP download script already streams the download, so I don't really understand where the issue with big files comes from.
(I am using the Laravel framework: File Storage - Laravel - The PHP Framework For Web Artisans)
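For context, the download route essentially does this (simplified sketch; the actual disk and paths differ):

<?php
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

// Simplified version of my download endpoint. Storage::download() returns a
// StreamedResponse, so the file is sent in chunks rather than read into memory.
Route::get('/download/{name}', function (string $name) {
    return Storage::download("downloads/{$name}"); // example path on the default disk
});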

I might as well try FTP and see if that works better.

Thanks for your kind reply and help!

Best,
Massimo.
 