Force Download large files with PHP

So you’ve been able to create zips on the fly with PHP, but now you need to let people download them. Let’s assume you don’t want to disclose where on the server the files reside or what their filenames are, but you still want them to be downloadable by the masses. I hear you saying “surely that’s impossible”, but nothing could be further from the truth.

Using the header() function to set the Content-Type (and a few related headers) allows you to output a file in its own context. For a generic download, you could output the following:

header("Content-Description: File Transfer");
header("Content-Disposition: attachment; filename=\"your_specified_filename.ext\"");
header("Content-Type: application/octet-stream");
header("Content-Transfer-Encoding: binary");
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
$path = $_SERVER['DOCUMENT_ROOT'] . "/path/to/file/filename.ext";
header('Content-Length: ' . filesize($path)); // lets the browser show download progress
$fp = fopen($path, 'rb');
fpassthru($fp);
fclose($fp);
exit; // stop here so nothing else gets appended to the download

And that would send your file to the user under your specified filename. Please note the “Content-Type” line. I have used “octet-stream” in this example, which will cover the majority of files you would pass to be downloaded if you were dynamically serving a number of different extensions. If you were only allowing PDFs, for example, you could just use:

header("Content-Type: application/pdf");

This would serve every file passed to the page as a PDF. Obviously, if you were to pass a JPG to that page, the resulting “PDF” would be corrupt and would not open. A full list of MIME types you could use can be found on Wiki – MIME types.
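If you are serving a handful of known extensions, one approach is to look the Content-Type up from the file’s extension and fall back to octet-stream for anything else. A minimal sketch (the `ext_to_mime` helper and its extension list are my own illustration, not part of the snippet above):

```php
// Map a filename's extension to a Content-Type, falling back to
// application/octet-stream for anything unrecognised. The list here is
// just an illustrative subset -- extend it for the files you actually serve.
function ext_to_mime(string $filename): string
{
    $mimeTypes = [
        'pdf'  => 'application/pdf',
        'zip'  => 'application/zip',
        'jpg'  => 'image/jpeg',
        'jpeg' => 'image/jpeg',
        'png'  => 'image/png',
    ];
    $ext = strtolower(pathinfo($filename, PATHINFO_EXTENSION));
    return $mimeTypes[$ext] ?? 'application/octet-stream';
}

// Usage: header('Content-Type: ' . ext_to_mime('report.pdf'));
```

That way the same download script can hand out PDFs, zips and images without mislabelling any of them.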

On a typical server, the above will allow a file of about 40 MB to be downloaded before the script hits its execution time limit. Recently I had a client with this problem. The zips I was creating for him contained a random number of high-res images, which resulted in some zips being almost double that limit. Given that his clients still needed to be able to obtain these files, a solution had to be found. I COULD have just disclosed the full path of the files, which would have been very long and messy, and because I was storing these files under a specific naming convention so that no zip was ever overwritten, the filenames were very unreadable to the user.

After speaking to a very clued-up server monkey on support for the client, he notified me of a header that allows the web server to do the processing of the file instead of the PHP script, which removes the timeout issue. Unfortunately this module is not normally enabled by default, but if you have a standalone or VPS server whose settings you’re able to change, this is THE perfect solution for you. Just add ONE more header line:

header('X-Sendfile: ' . $path); // note: inside single quotes, $path would not be interpolated

And suddenly you are passing files of any size to the user with no timeout troubles.
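Putting it all together, the whole handler can be kept quite small. Here is a sketch of how I’d structure it; the helper names, path and download name are my own placeholders, not code from the post above:

```php
// Build the header list separately so it is easy to inspect and test.
function download_headers(string $path, string $downloadName): array
{
    return [
        'Content-Description: File Transfer',
        'Content-Disposition: attachment; filename="' . $downloadName . '"',
        'Content-Type: application/octet-stream',
        // Hand the actual transfer off to Apache's mod_xsendfile:
        'X-Sendfile: ' . $path,
    ];
}

function send_download(string $path, string $downloadName): void
{
    foreach (download_headers($path, $downloadName) as $h) {
        header($h);
    }
    // No fpassthru()/readfile() call here: with X-Sendfile, Apache reads
    // and streams the file itself, so PHP's execution time limit no longer
    // applies to the transfer.
    exit;
}

// Usage: send_download('/home/user/files/archive_8f3a.zip', 'photos.zip');
```

Notice that the script never echoes the file body itself; it only emits headers, and Apache takes it from there.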

Now, to set this up, you can either talk to your hosting provider’s support team, who should be able to help you enable ‘mod_xsendfile’, or you can add some code to your .htaccess file:

<IfModule mod_xsendfile.c>
  XSendFile On
  XSendFilePath /home/$USER/public_html/folder/
</IfModule>

Again, depending on your server setup, this may or may not work. When in doubt, contact your support team.
