Downloading large files reliably in PHP

I have a PHP script on a server that sends files to recipients: they get a unique link and can then download large files. Sometimes there is a problem with the transfer and the file is corrupted or never finishes downloading. I am wondering if there is a better way to send large files.

Code:

$f = fopen(DOWNLOAD_DIR.$database[$_REQUEST['fid']]['filePath'], 'r');
while(!feof($f)){
    print fgets($f, 1024);
}
fclose($f);

I have seen functions such as

http_send_file
http_send_data

But I am not sure if they will work.
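For example, I guess they would be used roughly like this (they come from the pecl_http extension; untested sketch on my part):

// Hypothetical sketch, assuming the pecl_http (v1) extension is installed.
// http_send_file() is supposed to handle range requests (resuming) itself.
http_send_content_disposition(basename($database[$_REQUEST['fid']]['filePath'])); // default = attachment
http_send_content_type('application/octet-stream');
http_send_file(DOWNLOAD_DIR.$database[$_REQUEST['fid']]['filePath']);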

What is the best way to solve this problem?

Regards
erwing


Solution 1:

Chunking files is the fastest/simplest method in PHP if you can't or don't want to use something a bit more professional like cURL, mod_xsendfile on Apache, or a dedicated script.

$filename = $filePath.$filename;

$chunksize = 5 * (1024 * 1024); // 5 MB (= 5 242 880 bytes) per chunk of file.

if(file_exists($filename))
{
    set_time_limit(300);

    // sprintf("%u") works around filesize() returning negative values for files > 2 GB on 32-bit builds.
    $size = intval(sprintf("%u", filesize($filename)));

    header('Content-Type: application/octet-stream');
    header('Content-Transfer-Encoding: binary');
    header('Content-Length: '.$size);
    header('Content-Disposition: attachment;filename="'.basename($filename).'"');

    if($size > $chunksize)
    {
        $handle = fopen($filename, 'rb');

        while (!feof($handle))
        {
          print(@fread($handle, $chunksize));

          // Push each chunk to the client immediately instead of buffering the whole file.
          if (ob_get_level() > 0) {
              ob_flush();
          }
          flush();
        }

        fclose($handle);
    }
    else readfile($filename); // small files fit in memory, so readfile() is fine here

    exit;
}
else echo 'File "'.$filename.'" does not exist!';

Ported from richnetapps.com / NeedBee. Tested on 200 MB files, on which readfile() died, even with the maximum allowed memory limit set to 1G, which is five times the size of the downloaded file.

BTW: I also tested this on files >2GB, but PHP only managed to write the first 2GB of the file and then broke the connection. File-related functions (fopen, fread, fseek) use INT internally, so you ultimately hit the 2GB limit. The above-mentioned solutions (e.g. mod_xsendfile) seem to be the only option in this case.

EDIT: Make 100% sure that your script file is saved in UTF-8. If you omit that, downloaded files can be corrupted, because this solution uses print to push each chunk of the file to the browser.

Solution 2:

If you are sending truly large files and are worried about the impact this will have, you could use the X-Sendfile header.

See the Stack Overflow question using-xsendfile-with-apache-php and the how-to at blog.adaniels.nl : how-i-php-x-sendfile/
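A minimal sketch of how that might look (assuming mod_xsendfile is installed and XSendFile On is enabled for the vhost; paths here are illustrative):

$file = DOWNLOAD_DIR.$database[$_REQUEST['fid']]['filePath'];

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.basename($file).'"');
// Apache picks this header up and streams the file itself; PHP never loads it
// into memory, so the memory_limit and 2GB issues above do not apply.
header('X-Sendfile: '.$file);
exit;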

Solution 3:

The best solution would be to rely on lighttpd or Apache, but if it has to be PHP, I would use PEAR's HTTP_Download (no need to reinvent the wheel, etc.). It has some nice features, like:

  • Basic throttling mechanism
  • Ranges (partial downloads and resuming)

See intro/usage docs.
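Roughly, usage looks like this (a sketch based on the package documentation, not tested here; the file name is a placeholder):

require_once 'HTTP/Download.php';

$dl = new HTTP_Download();
$dl->setFile(DOWNLOAD_DIR.$database[$_REQUEST['fid']]['filePath']);
$dl->setContentDisposition(HTTP_DOWNLOAD_ATTACHMENT, 'bigfile.zip'); // placeholder name
$dl->setContentType('application/octet-stream');
// Range headers (partial downloads / resuming) are handled for you.
$dl->send();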

Solution 4:

We've been using this in a couple of projects and it works quite well so far:

/**
 * Copy a file's content to php://output.
 *
 * @param string $filename
 * @return void
 */
protected function _output($filename)
{
    $filesize = filesize($filename);

    $chunksize = 4096;
    if($filesize > $chunksize)
    {
        $srcStream = fopen($filename, 'rb');
        $dstStream = fopen('php://output', 'wb');

        $offset = 0;
        // Copy the source to php://output in $chunksize pieces to keep memory usage low.
        while(!feof($srcStream)) {
            $offset += stream_copy_to_stream($srcStream, $dstStream, $chunksize, $offset);
        }

        fclose($dstStream);
        fclose($srcStream);   
    }
    else 
    {
        // stream_copy_to_stream() behaves strangely when filesize > chunksize;
        // it seems to never hit EOF.
        // On the other hand, file_get_contents() is not scalable,
        // therefore we only use file_get_contents() on small files.
        echo file_get_contents($filename);
    }
}
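
For reference, a hypothetical caller (the headers and the surrounding class are not part of the snippet above):

// Hypothetical usage inside the same class; $filename is an absolute path.
header('Content-Type: application/octet-stream');
header('Content-Length: '.filesize($filename));
header('Content-Disposition: attachment; filename="'.basename($filename).'"');
$this->_output($filename);
exit;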