Transferring large files with PHP

Streaming a large file using PHP

I have a 200MB file that I want to give to a user via download. However, since we want the user to only download this file once, we are doing this:

echo file_get_contents('http://some.secret.location.com/secretfolder/the_file.tar.gz'); 

to force a download. However, this means that the whole file has to be loaded in memory, which usually doesn’t work. How can we stream this file to them, at some kb per chunk?

Use stream_copy_to_stream(fopen('file.ext', 'rb'), STDOUT) to pipe the stream to stdout. If your default buffer size needs adjusting, use stream_set_chunk_size($fp, $size).
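A minimal sketch of that approach, wrapped in a helper function (the headers and path handling are illustrative, not part of the original answer):

```php
<?php
// Sketch: stream a file to the client chunk by chunk,
// without ever holding the whole file in memory.
function send_file(string $path): int
{
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        exit('Cannot open file');
    }
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    // Pipe the file stream straight to the output stream.
    $bytes = (int) stream_copy_to_stream($fp, fopen('php://output', 'wb'));
    fclose($fp);
    return $bytes;
}
```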

5 Answers

define('CHUNK_SIZE', 1024 * 1024); // bytes per chunk

// Read a file and output its content chunk by chunk
function readfile_chunked($filename, $retbytes = true) {
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}

// Here goes your code for checking that the user is logged in
// .
// .
if ($logged_in) {
    $filename = 'path/to/your/file';
    $mimetype = 'mime/type';
    header('Content-Type: ' . $mimetype);
    readfile_chunked($filename);
} else {
    echo 'Tabatha says you haven\'t paid.';
}
?>

Found this post years later. I'm curious about the 'Tabatha says you haven\'t paid.' reference; a Google search shows it's been used a lot with this script, but not where it's from. Does anyone know?

Use fpassthru(). As the name suggests, it doesn't read the entire file into memory before sending it; instead, it outputs it straight to the client.

Modified from the example in the manual:
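A minimal version of that example (the path is illustrative):

```php
<?php
// Sketch: send download headers, then dump the file straight
// to the client with fpassthru() (no full read into memory).
function passthru_download(string $path): int
{
    $fp = fopen($path, 'rb');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    header('Content-Length: ' . filesize($path));
    // Output everything from the current file pointer to EOF.
    $sent = fpassthru($fp);
    fclose($fp);
    return $sent; // number of characters sent
}
```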

If you would rather stream the content directly to the browser than force a download (and the content type is supported by the browser, such as video, audio, or PDF), remove the Content-Disposition header.

@Bludream Not in the current form. To handle resuming of downloads, you should check for the client's Range header. If a bytes value is present, you can then fseek() to that offset in the file and send an appropriate Content-Range header before sending the data.
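A rough sketch of that resume logic, assuming a simple single-range "bytes=N-" request (the helper names are hypothetical):

```php
<?php
// Hypothetical sketch: honour a "Range: bytes=N-" request header
// by seeking to the offset and sending a 206 with Content-Range.
function parse_range_start(?string $rangeHeader, int $size): int
{
    if ($rangeHeader !== null && preg_match('/bytes=(\d+)-/', $rangeHeader, $m)) {
        $start = (int) $m[1];
        if ($start < $size) {
            return $start;
        }
    }
    return 0; // no usable range: send the whole file
}

function send_range(string $path, ?string $rangeHeader): void
{
    $size = filesize($path);
    $start = parse_range_start($rangeHeader, $size);
    if ($start > 0) {
        header('HTTP/1.1 206 Partial Content');
        header("Content-Range: bytes $start-" . ($size - 1) . "/$size");
    }
    header('Content-Length: ' . ($size - $start));
    $fp = fopen($path, 'rb');
    fseek($fp, $start);   // skip the bytes the client already has
    fpassthru($fp);
    fclose($fp);
}
```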

I don't recommend this for large files, as the original question states. Smaller files work fine, but I'm using fpassthru on a large file and my download died because "allowed memory size was exhausted".

@Magmatic Yes, as the manual says for fpassthru it will output the file to the output buffer. But if you call ob_end_flush() first, then output buffering is disabled, so you won’t hit your memory limit 😀

Take a look at the example from the manual page of fsockopen():

$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}

This will connect to www.example.com, send a request, then fetch and echo the response in 128-byte chunks. You may want to make it more than 128 bytes.

How is this transferring a file? Especially if the file is outside the webroot, that solution will not work.

Correct me if I'm wrong, but using raw sockets forces you to implement yourself every HTTP feature you run into, such as redirections, compression, encryption, and chunked encoding. It might work in specific scenarios, but it isn't the best general-purpose solution.


And it works very well for me:

class VideoStream
{
    private $path = "";
    private $stream = "";
    private $buffer = 102400;
    private $start = -1;
    private $end = -1;
    private $size = 0;

    function __construct($filePath)
    {
        $this->path = $filePath;
    }

    /**
     * Open stream
     */
    private function open()
    {
        if (!($this->stream = fopen($this->path, 'rb'))) {
            die('Could not open stream for reading');
        }
    }

    /**
     * Set proper header to serve the video content
     */
    private function setHeader()
    {
        ob_get_clean();
        header("Content-Type: video/mp4");
        header("Cache-Control: max-age=2592000, public");
        header("Expires: " . gmdate('D, d M Y H:i:s', time() + 2592000) . ' GMT');
        header("Last-Modified: " . gmdate('D, d M Y H:i:s', @filemtime($this->path)) . ' GMT');
        $this->start = 0;
        $this->size = filesize($this->path);
        $this->end = $this->size - 1;
        header("Accept-Ranges: 0-" . $this->end);

        if (isset($_SERVER['HTTP_RANGE'])) {
            $c_start = $this->start;
            $c_end = $this->end;

            list(, $range) = explode('=', $_SERVER['HTTP_RANGE'], 2);
            if (strpos($range, ',') !== false) {
                header('HTTP/1.1 416 Requested Range Not Satisfiable');
                header("Content-Range: bytes $this->start-$this->end/$this->size");
                exit;
            }
            if ($range == '-') {
                $c_start = $this->size - substr($range, 1);
            } else {
                $range = explode('-', $range);
                $c_start = $range[0];
                $c_end = (isset($range[1]) && is_numeric($range[1])) ? $range[1] : $c_end;
            }
            $c_end = ($c_end > $this->end) ? $this->end : $c_end;
            if ($c_start > $c_end || $c_start > $this->size - 1 || $c_end >= $this->size) {
                header('HTTP/1.1 416 Requested Range Not Satisfiable');
                header("Content-Range: bytes $this->start-$this->end/$this->size");
                exit;
            }
            $this->start = $c_start;
            $this->end = $c_end;
            $length = $this->end - $this->start + 1;
            fseek($this->stream, $this->start);
            header('HTTP/1.1 206 Partial Content');
            header("Content-Length: " . $length);
            header("Content-Range: bytes $this->start-$this->end/" . $this->size);
        } else {
            header("Content-Length: " . $this->size);
        }
    }

    /**
     * Close currently opened stream
     */
    private function end()
    {
        fclose($this->stream);
        exit;
    }

    /**
     * Perform the streaming of calculated range
     */
    private function stream()
    {
        $i = $this->start;
        set_time_limit(0);
        while (!feof($this->stream) && $i <= $this->end) {
            $bytesToRead = $this->buffer;
            if (($i + $bytesToRead) > $this->end) {
                $bytesToRead = $this->end - $i + 1;
            }
            $data = fread($this->stream, $bytesToRead);
            echo $data;
            flush();
            $i += $bytesToRead;
        }
    }

    /**
     * Start streaming video content
     */
    function start()
    {
        $this->open();
        $this->setHeader();
        $this->stream();
        $this->end();
    }
}

To use this class, you only have to write simple code like the following:

$stream = new VideoStream($filePath); $stream->start(); 


Uploading a file larger than 2GB using PHP

I’m trying to upload a file larger than 2GB to a local PHP 5.3.4 server. I’ve set the following server variables:

memory_limit = -1
post_max_size = 9G
upload_max_filesize = 5G

However, when I try to upload the file I still get:

PHP Warning: POST Content-Length of 2120909412 bytes exceeds the limit of 1073741824 bytes in Unknown on line 0

HTTP is really not the right choice of protocol for uploading a 2GB file. You should be using (S)FTP for this.

Have you verified those are the variables in use? (i.e., through phpinfo()) PHP never stops surprising me about which config file it is actually reading. (Also, HTTP is so not meant for this.)

Why are you wanting to use PHP for this instead of FTP or any number of other ways of uploading files?

Basically I’m using some software written in Adobe AIR to upload a file. AIR is sending the file to the PHP server which uploads and allocates the file to a record. Is there no way to achieve this in PHP?

@Thanatos: Yeah I can see the variables in phpinfo() and they are correct. If I set the limit to >10G, the error states the limit is a negative number instead.

6 Answers

I had a similar problem, but my config was:

post_max_size = 1.8G
upload_max_filesize = 1.8G

and yet I could not upload a 1.2GB file. The error was the very same:

PHP Warning: POST Content-Length of 1347484420 bytes exceeds the limit of 1073741824 bytes in Unknown on line 0 

I spent a day wondering where the heck this "limit of 1073741824" was coming from!

Actually, the error was in the php.ini parser: it only understands INTEGER numbers, so essentially it was parsing 1.8G as 1G!

Changing the value to e.g. 1800M fixed it.

Please make sure to restart the Apache server afterwards: service apache2 restart


I spent hours trying to find out what the issue was with an ownCloud install when uploading big files, and this turned out to be it.

I don't know about 5.3.x, but in 5.2.x there are some int/long issues in the PHP code. Even if you're on a 64-bit system with a 64-bit build of PHP, there are several problems.

First, the code that converts post_max_size and the other settings from ASCII to integer stores the value in an int, so converting "9G" and putting the result into that int will bork the value, because 9G is a larger number than a 32-bit variable can hold.
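To illustrate the scale of the overflow (this is a simplified stand-in, not PHP's actual parser): 9G is 9663676416 bytes, well past the 32-bit PHP_INT_MAX of 2147483647, so truncating it to a 32-bit int garbles the configured limit.

```php
<?php
// Illustration only: convert an ini shorthand value like "9G" to bytes.
// On a 32-bit build the result overflows; on 64-bit it fits.
function shorthand_to_bytes(string $value): int
{
    $num = (int) $value; // note: "1.8G" parses as 1, as the other answer found
    switch (strtoupper(substr($value, -1))) {
        case 'G': $num *= 1024; // fall through
        case 'M': $num *= 1024; // fall through
        case 'K': $num *= 1024;
    }
    return $num;
}
```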

But there are also several other areas of the PHP code, used by the Apache module, CGI, etc., that need to be changed from int to long.

So, for this to work, you need to edit the PHP code and compile it by hand (make sure you compile it as 64-bit). Here's a link to a list of diffs:

The file above is a diff against the 5.2.10 code, but I just made the changes by hand to the 5.2.17 code and uploaded a 3.4GB single file through Apache/PHP (which hadn't worked before the change).


Serving files efficiently with PHP

1. Use readfile()

This method is good because it works out of the box. You only need to write your own file-sending function (a slightly modified example from the official documentation):

function file_force_download($file) {
    if (file_exists($file)) {
        // Reset PHP's output buffer to avoid exhausting the memory allocated to the script.
        // If we don't, the file will be read into memory in full!
        if (ob_get_level()) {
            ob_end_clean();
        }
        // Force the browser to show a save-file dialog
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename=' . basename($file));
        header('Content-Transfer-Encoding: binary');
        header('Expires: 0');
        header('Cache-Control: must-revalidate');
        header('Pragma: public');
        header('Content-Length: ' . filesize($file));
        // Read the file and send it to the user
        readfile($file);
        exit;
    }
}

This way you can send even large files, since PHP will read the file and hand it to the user piece by piece. The documentation clearly states that readfile() should not create memory problems.

  • The script waits until the entire file has been read and sent to the user.
  • The file is read through readfile()'s internal buffer, which is 8 KB in size (thanks, 2fast4rabbit)

2. Read and send the file manually

This is the method Drupal uses when sending files from the private file system (files that are not directly accessible by URL):

function file_force_download($file) {
    if (file_exists($file)) {
        // Reset PHP's output buffer to avoid exhausting the memory allocated to the script.
        // If we don't, the file will be read into memory in full!
        if (ob_get_level()) {
            ob_end_clean();
        }
        // Force the browser to show a save-file dialog
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename=' . basename($file));
        header('Content-Transfer-Encoding: binary');
        header('Expires: 0');
        header('Cache-Control: must-revalidate');
        header('Pragma: public');
        header('Content-Length: ' . filesize($file));
        // Read the file and send it to the user piece by piece
        if ($fd = fopen($file, 'rb')) {
            while (!feof($fd)) {
                print fread($fd, 1024);
            }
            fclose($fd);
        }
        exit;
    }
}
  • The script waits until the entire file has been read and sent to the user.
  • Saves server memory

3. Use a web server module

3a. Apache

The XSendFile module lets you hand the actual file transfer over to Apache itself via a special header. Versions exist for Unix and Windows, for Apache 2.0.*, 2.2.* and 2.4.*.

In the virtual host settings, you need to enable interception of the header with a directive:

You can also specify a whitelist of directories whose files may be served. Important: if your server runs on Windows, the path must include the drive letter in upper case.
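An illustrative configuration for both directives (the path is an example):

```apache
# Enable interception of the X-Sendfile header
XSendFile On
# Whitelist of directories the module may serve files from
# (on Windows, start the path with an upper-case drive letter, e.g. C:/files)
XSendFilePath /var/www/protected
```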


The available options are described on the developer's site: https://tn123.org/mod_xsendfile/

function file_force_download($file) {
    if (file_exists($file)) {
        header('X-SendFile: ' . realpath($file));
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename=' . basename($file));
        exit;
    }
}
3b. Nginx

Nginx can send files out of the box via a special header.

For this to work correctly, direct access to the directory must be denied in the configuration file:
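An illustrative location block (the path matches the example that follows):

```nginx
# Deny direct access: this location is only reachable through
# an X-Accel-Redirect header set by PHP; direct requests get a 404.
location /some/path/protected/ {
    internal;
}
```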

An example of sending a file (the file must be located in the /some/path/protected directory):

function file_force_download($file) {
    if (file_exists($file)) {
        header('X-Accel-Redirect: ' . $file);
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename=' . basename($file));
        exit;
    }
}

More information is available on the official documentation page.

  • The script finishes as soon as all its instructions have executed
  • The file is physically sent by the web server module itself, not by PHP
  • Minimal memory and server resource consumption
  • Maximum throughput

Update: Habr user ilyaplot gives good advice: rather than application/octet-stream, send the file's real MIME type. For example, this lets the browser offer the right programs in the save-file dialog.
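A minimal sketch of that suggestion, assuming the fileinfo extension is available (the helper name is hypothetical):

```php
<?php
// Sketch: detect the file's real MIME type, falling back to
// application/octet-stream when detection is unavailable or fails.
function detect_mime(string $path): string
{
    $mime = function_exists('mime_content_type') ? mime_content_type($path) : false;
    return $mime !== false ? $mime : 'application/octet-stream';
}
```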


How to upload large files above 500MB in PHP [duplicate]

I made an upload page in PHP, but I don't know why the page won't upload documents larger than 500MB. This is my first time trying to upload something this large. I changed all the configuration in php.ini (post_max_size = 700M, upload_max_filesize = 600M, and max_execution_time = 300). The code for the upload is as below:

@all, sorry, my server was down. The error returned is "undefined index name", as if the file upload input field does not exist; sometimes it says "execution timeout" instead. It's frustrating me.

Execution timeout would make sense on move_uploaded_file due to the copy operation it does. As for the timeout, fear not: transfer of the data to the server is NOT included in the timeout. By the time PHP starts, your request has already arrived.

3 Answers

Do you think increasing the upload size limit will solve the problem? What happens when someone uploads a 2GB file? Do you take into consideration the memory usage of such a script?

By configuration, PHP only allows to upload files up to a certain size. There are lots of articles around the web that explain how to modify this limit. Below are a few of them:

For instance, you can edit your php.ini file and set:

memory_limit = 32M
upload_max_filesize = 24M
post_max_size = 32M

You will then need to restart Apache.

Note: that being said, uploading large files like that is not very reliable. Errors can occur. You may want to split the files and include some additional data for error correction. One way to do that is to use par recovery files. You can then check the files after upload using the par command-line utility on Unix-like systems.

I assume you mean that you are transferring the files via HTTP. While not quite as bad as FTP, it's not a good idea if you can find another way of solving the problem. HTTP (and hence the components built on it) is optimized around transferring relatively small files around the internet.

While the protocol supports server-to-client range requests, it does not allow the reverse operation. Even if the software at either end were unaffected by the volume, the more data you push across, the greater the interval during which you could lose the connection. But the biggest problem is the caveat in the last sentence.

