The behaviour described below may depend on the system PHP runs on. Our platform was Intel with Debian 3.0 Linux.
If you pass a large amount of data (roughly more than 10 KB) to the application you run, and the application for example echoes it directly to stdout (without buffering the input), you will get a deadlock. This is because there are size-limited buffers (the so-called pipes) between PHP and the application you run. The application writes data into the stdout buffer until it is full, then blocks waiting for PHP to read from the stdout buffer. In the meantime PHP has filled the stdin buffer and waits for the application to read from it. That is the deadlock.
A solution to this problem is to set the stdout stream to non-blocking mode (stream_set_blocking) and alternately write to stdin and read from stdout; see the sketch after the example below.
Just imagine the following example:
<?php
/* Assume that strlen($in) is about 30k. */

$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "/tmp/error-output.txt", "a")
);

$process = proc_open("cat", $descriptorspec, $pipes);

if (is_resource($process)) {
    fwrite($pipes[0], $in);
    /* fwrite writes to stdin; 'cat' immediately copies the data from stdin
     * to stdout and blocks when the stdout buffer is full. It then stops
     * reading from stdin, and PHP blocks here.
     */
    fclose($pipes[0]);

    $out = "";
    while (!feof($pipes[1])) {
        $out .= fgets($pipes[1], 1024);
    }
    fclose($pipes[1]);

    $return_value = proc_close($process);
}
?>
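
A minimal sketch of the alternating approach might look like this. It reuses the same "cat" command and $descriptorspec from the example above; the 8192-byte chunk size is an arbitrary choice, not a requirement:

<?php
/* Sketch of the non-blocking workaround: write to stdin and drain stdout
 * in turns, so neither pipe buffer can fill up and stall the other side.
 * Assumes $in holds the input data and $descriptorspec is defined as above.
 */
$process = proc_open("cat", $descriptorspec, $pipes);

if (is_resource($process)) {
    stream_set_blocking($pipes[1], false); // don't block when stdout is empty

    $out     = "";
    $written = 0;
    $total   = strlen($in);

    while ($written < $total) {
        // Write the next chunk of input to the child's stdin ...
        $written += fwrite($pipes[0], substr($in, $written, 8192));

        // ... then drain whatever the child has produced so far,
        // so its stdout pipe buffer never fills up.
        while (($data = fread($pipes[1], 8192)) !== false && $data !== "") {
            $out .= $data;
        }
    }
    fclose($pipes[0]);

    // Input is fully written; switch back to blocking mode and collect
    // the remaining output until the child closes its stdout.
    stream_set_blocking($pipes[1], true);
    while (!feof($pipes[1])) {
        $out .= fread($pipes[1], 8192);
    }
    fclose($pipes[1]);

    $return_value = proc_close($process);
}
?>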