PHP: proc_open - Manual

The behaviour described below may depend on the system PHP runs on. Our platform was "Intel with Debian 3.0 Linux".

If you pass a large amount of data (roughly more than 10k) to the application you run, and that application echoes it directly to stdout (without buffering its input), you will get a deadlock. This is because the pipes between PHP and the application are size-limited buffers. The application writes to the stdout pipe until it is full, then blocks waiting for PHP to read from it. Meanwhile, PHP has filled the stdin pipe and is waiting for the application to read from it. That is the deadlock.

One solution is to set the stdout stream to non-blocking (stream_set_blocking) and alternately write to stdin and read from stdout; a sketch of that approach follows the example below.

The following example demonstrates the deadlock:

<?php
/* assume that strlen($in) is about 30k */

$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "/tmp/error-output.txt", "a")
);

$process = proc_open("cat", $descriptorspec, $pipes);

if (is_resource($process)) {
    fwrite($pipes[0], $in);
    /* fwrite writes to stdin; 'cat' immediately copies the data from stdin
     * to stdout and blocks once the stdout buffer is full. It then stops
     * reading from stdin, and PHP blocks here. */
    fclose($pipes[0]);

    $out = "";
    while (!feof($pipes[1])) {
        $out .= fgets($pipes[1], 1024);
    }
    fclose($pipes[1]);

    $return_value = proc_close($process);
}
?>
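For illustration, here is a minimal sketch of the alternating approach mentioned above, assuming the same "cat" command and descriptor spec as the example; the $in test data and the 8192-byte chunk size are arbitrary choices, not part of the original note:

<?php
/* Sketch only: same setup as the example above, but with non-blocking
 * pipes and interleaved writes and reads to avoid the deadlock. */
$in  = str_repeat("some line of data\n", 2000); /* ~36k of test data */
$out = "";

$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "/tmp/error-output.txt", "a")
);

$process = proc_open("cat", $descriptorspec, $pipes);

if (is_resource($process)) {
    /* Make both pipes non-blocking so neither fwrite nor fread can
     * stall while the other buffer fills up. */
    stream_set_blocking($pipes[0], false);
    stream_set_blocking($pipes[1], false);

    $written = 0;
    $total   = strlen($in);

    while ($written < $total) {
        /* Write as much as the stdin pipe will currently accept ... */
        $n = fwrite($pipes[0], substr($in, $written, 8192));
        if ($n !== false) {
            $written += $n;
        }
        /* ... then drain whatever the process has produced so far,
         * so its stdout buffer never fills up and blocks it. */
        $out .= fread($pipes[1], 8192);
    }
    fclose($pipes[0]);

    /* stdin is closed; switch back to blocking and collect the rest. */
    stream_set_blocking($pipes[1], true);
    while (!feof($pipes[1])) {
        $out .= fread($pipes[1], 8192);
    }
    fclose($pipes[1]);

    $return_value = proc_close($process);
}
?>

In real code you would typically use stream_select() on the pipes instead of the busy-waiting this simple loop does, but the structure of the fix is the same: never let either pipe buffer fill up while the other side is blocked.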
