The JSON files PHP needs to read are quite large — what is the best way to read multiple large JSON files with PHP?

I am writing a PHP script (to be run from the command line) to parse hundreds of large JSON files, all of which live in one directory. Initially I was reading the files one by one and parsing them in the same script, but I quickly ran out of memory. The alternative is to use two scripts: one that reads the directory, builds the list of files, and invokes a second script with each file name as an argument for parsing. Is there any other way to do it?


Also, is there any way to parallelize this?

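For reference, here is roughly what the two-script version I have in mind would look like; dispatch.php, worker.php and the directory path are just placeholder names.

<?php
// dispatch.php (placeholder name): list the JSON files and hand each one
// to a separate PHP process, so every parse starts with fresh memory.
$dir = $argv[1] ?? './json-files';

foreach (glob($dir . '/*.json') ?: [] as $file) {
    // The worker exits after one file, so all of its memory is released.
    passthru('php worker.php ' . escapeshellarg($file), $exitCode);
    if ($exitCode !== 0) {
        fwrite(STDERR, "worker failed on $file\n");
    }
}

<?php
// worker.php (placeholder name): parse the single file passed as an argument.
$data = json_decode(file_get_contents($argv[1]), true);
// ... per-file processing of $data goes here ...

To parallelize, I could presumably skip the dispatcher loop and drive the same worker with something like: ls ./json-files/*.json | xargs -n 1 -P 4 php worker.php (running up to four workers at a time).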

1 Answer

#1


Try unsetting variables after you are done with them; that should free the memory allocated to those variables.


Edit: better yet, from what I've read, assigning null to those variables frees the memory faster and more efficiently:


$myNoLongerUsedVar = null;
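For example, inside a loop over the files it could look roughly like this (the directory path and processFile() are placeholders for whatever your script actually does with the data):

foreach (glob('./json-files/*.json') ?: [] as $file) {
    $raw  = file_get_contents($file);
    $data = json_decode($raw, true);

    processFile($data); // your own handling of the decoded data

    // Drop the references before the next iteration so the large
    // string and array can be reclaimed.
    $raw  = null;
    $data = null;
}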
