I think the power of generators is underestimated here; have a look at my example:
<?php
/**
 * simple example class, just to have something to instantiate
 */
class obj
{
    private $i = 1;
    private $a = [];

    function __construct($i = 1)
    {
        $this->i = $i;
        $this->a = range(0, $i);
    }

    public function getI()
    {
        return $this->i;
    }
    // more getters and setters...
}

/**
 * this is a common way of returning objects in bulk
 * @param int $n
 * @return \obj[]
 */
function returnObjects($n = 1000)
{
    $objs = [];
    for ($i = 1; $i <= $n; $i++) {
        $objs[] = new obj($i);
    }
    return $objs;
}

/**
 * this is a better way using a generator: rather than returning all the
 * objects at once, it yields them one by one (saving the state of the
 * function between calls)
 * @param int $n
 */
function generateObjects($n = 1000)
{
    for ($i = 1; $i <= $n; $i++) {
        /**
         * 'yield' returns the object and saves the state of the function, so
         * the next call resumes from the next loop iteration, and so on...
         */
        yield new obj($i);
    }
}

// main script: record the current peak memory, run one of the functions,
// and calculate the memory used afterwards
$m = memory_get_peak_usage();

/**
 * comment the 'returnObjects()' call below and uncomment 'generateObjects()'
 * if you want to see the generator memory usage
 */
//$objs = returnObjects();

/**
 * comment 'generateObjects()' and uncomment 'returnObjects()' if you
 * want to see the common function's memory usage
 */
$objs = generateObjects();

foreach ($objs as $obj) {
    echo get_class($obj) . ": {$obj->getI()}\n";
}

echo "total memory consumption: " . (memory_get_peak_usage() - $m) . " bytes\n";
?>
What is the outcome? Using returnObjects() we build an array holding all 1000 objects up front, but using generateObjects() only one object exists at a time: yield returns the object (pausing the loop) and also saves the state of the function, so on the next iteration the function resumes rather than restarting. In my environment the peak memory used was about 37 KB with the generator versus about 25 MB with the array.
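To make the "saves the state of the function" point concrete, here is a minimal sketch (the function name countTo is my own, not from the example above) that drives a generator manually with the Generator class methods current() and next() instead of foreach:

```php
<?php
// A generator function: calling it runs no code yet, it just
// returns a Generator object.
function countTo($n)
{
    for ($i = 1; $i <= $n; $i++) {
        yield $i;
    }
}

$gen = countTo(1000000);       // nothing has been iterated yet

echo $gen->current() . "\n";   // 1 - the loop body ran exactly once
$gen->next();                  // resume the function at the next iteration
echo $gen->current() . "\n";   // 2
```

Even though $n is one million, only the iterations you actually request are ever executed, which is exactly why the memory numbers above differ so dramatically.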
Thanks to my dear friend Ivan Frezza who helped me to understand this better!