Line By Line Heavy Task

When you get a CSV or TXT file where every line has to be processed by some heavy task, the timeout may thwart your plans. And not only the timeout: the memory limit or the MySQL maximum-queries error can stop you too. Generally, web servers “don’t like” long, heavy processes. It once happened to me that I had to find all users’ images in a huge database by their email addresses, which I got as a CSV list. The web server let me process only two thousand per refresh, and I needed more, much more. So I created this class, which takes an input file and a callback function and runs a limited number of loops. After that it saves the current input file pointer, and on the next refresh it continues reading the input file from the place where it finished before.
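To make the idea concrete, below is a minimal sketch of that save-and-resume mechanism. It is written from scratch for this post and is not the actual internals of the class; the point is just that the current file offset is persisted in a pointer file between runs.

// Process at most $limit lines per run and persist the file offset,
// so the next run continues where this one stopped.
function processChunk($inputFile, $pointerFile, $limit, callable $callback)
{
    $handle = fopen($inputFile, 'r');

    // Restore the offset saved by the previous run, if any.
    if (is_file($pointerFile)) {
        fseek($handle, (int) file_get_contents($pointerFile));
    }

    $count = 0;
    while ($count < $limit && ($line = fgets($handle)) !== false) {
        $callback(rtrim($line, "\r\n"));
        $count++;
    }

    // Remember where we stopped for the next refresh.
    file_put_contents($pointerFile, (string) ftell($handle));
    fclose($handle);

    return $count; // 0 means the whole file has been processed
}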

Using it

Using it is very simple. The one thing it requires is a writable tmp directory for saving a temporary file with the pointer.
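Where exactly the class writes its pointer file depends on its source, so check there; assuming it is a tmp directory next to the script, you can prepare it like this:

// Assumption: the pointer file lives in ./tmp next to the script.
if (!is_dir('tmp')) {
    mkdir('tmp', 0775, true);
}
if (!is_writable('tmp')) {
    die('The tmp directory must be writable.');
}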

require_once 'LineByLineTask.php';

// The callback is called once for every line read from the file.
$task = new Picios\Lib\LineByLineTask('bigfile.csv', function($line, $op) {
	echo "{$line}<br />";
});

// Process at most 4 lines on this run, then save the pointer and stop.
$task->setLimit(4)->run();

In the example above my task is not heavy at all, but it could be file searching, database querying, etc. On every refresh you get the next batch of lines, limited by

$task->setLimit(4);
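For a truly heavy task, the callback can query the database for every line, roughly like the images-by-email case from the introduction. The file name, table and column names below are made up for the example:

require_once 'LineByLineTask.php';

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

// Hypothetical schema: find each user's image by the email read from the CSV.
$stmt = $pdo->prepare('SELECT image_path FROM users WHERE email = ?');

$task = new Picios\Lib\LineByLineTask('emails.csv', function($line, $op) use ($stmt) {
    $stmt->execute([trim($line)]);
    if ($path = $stmt->fetchColumn()) {
        echo "{$line}: {$path}<br />";
    }
});

$task->setLimit(2000)->run();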

You can download the source code from
LineByLineTask on GitHub
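By the way, nothing forces you to refresh the page by hand. The same script can be run from the command line and scheduled with cron, so every run processes the next chunk; a crontab entry like the following (the path is a placeholder) would do it every five minutes:

*/5 * * * * php /path/to/task.php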

Thoughts on “Line By Line Heavy Task”

  • PHPSucker says:

    I used it to convert my database schema; it takes some time for 400k rows, so thanks. BTW, better documentation would be nice.

    • Hi PHPSucker, ok, I’ll improve the documentation. Thanks for your feedback.

  • Gary says:

    Hi, great job. I used your class as a newsletter sender. It works well with a cronjob, thx.

    • Hi Gary, nice to see you found it useful. Sending a newsletter is a typical job for this class.
