cron - Processing a large amount of data using jobs in PHP -


We have a website that has 'projects' on it, with a certain number of people subscribed to each project.

When a project 'ends', all of its subscribers are gathered up and processed. In this case there are about 1,000 subscribers whose data needs to be pulled and processed, and each subscriber record also has some related data stored with it.

The last time we processed a large batch, PHP ran out of memory at around 300 items; we increased the memory limit and it managed to finish. I don't think that will be enough this time.

We currently use a job that pulls in the project in order to process its subscribers. In this job, an SQL query is executed to pull all of the 'subscribers' and store them, along with their related data, in an array.

My question is:

Is there a way to do this in 'chunks' or something similar? Or a better approach that reduces the impact on memory?

The current flow is:

  • The project ends.
  • An array of subscriber data is fetched from MySQL, and a separate job is created for each subscriber.
  • Each subscriber job is processed by the engine.
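One way to cut the memory footprint of the second step is to fetch the subscribers in fixed-size slices instead of one giant array, creating the per-subscriber jobs slice by slice. A minimal sketch, assuming a PDO connection, a hypothetical `subscribers` table with an auto-increment `id`, and a hypothetical `createSubscriberJob()` helper for your queue:

```php
<?php
// Hypothetical chunked dispatch: only one slice of subscribers is held
// in memory at a time, instead of all ~1,000 rows at once.
const CHUNK_SIZE = 100;

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$projectId = 42; // the project that just ended (assumed)
$lastId = 0;

do {
    $stmt = $pdo->prepare(
        'SELECT id, email FROM subscribers
         WHERE project_id = :project AND id > :last
         ORDER BY id
         LIMIT ' . CHUNK_SIZE
    );
    $stmt->execute(['project' => $projectId, 'last' => $lastId]);
    $chunk = $stmt->fetchAll(PDO::FETCH_ASSOC);
    $fetched = count($chunk);

    foreach ($chunk as $subscriber) {
        createSubscriberJob($subscriber); // your job-queue call
        $lastId = (int) $subscriber['id'];
    }

    unset($chunk); // free the slice before fetching the next one
} while ($fetched === CHUNK_SIZE);
```

Keyset pagination (`WHERE id > :last ORDER BY id`) is used rather than `OFFSET`, so each query stays cheap even late in the result set.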

    I'm just having a hard time seeing the best way to approach this.

    I would preferably use the main result set sorted on the int key, and process only N subscribers per run.

    At the end of each partial job, save the last processed ID.

    Finally, check whether the work is done; if it is not, have the script call itself back:

    file_get_contents('http://yourscript.php') !== false;

    (A little overhead, but it protects you from memory leaks.)
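The partial-job pattern described above could be sketched as follows, under the assumption of a hypothetical `subscribers` table with an auto-increment `id`, a small state file holding the last processed ID, and a hypothetical `processSubscriber()` helper:

```php
<?php
// Hypothetical partial-job worker: process BATCH_SIZE subscribers per run,
// remember where we stopped, and relaunch ourselves until the work is done.
// Each relaunch starts a fresh PHP process, so memory cannot accumulate.
const BATCH_SIZE = 100;
const STATE_FILE = __DIR__ . '/last_id.txt';

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$lastId = is_file(STATE_FILE) ? (int) file_get_contents(STATE_FILE) : 0;

// Main result set sorted on the int key; take only the next N rows.
$stmt = $pdo->prepare(
    'SELECT id, email FROM subscribers
     WHERE id > :last
     ORDER BY id
     LIMIT ' . BATCH_SIZE
);
$stmt->execute(['last' => $lastId]);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    processSubscriber($row);      // your per-subscriber work
    $lastId = (int) $row['id'];
}

// Save the last processed ID so the next run resumes after it.
file_put_contents(STATE_FILE, (string) $lastId);

// A full batch means there may be more work left: call the script back.
if (count($rows) === BATCH_SIZE) {
    file_get_contents('http://yourscript.php') !== false;
}
```

The `http://yourscript.php` URL is the placeholder from the answer above; in practice it would point at this same script.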
