[Vtigercrm-developers] batch update task eats up memory

Alan Lord alanslists at gmail.com
Fri Oct 23 06:45:55 GMT 2020


A few ideas.

1. Do not instantiate a new db object on every iteration of the loop! 
Move $db = PearDatabase::getInstance(); to before the foreach loop. You 
only need one instance ;-) (there's a rough sketch of this below).

2. If it's still a problem, write a wrapper in bash which runs the php 
file repeatedly and passes in a counter, so you process the rows in 
blocks of, say, 1000 and put a LIMIT/OFFSET on your query. Each block 
then runs in a fresh PHP process that exits when it's done, so memory 
can't build up across the whole run (second sketch below).

3. Use a common db library/interface (one that opens handles to both 
databases) and then change your query so that you do a single UPDATE 
with a JOIN against the other database's data, rather than one UPDATE 
per row (third sketch below).
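
A rough sketch of idea 1. It assumes the usual vtiger bootstrap has 
already run and that $rows already holds the data pulled from SQL 
Server; the table/column names are only placeholders, since your 
actual code was posted as a screenshot:

$db = PearDatabase::getInstance();   // one shared handle, created once

foreach ($rows as $row) {
    // re-use the same handle instead of calling getInstance() per row
    $db->pquery(
        'UPDATE vtiger_account SET phone = ? WHERE account_no = ?',
        array($row['phone'], $row['account_no'])
    );
}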
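
For idea 2, a minimal sketch of the chunked runner. The file name 
update_accounts.php and the fetchSourceRows() helper are made up for 
illustration:

// update_accounts.php - does one block per invocation. A bash wrapper
// restarts it for every block, e.g.:
//   for offset in $(seq 0 1000 32000); do
//       php update_accounts.php "$offset" 1000
//   done
// Each block is a fresh PHP process, so nothing can accumulate across
// the whole ~32k-row run.

$offset = isset($argv[1]) ? max(0, (int) $argv[1]) : 0;
$limit  = isset($argv[2]) ? max(1, (int) $argv[2]) : 1000;

$db = PearDatabase::getInstance();

// fetchSourceRows() would apply the offset/limit to the SQL Server
// side, e.g. "... ORDER BY account_no
//             OFFSET $offset ROWS FETCH NEXT $limit ROWS ONLY"
foreach (fetchSourceRows($offset, $limit) as $row) {
    $db->pquery(
        'UPDATE vtiger_account SET phone = ? WHERE account_no = ?',
        array($row['phone'], $row['account_no'])
    );
}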
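
And a sketch of idea 3, assuming the SQL Server rows have first been 
copied into a staging table (import_accounts, a made-up name) in the 
same MySQL database as vtiger. One set-based statement then replaces 
the whole per-row loop:

$db = PearDatabase::getInstance();
$db->pquery(
    'UPDATE vtiger_account a
       JOIN import_accounts i ON i.account_no = a.account_no
        SET a.phone = i.phone',
    array()
);

Writing to vtiger_account directly like this bypasses vtiger's 
save/workflow layer, but that's presumably the same trade-off your 
current script already makes, given it goes through PearDatabase.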

HTH

Al


On 23/10/2020 04:32, Rubén A. Estrada Orozco wrote:
> Hi everyone,
> 
> So I have a script that does something like this:
> 
> [inline image: the simplified script]
> It gets data from an SQL Server database and updates accounts' fields in 
> vtiger.
> It's a greatly simplified version, but the essence is that of the above 
> code.
> 
> I execute it from the command line, but after some time I get an 
> "Allowed memory size of 92274688 bytes exhausted" error.
> In this example I use only 5000 records instead of the actual ~32k I 
> need to update. And I lowered PHP's memory_limit to 88MB from 1GB to be 
> able to get to the error quickly.
> 
> What you can see in the following screenshots is that memory usage keeps 
> increasing until all available memory is exhausted. Why would that 
> happen? I thought it should remain constant. It's as if PearDatabase 
> was eating up the memory.
> 
> [inline screenshots: memory usage climbing until the limit is hit]
> 
> This is supposed to run a few times a day and it needs to be efficient.
> 
> Any thoughts?
> 
> Saludos
> 
> Rubén
> 

