[Vtigercrm-developers] Time Interval workflow exhausts memory

Rubén A. Estrada Orozco rulotec1 at gmail.com
Fri Feb 12 16:49:31 GMT 2021


I've opened the issue, Uma:
https://code.vtiger.com/vtiger/vtigercrm/issues/1572
However, now that I think about it, I'm not sure a memory-exhaustion
error can be notified. I think that kind of error causes an immediate
crash.

On the other hand, I don't think the db problem is just a matter of
inefficiency, in the sense that more memory is used than necessary per
query. I think there is a memory leak. Please refer to this
discussion:
http://vtiger-crm.2324883.n4.nabble.com/Vtigercrm-developers-batch-update-task-eats-up-memory-td23640.html
where a batch update using the vtws library causes memory usage to grow
gradually until it is exhausted.
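A generic way to confirm that kind of gradual growth, independent of vtiger's code, is to log memory_get_usage() after each batch iteration: if usage climbs steadily instead of plateauing, something is being retained across iterations. A minimal sketch (the per-batch work is simulated here with a retained string, not real vtws calls):

```php
<?php
// Leak-check sketch: report memory in use after each batch of work.
// The 1 MB string retained per iteration simulates a leak; in a real
// test the loop body would be one vtws batch-update call instead.
$retained = [];
for ($i = 1; $i <= 3; $i++) {
    $retained[] = str_repeat('x', 1024 * 1024); // simulate 1 MB leaked
    printf("batch %d: %.1f MB in use\n", $i, memory_get_usage() / 1048576);
}
// A healthy loop shows a flat curve here; a leaking one climbs steadily.
```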

Regards

Rubén


On Fri, Feb 5, 2021 at 6:55 AM Uma S <uma.s at vtiger.com> wrote:

> Hi All,
>
> In this scenario, I agree with Mohan's point about the PearDB data
> fetch.
>
> >>I think the issue is with PearDB and Create/Update API as (if I recall
> correctly) at each new call of the api it would add 8+ mb..
>
> Yes! This could be related to the result-set fetch through PearDatabase:
> each column retrieved appears under two keys in the result row, one an
> integer index and the other the column name. This duplication consumes
> more RAM the more data is retrieved in a single request.
>
> example:
> [fields] => Array
>         (
>             [0] =>
>             [firstname] =>
>             [1] => test again lead
>             [lastname] => test again lead
>             [2] =>
>             [title] =>
>             [3] => 0
>             [accountid] => 0
>             [4] =>
>             [email] =>
>             [5] =>
>             [phone] =>
>             [6] => 1
>             [smownerid] => 1
>             [7] => 6
>             [contactid] => 6
>             [8] => 0
>             [starred] => 0
>         )
>
> A solution would be to fetch data in a single format, either
> integer-indexed or keyed by column name.
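To illustrate the doubling described above, here is a small standalone PHP sketch. Plain arrays stand in for a fetched row, not actual PearDatabase internals: a MYSQLI_BOTH-style row stores every column twice, once under a numeric index and once under the column name, so it holds twice the entries of an associative-only fetch.

```php
<?php
// A row as a dual-keyed (MYSQLI_BOTH-style) fetch returns it:
// every column present twice.
$bothRow = [
    0 => 6,            'contactid' => 6,
    1 => 'test lead',  'lastname'  => 'test lead',
    2 => 1,            'smownerid' => 1,
];

// The same row fetched associatively (MYSQLI_ASSOC): half the entries.
$assocRow = [
    'contactid' => 6,
    'lastname'  => 'test lead',
    'smownerid' => 1,
];

printf("both: %d entries, assoc: %d entries\n",
       count($bothRow), count($assocRow));
// prints "both: 6 entries, assoc: 3 entries"
```

Switching the fetch mode does not halve real memory exactly (PHP can share the value zvals between the two keys), but it does shrink each row's hash table and is the "one format" fix suggested above.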
>
> We will review this further and come up with a solution that improves
> performance.
>
> >>Also, it would be a good idea to send an email to the admin user upon
> vtigercron.php execution failure.
> Rubén, we will look into notifying the user when a cron job fails with
> a memory-exhaustion error. Please file an issue for this.
>
> On Fri, Feb 5, 2021 at 2:38 PM Alan Lord <alanslists at gmail.com> wrote:
>
>> On 05/02/2021 08:50, Sukhdev Mohan wrote:
>> > Increasing memory doesn't solve it at all; I already tried it. At
>> > least in my case it didn't help, and I had to divide and conquer
>> > (hopefully that only has to be done once in a while). I think the
>> > issue is with PearDB and the Create/Update API, as (if I recall
>> > correctly) each new call of the API adds 8+ MB. I can test this
>> > weekend and share more info. If you have any insight or experience,
>> > please do tell.
>>
>> Because I haven't had this particular problem before, I have no
>> experience of it to share.
>>
>> There have been occasions, however, where I have needed to increase
>> RAM or switch to the CLI SAPI (no RAM limit) when doing mass record
>> imports or the like - but I am talking hundreds of thousands or
>> millions of records, and generally that was down to capacity and
>> performance limitations of the hardware. Writing a shell script and
>> iterating over the task in smaller blocks has also been a solution for
>> me. But in Rubén's case this is not possible, as it is a scheduled
>> workflow.
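Where splitting is possible, the divide-and-conquer approach described above can be sketched as a small driver that runs each block in a fresh process, so any memory leaked by the fetch layer is reclaimed when the child exits. Everything here is illustrative: batch_update.php, its flags, and the record counts are hypothetical, not actual vtiger tooling.

```php
<?php
// Chunked-driver sketch: process $total records in blocks of $chunk,
// one child PHP process per block.
$total = 20000; // illustrative total record count
$chunk = 5000;  // records handled per child process

for ($offset = 0; $offset < $total; $offset += $chunk) {
    // Real use would spawn a child process here, e.g.:
    //   shell_exec("php batch_update.php --offset=$offset --limit=$chunk");
    echo "would process records $offset to " . ($offset + $chunk - 1) . "\n";
}
```

Because each block runs in its own process, a leak cannot accumulate across blocks; as noted above, though, this does not help a scheduled workflow that must run inside a single vtigercron.php invocation.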
>>
>>
>> Al
>>
>> _______________________________________________
>> http://www.vtiger.com/
>
>
>
> --
> With
> Best Regards
> Uma.S
> Vtiger Team
