[Vtigercrm-developers] Time Interval workflow exhausts memory

Uma S uma.s at vtiger.com
Fri Feb 5 12:53:22 GMT 2021


Hi All,

In this scenario, I would agree with Mohan's point about the PearDB data
fetch.

>>I think the issue is with PearDB and Create/Update API as (if I recall
correctly) at each new call of the api it would add 8+ mb..

Yes, this could be related to the result set fetch through PearDatabase,
where each value retrieved is stored under two keys in the result:
an integer index and a column-name key. This can consume
more RAM when more data is retrieved in a single request.

Example:
[fields] => Array
        (
            [0] =>
            [firstname] =>
            [1] => test again lead
            [lastname] => test again lead
            [2] =>
            [title] =>
            [3] => 0
            [accountid] => 0
            [4] =>
            [email] =>
            [5] =>
            [phone] =>
            [6] => 1
            [smownerid] => 1
            [7] => 6
            [contactid] => 6
            [8] => 0
            [starred] => 0
        )

The solution would be to fetch the data in a single format, keyed by either
the integer index or the column name, but not both.
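To illustrate the duplication, here is a minimal Python sketch (not vtiger code; the column names are taken from the example array above). It builds one row the way a FETCH_BOTH-style driver does, with every value stored under two keys, and one row with column-name keys only, and shows the key count doubles:

```python
# Sketch of the dual-key result-set layout described above.
# Column names and values are copied from the example array; this is
# an illustration of the memory pattern, not the PearDatabase API.
columns = ["firstname", "lastname", "title", "accountid", "email",
           "phone", "smownerid", "contactid", "starred"]
values = ["", "test again lead", "", 0, "", "", 1, 6, 0]

# FETCH_BOTH style: each value is reachable by integer index AND column name,
# so the row dict holds twice as many entries.
both = {}
for i, (col, val) in enumerate(zip(columns, values)):
    both[i] = val
    both[col] = val

# FETCH_ASSOC style: one key per value.
assoc = dict(zip(columns, values))

print(len(both))   # 18 entries
print(len(assoc))  # 9 entries
```

The values themselves are shared, so the overhead is per-key rather than a full doubling of the data, but across hundreds of thousands of rows in a single cron run the extra entries add up.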

We will be reviewing this further to come up with a solution that improves
performance.

>>Also, it would be a good idea to send an email to the admin user upon
vtigercron.php execution failure.
Ruben, we will look into notifying the admin user when the cron job hits
this memory-exhausted error. Please do file an issue for the same.

On Fri, Feb 5, 2021 at 2:38 PM Alan Lord <alanslists at gmail.com> wrote:

> On 05/02/2021 08:50, Sukhdev Mohan wrote:
> > Increasing memory doesn’t solve at all, already tried it, at least in my
> > case didn’t solve had to divide and rule hopefully it has to be done
> > once in a while). I think the issue is with PearDB and Create/Update API
> > as (if I recall correctly) at each new call of the api it would add 8+
> > mb.. I can test this weekend and share more info. If you have any
> > insight or experiences please do tell.
>
> Because I haven't had this particular problem before I have no
> experience of it to share.
>
> There have been occasions however, where I have found I needed to
> increase RAM or switch to a cli sapi (no RAM limit) when doing mass
> record import or whatnot - but I am talking hundreds of thousands or
> millions of records. And generally this was more down to
> capacity/performance limitations of the h/w. Writing a shell script and
> iterating over the task using smaller blocks has also been a solution
> for me. But in Rubén's case this is not possible as it is a scheduled
> workflow.
>
>
> Al
>
> _______________________________________________
> http://www.vtiger.com/



-- 
With
Best Regards
Uma.S
Vtiger Team