
This is an odd one. I have a snippet that queries Tasks and Events so they can eventually be exported as one file.

Everything seems to be working until the second time the loadmore function runs on TaskExport, at which point 3,000 records are loaded even though there are many more. The model debug property reads “Pagination limit reached” and no more records can be loaded. Is this a Skuid or Salesforce thing? Is there a suggested workaround?

The last SOQL statement for TaskExport:

SELECT — fields — FROM Task WHERE (Interaction_Type_Bucket__c in ('Hubspot')) AND (Start_Date__c != null) LIMIT 1001 OFFSET 2000
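From that query it looks like the model loads in batches of 1,000, with LIMIT set to the batch size plus one (presumably so Skuid can tell whether more rows remain), so the loads that succeed would be roughly:

SELECT ... FROM Task WHERE ... LIMIT 1001
SELECT ... FROM Task WHERE ... LIMIT 1001 OFFSET 1000
SELECT ... FROM Task WHERE ... LIMIT 1001 OFFSET 2000

The next page would need OFFSET 3000, which is where it stops.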

Here’s the snippet. 


skuid.snippet.register('exportActivities', function(args) {
    var params = arguments[0],
        TaskExport = skuid.$M('TaskExport'),
        EventExport = skuid.$M('EventExport'),
        ActivitiesExport = skuid.$M('ActivitiesExport'),
        rows = [],
        activityModels = [TaskExport, EventExport],
        inputfilename = params.$Input.Filename,
        chartId = params.$Input.chartId,
        condition,
        $ = skuid.$;
    var dfd = $.Deferred();

    function loadmore(model) {
        model.loadNextOffsetPage(function() {
            continueLoadingOrNot(model);
        });
    }

    function continueLoadingOrNot(model) {
        if (model.canRetrieveMoreRows) {
            loadmore(model);
        } else if (!TaskExport.canRetrieveMoreRows && EventExport.canRetrieveMoreRows) {
            loadmore(EventExport);
        } else if (!TaskExport.canRetrieveMoreRows && !EventExport.canRetrieveMoreRows) {
            rows = TaskExport.data.concat(EventExport.data);
            ActivitiesExport.adoptRows(rows);
            ActivitiesExport.sortData('StartDate');
            ActivitiesExport.exportData({
                fileName: inputfilename,
                doNotAppendRowIdColumn: true
            });
            $.unblockUI();
            dfd.resolve();
        }
    }

    function failedLoad() {
        $.blockUI({ message: 'Something went wrong', timeout: 3000 });
        dfd.reject();
    }

    function setCondition(model, conditionName, {value, values}) {
        condition = model.getConditionByName(conditionName);
        if (condition.operator == 'in') {
            model.setCondition(condition, values);
        } else {
            model.setCondition(condition, value);
        }
        model.activateCondition(condition);
    }

    // deactivate other export conditions
    $.each(activityModels, function(m, model) {
        $.each(model.conditions, function(c, condition) {
            if (condition.name && !condition.name.startsWith("master_") && !condition.name.startsWith("activity_master_")) {
                model.deactivateCondition(condition);
            }
        });
    });

    // set activities based on specific tab
    if (chartId == 'sales_all_monthly_by_bucket') {
        $.each(activityModels, function(m, model) {
            setCondition(model, 'Interaction_Type_Bucket__c', {values: ['Hubspot']});
        });
    }

    $.when(skuid.model.updateData(activityModels))
        .done(function() {
            // dfd.resolve();
            continueLoadingOrNot(TaskExport);
        })
        .fail(function() {
            failedLoad();
        });

    return dfd.promise();
});

Oy!

Seems like there’s a pagination limit of 2,000 (the maximum SOQL OFFSET).

So that basically means I can get a max of 4,000 records, so long as loading that many doesn’t cause an Apex heap size error. In my case it does. 😕

My workaround is to use the following as a framework in my snippet.

Workaround for offset 2000 limit on SOQL query
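For anyone who doesn’t want to click through: the usual shape of this kind of workaround is to page on an indexed field such as Id instead of using OFFSET. Below is a rough sketch of that general pattern, not the exact code from the linked post; the id_cursor condition name and the assumption that the model is ordered by Id ascending are mine, not from the original snippet.

// Rough sketch: page on Id instead of OFFSET.
// Assumes the model is ordered by Id ascending and has a deactivated
// condition named 'id_cursor' (hypothetical name) on Id with a
// "greater than" operator.
var $ = skuid.$;

function loadAllRows(model, collected, dfd) {
    // Collect whatever batch is currently in the model.
    collected = collected.concat(model.data);
    if (!model.canRetrieveMoreRows) {
        dfd.resolve(collected);
        return;
    }
    // Move the cursor past the last Id we have seen, then re-query.
    var lastRow = collected[collected.length - 1],
        cursor = model.getConditionByName('id_cursor');
    model.setCondition(cursor, lastRow.Id);
    model.activateCondition(cursor);
    $.when(skuid.model.updateData([model]))
        .done(function () { loadAllRows(model, collected, dfd); })
        .fail(function () { dfd.reject(); });
}

// Usage: after the model's initial load.
var dfd = $.Deferred();
loadAllRows(skuid.$M('TaskExport'), [], dfd);
dfd.done(function (allRows) {
    // allRows now holds every matching Task row, with no OFFSET involved.
});

Each re-query replaces the model’s rows rather than appending to them, so the batches are accumulated in the collected array and can be handed to the export model at the end, the same way the original snippet concatenates TaskExport.data and EventExport.data.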

