
I am using the script below to select all rows in a model and create new rows in a different model. However, it only picks up the rows loaded on the initial page load, and I need every row the model's conditions would return. Is there a way to do that without setting the "Mass number of records" field to blank?

For example, with "Mass number of records" set to 20, the script should still pull in all the records the model would have based on its conditions. If I leave "Mass number of records" blank, we run into heap size errors.

Script:


var $ = skuid.$;
var models = skuid.model.map();
var inventoryLocation = models.LocationInventoryPosition;
var cycleCountLines = models.CycleCountLines;
$.each(inventoryLocation.data, function(){
    var row = cycleCountLines.createRow({
        additionalConditions: [
            { field: 'SCMC__Item_Master__c', value: this.SCMC__Item_Master__r.Id},
            { field: 'SCMC__Inventory_Location__c', value: this.SCMC__Bin__r.Id},
            { field: 'SCMC__Status__c', value: "New"},
            { field: 'SAC_Inventory_Position__c', value: this.Id},
            { field: 'SCMC__Inventory_Quantity__c', value: this.QuantityfromAgg}
        ], doAppend: true
    });
});

Hey Tami,
It may not be a "clean" solution, but you can loop the model.loadNextOffsetPage() function while model.canRetrieveMoreRows() returns true. That way you can load all the rows into the model before launching your script, without running into the heap size problem on page load. Just keep in mind that this can cause very long loading times depending on the size of your model.
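If it helps, here is a minimal sketch of that loop. It assumes loadNextOffsetPage() accepts a completion callback and that canRetrieveMoreRows is exposed as a boolean on the model (depending on your Skuid version it may be a property or a function, so check the API docs); the loadRemaining helper is only illustrative.

var model = skuid.$M('LocationInventoryPosition');

// Recursively request the next offset page until the model reports
// that no more rows can be retrieved.
function loadRemaining(done) {
    if (model.canRetrieveMoreRows) {
        model.loadNextOffsetPage(function () {
            loadRemaining(done);
        });
    } else {
        done();
    }
}

loadRemaining(function () {
    console.log('Loaded ' + model.data.length + ' rows in total.');
});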


This is what I would suggest as well. Additionally, an option to load rows in groups on page load would be useful, e.g. 200 at a time until everything is loaded. The catch is when you plan to edit or create a very large number of rows: you could run into problems saving the records once the count approaches 1,000 or more.


Thanks for pointing me in this direction. The article mentions a loadAllRemainingRecords() call. I added it to the top of my script, and also tried running it as a separate script before the script that creates the lines in the other model, but it is not working as expected. Each script works as expected on its own, just not together.

I need all the rows to load first, and then every row in the model to be copied into the new model.

skuid.$.blockUI({ message: 'Loading all available Accounts…' });
skuid.$M("LocationInventoryPosition").loadAllRemainingRecords({
   stepCallback: function(offsetStart, offsetEnd) {
      skuid.$.blockUI({ message: 'Loading Records ' + offsetStart + ' to ' + offsetEnd + '…' });
   },
   finishCallback: function(totalRecordsRetrieved) {
      skuid.$.blockUI({ message: 'Finished loading all ' + totalRecordsRetrieved + ' Accounts!', timeout: 2000 });
   }
});
var $ = skuid.$;
var models = skuid.model.map();
var inventoryLocation = models.LocationInventoryPosition;
var cycleCountLines = models.CycleCountLines;


$.each(inventoryLocation.data, function(){
    var row = cycleCountLines.createRow({
        additionalConditions: [
            { field: 'SCMC__Item_Master__c', value: this.SCMC__Item_Master__r.Id},
            { field: 'SCMC__Inventory_Location__c', value: this.SCMC__Bin__r.Id},
            { field: 'SCMC__Status__c', value: "New"},
            { field: 'SAC_Inventory_Position__c', value: this.Id},
            { field: 'SCMC__Inventory_Quantity__c', value: this.QuantityfromAgg}
        ], doAppend: true
    });
});
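One possible reason the two parts work individually but not together is that loadAllRemainingRecords appears to run asynchronously, so the row-creation loop starts before all pages have loaded. A minimal sketch of combining them, assuming the finishCallback only fires after the last page has been retrieved (field names copied from the scripts above):

var $ = skuid.$;
var models = skuid.model.map();
var inventoryLocation = models.LocationInventoryPosition;
var cycleCountLines = models.CycleCountLines;

skuid.$.blockUI({ message: 'Loading all available records…' });

inventoryLocation.loadAllRemainingRecords({
    stepCallback: function (offsetStart, offsetEnd) {
        skuid.$.blockUI({ message: 'Loading Records ' + offsetStart + ' to ' + offsetEnd + '…' });
    },
    finishCallback: function (totalRecordsRetrieved) {
        // All rows are now in the model, so it is safe to copy them.
        $.each(inventoryLocation.data, function () {
            cycleCountLines.createRow({
                additionalConditions: [
                    { field: 'SCMC__Item_Master__c', value: this.SCMC__Item_Master__r.Id },
                    { field: 'SCMC__Inventory_Location__c', value: this.SCMC__Bin__r.Id },
                    { field: 'SCMC__Status__c', value: 'New' },
                    { field: 'SAC_Inventory_Position__c', value: this.Id },
                    { field: 'SCMC__Inventory_Quantity__c', value: this.QuantityfromAgg }
                ],
                doAppend: true
            });
        });
        skuid.$.blockUI({
            message: 'Finished loading ' + totalRecordsRetrieved + ' records and creating lines.',
            timeout: 2000
        });
    }
});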


Have you tried breaking both the load and the save into pieces? I assume you may run into saving/loading errors if the data size gets too big. Maybe try loading all the data in steps of 200 records and, once everything is loaded, saving the rows in the new model in steps of 200 as well.

Since both scripts work separately, I assume they just can't handle the sheer size.
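For what it's worth, a rough sketch of that batching idea, assuming cycleCountLines.save() accepts an options object with a callback that fires once the save round trip completes (verify against your Skuid version); the 200-row batch size and the processBatch helper are only illustrative:

var $ = skuid.$;
var models = skuid.model.map();
var inventoryLocation = models.LocationInventoryPosition;
var cycleCountLines = models.CycleCountLines;
var BATCH_SIZE = 200;
var sourceRows = inventoryLocation.data;

// Create and save the new rows in batches so a single save never has to
// handle the whole data set at once.
function processBatch(startIndex) {
    if (startIndex >= sourceRows.length) {
        console.log('Created and saved ' + sourceRows.length + ' rows.');
        return;
    }
    $.each(sourceRows.slice(startIndex, startIndex + BATCH_SIZE), function () {
        cycleCountLines.createRow({
            additionalConditions: [
                { field: 'SCMC__Item_Master__c', value: this.SCMC__Item_Master__r.Id },
                { field: 'SCMC__Inventory_Location__c', value: this.SCMC__Bin__r.Id },
                { field: 'SCMC__Status__c', value: 'New' },
                { field: 'SAC_Inventory_Position__c', value: this.Id },
                { field: 'SCMC__Inventory_Quantity__c', value: this.QuantityfromAgg }
            ],
            doAppend: true
        });
    });
    // Save this batch, then move on to the next one in the callback.
    cycleCountLines.save({
        callback: function (result) {
            processBatch(startIndex + BATCH_SIZE);
        }
    });
}

processBatch(0);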


Thanks! I will give it a try.