
When saving a model, Skuid already knows which rows need inserts or updates and saves only those rows.

But if enough rows need saving at once, the save can still fail (generally with an Apex CPU time limit error).

We can get around this error by saving the items in the model incrementally in smaller batches.

To do this I made a custom function “modelSaver”

Example usage:

$.blockUI();
let pc = function (fparams) {
console.log(`Saving Rows: ${fparams.nextStart}-${fparams.nextEnd}`);
};
$.when(skuid.custom.modelSaver(modelToSave, { limit: 50, progressCallback: pc })).done(f => {
$.unblockUI();
});
Function definition:

// skuid.custom.modelSaver(model,fparams)
// Saves a model incrementally so as not to overload the save process with too many rows at once
// fparams: object
// {
// limit: Number of rows to limit by. If unspecified will choose
// the model's recordsLimit property or, if that is unspecified, defaults to 100
// progressCallback: function to call before running each individual query
// in the format progressCallback(fparams), fparams is an object
// progressCallback fparams = {
// count: count of rows saved so far
// limit: our limit for how many rows to save per run
// nextStart: the row # of the next row to be saved,
// nextEnd: the last row # to be saved (based on limit)
// }
// }
skuid.custom.modelSaver = function (model, fparams) {
const deferred = $.Deferred();
if (model === undefined) {
deferred.reject('Model undefined');
return deferred.promise();
}
fparams = fparams || {};
const limit = fparams.limit || model.recordsLimit || 100;
const progressCallback = fparams.progressCallback || undefined;
// Set Initial Progress
if (progressCallback !== undefined) {
progressCallback.call(this, {
count: 0,
limit,
nextStart: 1,
nextEnd: limit
});
}
// Back up the changes object from the model and clear it
// We will only save the changes we want to save
model.changesTemp = model.changes;
model.changes = {};
// Run through our recursive function to save all rows
modelSave(model, {
limit: limit,
progressCallback: progressCallback,
promiseResolve: deferred.resolve,
promiseReject: deferred.reject
});
function modelSave (model, fparams) {
fparams = fparams || {};
const limit = fparams.limit || model.recordsLimit || 100;
const count = fparams.count || 0;
const progressCallback = fparams.progressCallback || undefined;
const promiseResolve = fparams.promiseResolve || undefined;
const promiseReject = fparams.promiseReject || undefined;

let localCount = 0;
for (const [key, value] of Object.entries(model.changesTemp)) {
// Only process up to limit
if (localCount >= limit) {
break;
}
// Move from our temp changes back to changes
model.changes[key] = value;
// Make sure the model is set to hasChanged
model.hasChanged = true;
// Delete our temp key
delete model.changesTemp[key];
localCount++;
}
// Save the model
if (Object.keys(model.changes).length > 0) {
$.when(model.save()).done(f => {
if (Object.keys(model.changesTemp).length > 0) {
// If we still have changesTemp
if (progressCallback !== undefined) {
// Call our progressCallback if defined
let localEnd = limit;
if (Object.keys(model.changesTemp).length < localEnd) {
localEnd = Object.keys(model.changesTemp).length;
}
progressCallback.call(this, {
count: count + localCount,
limit: limit,
nextStart: count + localCount + 1,
nextEnd: count + localCount + localEnd
});
}
// Recurse our modelSave
modelSave(model, {
limit: limit,
count: (count + localCount),
progressCallback: progressCallback,
promiseResolve: deferred.resolve,
promiseReject: deferred.reject
});
} else {
// We have no remaining changesTemp, resolve
delete model.changesTemp;
model.hasChanged = false;
promiseResolve(f);
}
}).fail(f => {
if (Object.keys(model.changesTemp).length === 0) {
model.hasChanged = false;
}
// Save failed, reject
for (const [key, value] of Object.entries(model.changesTemp)) {
// Move from our temp changes back to changes
model.changes[key] = value;
// Make sure the model is set to hasChanged
model.hasChanged = true;
// Delete our temp key
delete model.changesTemp[key];
}
delete model.changesTemp;

promiseReject(f);
});
} else {
// No changes were needed, resolve
delete model.changesTemp;
model.hasChanged = false;
promiseResolve();
}
}
return deferred.promise();
};

Mark, have you actually encountered an issue saving this many rows at once? In all of my time working at Skuid I have never seen a customer save so many rows at one time that Skuid's model-saving code exceeded the Apex CPU time limit, or even the DML row limits, unless the customer was trying to use Skuid to upload something like thousands of new rows all at once. In that case I would really be curious what the customer is trying to do with that page; I don't think this snippet is realistically ever necessary.


Hi Zach,


In the particular use case where the save was using too much Apex CPU time, we have a workflow rule triggered on a field update that sends an email alert. I believe it's the sending of this email alert that slows down the transaction, limiting us to saving somewhere under 200 rows at a time.


Okay, thanks Mark, good to know.


Hi Zach, 

I'm actually running into this problem as well. I'm hitting the Apex CPU time limit when saving just 300 rows after mass-updating one field, which feels fairly out of the ordinary.


In the meantime, I may leverage the snippet Mark posted - thanks for sharing! 


Hi Elena,


Here's the most up-to-date code I have. I've made a few bug fixes and improvements since my last posting.



// skuid.custom.modelSaver(model,fparams)
// Saves a model incrementally so as not to overload the save process with too many rows at once
// fparams: object
// {
// limit: Number of rows to limit by. If unspecified will choose
// the model's recordsLimit property or if that is unspecified defaults to 100
// progressCallback: function to call before running each individual query
// in the format progressCallback(fparams), fparams is an object
// progressCallback fparams = {
// count: count of rows saved so far
// limit: our limit for how many rows to save per run
// nextStart: the row # of the next row to be saved
// nextEnd: the last row # to be saved (based on limit)
// model: the model currently being queried
// }
// }
skuid.custom.modelSaver = function (model, fparams) {
const deferred = $.Deferred();
if (model === undefined) {
deferred.reject('Model undefined');
return deferred.promise();
}
fparams = fparams || {};
if (Array.isArray(model)) {
if (model.length > 0) {
const m = model[0];
const localLimit = fparams.limit || m.recordsLimit || 100;
const limit = fparams.limit || undefined;
const progressCallback = fparams.progressCallback || undefined;
// Set Initial Progress
if (progressCallback !== undefined) {
progressCallback.call(this, {
count: 0,
limit: localLimit,
nextStart: 1,
nextEnd: localLimit,
model: m
});
}
// Back up the changes object from the model and clear it
// We will only save the changes we want to save
m.changesTemp = m.changes;
m.changes = {};
// Run through our recursive function to save all rows
modelSave(m, {
limit: limit,
progressCallback: progressCallback,
promiseResolve: deferred.resolve,
promiseReject: deferred.reject,
modelArray: model,
modelArrayPosition: 0
});
}
} else {
const limit = fparams.limit || undefined;
const localLimit = fparams.limit || model.recordsLimit || 100;
const progressCallback = fparams.progressCallback || undefined;
// Set Initial Progress
if (progressCallback !== undefined) {
progressCallback.call(this, {
count: 0,
limit: localLimit,
nextStart: 1,
nextEnd: localLimit,
model: model
});
}
// Back up the changes object from the model and clear it
// We will only save the changes we want to save
model.changesTemp = model.changes;
model.changes = {};
// Run through our recursive function to save all rows
modelSave(model, {
limit: limit,
progressCallback: progressCallback,
promiseResolve: deferred.resolve,
promiseReject: deferred.reject
});
}
function modelSave (model, fparams) {
fparams = fparams || {};
const limit = fparams.limit || model.recordsLimit || 100;
const count = fparams.count || 0;
const progressCallback = fparams.progressCallback || undefined;
const promiseResolve = fparams.promiseResolve || undefined;
const promiseReject = fparams.promiseReject || undefined;
const modelArray = fparams.modelArray || undefined;
const modelArrayPosition = fparams.modelArrayPosition || 0;
let limitSpecified = fparams.limitSpecified;
if (limitSpecified === undefined) {
if (fparams.limit !== undefined) {
limitSpecified = true;
} else {
limitSpecified = false;
}
}

let localCount = 0;
for (const [nkey, value] of Object.entries(model.changesTemp)) {
// Only process up to limit
if (localCount >= limit) {
break;
}
// Move from our temp changes back to changes
model.changes[nkey] = value;
// Make sure the model is set to hasChanged
model.hasChanged = true;
// Delete our temp key
delete model.changesTemp[nkey];
localCount++;
}
// Save the model
if (Object.keys(model.changes).length > 0) {
$.when(model.save()).done(f => {
if (Object.keys(model.changesTemp).length > 0) {
// If we still have changesTemp
if (progressCallback !== undefined) {
// Call our progressCallback if defined
let localEnd = limit;
if (Object.keys(model.changesTemp).length < localEnd) {
localEnd = Object.keys(model.changesTemp).length;
}
progressCallback.call(this, {
count: count + localCount,
limit: limit,
nextStart: count + localCount + 1,
nextEnd: count + localCount + localEnd,
model: model
});
}
// Recurse our modelSave
modelSave(model, {
limit: limit,
count: (count + localCount),
progressCallback: progressCallback,
promiseResolve: promiseResolve,
promiseReject: promiseReject,
modelArray: modelArray,
modelArrayPosition: modelArrayPosition,
limitSpecified: limitSpecified
});
} else {
// We have no remaining changesTemp, resolve
delete model.changesTemp;
model.hasChanged = false;
let nextQuery = 0;
// If we have a next model to query, nextQuery will be > 0, otherwise it will be 0
if (modelArray !== undefined && (modelArrayPosition + 1) < modelArray.length) {
nextQuery = (modelArrayPosition + 1);
}

if (nextQuery === 0) {
promiseResolve();
} else {
let localLimit = modelArray[nextQuery].recordsLimit || 100;
if (limitSpecified) {
localLimit = limit;
}
// Set Initial Progress
if (progressCallback !== undefined) {
progressCallback.call(this, {
count: 0,
limit,
nextStart: 1,
nextEnd: localLimit,
model: modelArray[nextQuery]
});
}
// Back up the changes object from the model and clear it
// We will only save the changes we want to save
modelArray[nextQuery].changesTemp = modelArray[nextQuery].changes;
modelArray[nextQuery].changes = {};
// Recurse our modelSave
modelSave(modelArray[nextQuery], {
limit: localLimit,
progressCallback: progressCallback,
promiseResolve: promiseResolve,
promiseReject: promiseReject,
modelArray: modelArray,
modelArrayPosition: nextQuery,
limitSpecified: limitSpecified
});
}
}
}).fail(f => {
if (Object.keys(model.changesTemp).length === 0) {
model.hasChanged = false;
}
// Save failed, reject
for (const [nkey, value] of Object.entries(model.changesTemp)) {
// Move from our temp changes back to changes
model.changes[nkey] = value;
// Make sure the model is set to hasChanged
model.hasChanged = true;
// Delete our temp key
delete model.changesTemp[nkey];
}
delete model.changesTemp;

promiseReject(f);
});
} else {
// No changes were needed, resolve
delete model.changesTemp;
model.hasChanged = false;

let nextQuery = 0;
// If we have a next model to query, nextQuery will be > 0, otherwise it will be 0
if (modelArray !== undefined && (modelArrayPosition + 1) < modelArray.length) {
nextQuery = (modelArrayPosition + 1);
}

if (nextQuery === 0) {
promiseResolve();
} else {
let localLimit = modelArray[nextQuery].recordsLimit || 100;
if (limitSpecified) {
localLimit = limit;
}
// Set Initial Progress
if (progressCallback !== undefined) {
progressCallback.call(this, {
count: 0,
limit,
nextStart: 1,
nextEnd: localLimit,
model: modelArray[nextQuery]
});
}
// Back up the changes object from the model and clear it
// We will only save the changes we want to save
modelArray[nextQuery].changesTemp = modelArray[nextQuery].changes;
modelArray[nextQuery].changes = {};
// Recurse our modelSave
modelSave(modelArray[nextQuery], {
limit: localLimit,
progressCallback: progressCallback,
promiseResolve: promiseResolve,
promiseReject: promiseReject,
modelArray: modelArray,
modelArrayPosition: nextQuery,
limitSpecified: limitSpecified
});
}
}
}
return deferred.promise();
};
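This version also accepts an array of models (that's what the Array.isArray branch handles). A quick usage sketch, reusing the pc progress callback from the example above; the model names here are placeholders:

var modelsToSave = [skuid.$M('ModelA'), skuid.$M('ModelB')];
$.when(skuid.custom.modelSaver(modelsToSave, { limit: 50, progressCallback: pc })).done(f => {
$.unblockUI();
});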

I think I have a simpler approach. Use a separate model just for saving.


  1. Create a model called BatchSave.

  2. Create a snippet called adoptRows that loops through the rows and adopts those that still require saving, up to the desired batch size (a rough sketch is included after this list).

  3. Create an action sequence with all your actions inside a branch evaluating {{$Model.YOURORIGINALMODELNAME.hasChanged}}. This way it will loop since the snippet calls this action sequence.

       i) Empty BatchSave

       ii) Run adoptRows

       iii) Save BatchSave

       iv) Adopt all rows back to the original model

Adopting the last batch of rows from BatchSave back into your original model should set the model's hasChanged value to false. Let me know if it doesn't.
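For reference, here's a rough sketch of what the adoptRows snippet could look like. The model name YOURORIGINALMODELNAME and the batch size are assumptions, and it relies on the model's changes object (keyed by row Id) to find the rows that still need saving; adjust for your page.

var source = skuid.$M('YOURORIGINALMODELNAME');   // assumed model name
var target = skuid.$M('BatchSave');
var BATCH_SIZE = 200;                             // assumed batch size

// Rows that still need saving are the keys of the model's changes object (keyed by row Id)
var changedIds = Object.keys(source.changes || {}).slice(0, BATCH_SIZE);
var rowsToAdopt = changedIds.map(function (id) {
    return source.getRowById(id);
});

// Adopt this batch into BatchSave; the action sequence then saves BatchSave
// and adopts the rows back into the original model
target.adoptRows(rowsToAdopt);

As discussed below, you may also need to carry the change metadata across so BatchSave knows what to save.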


If I remember correctly from when I was experimenting with this, at least in my Skuid version, adoptRows doesn't seem to preserve the row's “having been changed” status/metadata, so when you save the adopted row in the new model it doesn't realize it needs to save a change for that row. You'd need to “hack” the row metadata to copy over the change information to get it to realize it needs to save changes.


Well, there is an undocumented function on models called setHasChanged()
with one boolean input.

Let me know.


FYI. The changed values do transfer in the adopt row function.


Interesting, I wasn't aware of that. I wonder how that works. The change metadata on the model seems to be stored both in a hasChanged boolean and in a “changes” object on the model (an object of objects, keyed first by row and then by field/value pairs, if I remember correctly). The function I wrote above manipulates both of these fields to control which changes the model believes it needs to save.
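For illustration, the changes object looks roughly like this (the row Ids and field values here are made up):

// model.changes: keyed by row Id, then by field name / new value
{
    "001xx000003DGb2AAG": { "Status__c": "Closed", "Amount__c": 500 },
    "001xx000003DGb3AAG": { "Status__c": "Open" }
}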


They do transfer via adoptRows?

Does the “changes” object under the model maintain its metadata?

If so, either I was remembering incorrectly or my version of Skuid may work differently.


The function would be skuid.$M('YOURMODEL').setHasChanged(true).


You may have to use the skuid.$M('YOURMODEL').changes property to clear and apply the changes to your BatchSave model.
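A minimal sketch of that idea, assuming the same model names as above and that the changes object can be read and written directly this way:

var source = skuid.$M('YOURORIGINALMODELNAME');
var target = skuid.$M('BatchSave');

// Apply the pending changes to BatchSave so it knows which fields to save...
Object.keys(source.changes || {}).forEach(function (rowId) {
    target.changes[rowId] = source.changes[rowId];
});
target.setHasChanged(true);

// ...and clear them from the original model so it no longer tries to save them
source.changes = {};
source.setHasChanged(false);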