I have 3 source lists that feed into a master list. Each source list contains data pulled from a different database, and 3 site workflows (1 per source) update or add the records to the master list overnight. 2 of these work perfectly fine, but the 3rd always fails after about 1000 records. The first 2 contain about 200 items each, but the 3rd contains around 2400. There is no helpful error message when it fails, just "An error has occurred in ... ".
Previously, we were using a standard SPD list workflow that ran on every item's update/create, and took 5-6 hours to complete all of them. The new Nintex site workflow processes about 1000 updates in 45-50 minutes, which is a huge improvement.
A summary of the workflow is:
We have tried increasing and decreasing the number of records per commit, but that doesn't make a difference. It is always failing after about 1000 records. The other 2 workflows that work are identical to this one, just pointing to a different source.
This is on SharePoint 2013 with Nintex Workflow 2013 (220.127.116.11). This workflow is supposed to be step 4 of 7 in the nightly update, but it's not very useful in its current state. Does anybody have any ideas why it fails every time?
It's hard to say whether a specific timeout is occurring. Does it fail on the same action every time? When it queries a record to fill out the variables, there may be an issue with a specific record that just happens to be around item 1000.
The issue may be due to large-item processing. You are querying all the items from the list and looping through them in the for-each loop. As you mentioned, there may be around 2400 records to work on when the workflow runs, so there may be a performance issue with the workflow. As a suggestion, I recommend the following changes to your workflow design:
Let the workflow process the items in batches, say 200 items at a time (200 because, as you mentioned, the other lists contain about 200 items each and you have found no issues with them).
List Design Changes:
Workflow Design Changes:
Since each workflow instance processes only 200 records, you may need to increase the schedule frequency for this particular workflow. E.g.: since on average it has to process about 2000 records every night, you may need to run the workflow about 10 times per night. (If you want to reduce the frequency, you can increase the query item limit above 200, depending on the workflow's performance.)
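To make the batching idea concrete, here is a minimal sketch of the pattern in plain Python. This is only an illustration of the logic, not real Nintex or SharePoint API code: the item structure, the `processed` flag column, and the function names are all assumptions standing in for the actual workflow actions (query list with an item limit, for-each loop, commit pending changes, and a status field that marks items as done so the next scheduled run skips them).

```python
# Illustrative sketch of the batched update pattern (not Nintex/SharePoint API).
# Assumptions: each source item is a dict with "key", "value", and a
# "processed" status flag; the master list is modeled as a plain dict.

BATCH_SIZE = 200     # items queried per scheduled workflow run
COMMIT_EVERY = 50    # hypothetical "records per commit" interval

def run_batch(source_items, master, batch_size=BATCH_SIZE):
    """Process at most batch_size unprocessed source items into the master.

    Returns the number of items handled, so a scheduler can tell when
    a night's backlog is finished (return value 0).
    """
    # Query step: only items not yet marked processed, capped at the limit.
    batch = [i for i in source_items if not i["processed"]][:batch_size]
    pending = 0
    for item in batch:
        master[item["key"]] = item["value"]  # update or add to the master list
        item["processed"] = True             # flag it so the next run skips it
        pending += 1
        if pending >= COMMIT_EVERY:
            pending = 0                      # commit point in the real workflow
    return len(batch)
```

Run on a repeating schedule, each invocation drains up to one batch; a 2400-item backlog would take about 12 runs at 200 items each.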
I hope these recommendations help you come up with a solution.
We just tried this, and it seems to ignore the item limit, returning all records that match the filter. I set the item limit to 500, but it had processed over 800 records before I stopped the workflow. We also tested outputting the size of the collection, and it returned the full list record count, not 500.
Any idea why it would ignore that limit?
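One workaround, if the query's own limit keeps being ignored, would be to cap the collection inside the workflow logic before the for-each loop runs. A minimal sketch of that idea in plain Python (the function name is a hypothetical placeholder, not a Nintex action):

```python
def enforce_item_limit(items, limit):
    """Workaround sketch: truncate the queried collection to the intended
    item limit before looping, in case the query's own limit is not honored."""
    return items[:limit]
```

In the real workflow, the equivalent would be a counter in the loop that exits once the intended limit is reached.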
Here is a link to Vadim Tabakman's blog post, which explains how to optimize the Query List functionality:
Thanks again for your help, Nidish. It turns out it was failing on the same record each time, and something is wrong with one of the fields in that record. I have configured it to run in batches of 600, and it now runs 4 times each night without problems.
This also led to discovering a bug in the "Specify item limit" setting, where it always triples whatever value I enter (if I enter 10, it runs against 30 records).