I have a workflow (SharePoint 2013, on-premises) that queries a large list, writes all records from the previous month to a CSV file, and then PUTs the file into a library via the Web Request action's PUT method. When testing with small record sets, everything works fine. The production use of this will write about 5000 records to the CSV.
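For reference, the "previous month" window that such a query needs can be computed like this. This is a minimal client-side sketch in Python, not part of the workflow itself (the workflow would use calculated date variables in its Query List filter); the function name is my own:

```python
from datetime import date, timedelta

def previous_month_window(today: date) -> tuple[date, date]:
    """Return (first_day, last_day) of the month before `today`."""
    first_of_this_month = today.replace(day=1)
    # Step back one day from the 1st to land on the last day of the prior month.
    last_of_prev_month = first_of_this_month - timedelta(days=1)
    first_of_prev_month = last_of_prev_month.replace(day=1)
    return first_of_prev_month, last_of_prev_month

# For any day in March 2024, the window is all of February 2024 (a leap month):
start, end = previous_month_window(date(2024, 3, 15))
print(start, end)  # 2024-02-01 2024-02-29
```

The "subtract one day from the 1st" trick avoids any special-casing for month lengths or the January-to-December rollover.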
When looping (For Each action) through the collection variables and writing each field value with the Build String action, the workflow fails partway through the process. After about 5 minutes of looping, the process just stops with no explicit errors. The only error is "BackUp Easterner Status Records failed to start". Mind you, it has definitely started, because verbose logging shows that it has progressed quite a way into the process. It is not failing on a particular record either: sometimes it writes a few hundred records, and sometimes just over 100.
Does anyone have any ideas as to what the problem may be?
Thanks and Best Regards,
Put a Pause action into your For Each loop and let it pause once every X iterations.
You will have to test what value of X is suitable for your environment, but I would say 1000 might be reasonable.
Hi Marian Hatala,
Thanks for the response! I will give that a try. It will be closer to 100, as it is failing sometimes at 180 and sometimes at around 350. I will let you know what happens.
Thanks and Regards,
I set it up and tried it with a pause after every 100 iterations, and it failed at 96. I lowered the threshold to 50; so far it is still running and is on its 3rd pause.
With 5000 items this looks to be over an 8-hour workflow process.
This seems like an OK workaround, but I wonder what is causing the issue in the first place. Any ideas? While 50 may work for now, it still seems unreliable, since there is no definite breakpoint in either time or record count. Why do you think this might be happening?
Patrick Kelligan, I'm not able to answer this.
My personal guess is that it is some limitation on the SharePoint side (a similar problem appears quite regularly with long-running loops or actions that need to process bigger payloads), either some timeout or a SharePoint batch size limit?
Maybe someone more experienced with Nintex or SharePoint internals could clarify it.
Given your timing estimate, I don't know whether it's an option for you, but downloading the data from the client/consumer side (Excel?) would be much more efficient.
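To illustrate the client-side idea, here is a rough sketch of pulling list items over the SharePoint 2013 REST API and building the CSV in Python. The site URL, list name, and field names are placeholders, and on-prem SharePoint would also need Windows authentication (e.g. NTLM), which is omitted here; only the CSV-building part is demonstrated with stand-in data:

```python
import csv
import io
import json
from urllib.request import Request, urlopen  # a real call would also need NTLM auth

# Hypothetical REST endpoint (placeholder site and list names).
LIST_ITEMS_URL = (
    "https://sharepoint.example.com/sites/ops/_api/web/"
    "lists/getbytitle('Status Records')/items"
    "?$select=Title,Status,Modified&$top=5000"
)

def fetch_items(url: str = LIST_ITEMS_URL) -> list[dict]:
    """Fetch list items via the SharePoint REST API (untested placeholder)."""
    req = Request(url, headers={"Accept": "application/json;odata=verbose"})
    with urlopen(req) as resp:  # fails without valid credentials
        payload = json.load(resp)
    return payload["d"]["results"]

def items_to_csv(items: list[dict], fields: list[str]) -> str:
    """Write the selected fields of each item into a CSV string."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for item in items:
        writer.writerow(item)
    return buf.getvalue()

# Demonstration with stand-in data (no network access needed):
sample = [
    {"Title": "Rec 1", "Status": "Open", "Modified": "2024-02-03"},
    {"Title": "Rec 2", "Status": "Closed", "Modified": "2024-02-17"},
]
print(items_to_csv(sample, ["Title", "Status", "Modified"]))
```

Running this client-side avoids the workflow engine's loop throttling entirely, since the 5000 items are fetched and serialized in one process.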
Patrick, is this a workflow you're running manually? Try starting it as a scheduled workflow instead. I've seen similar issues when processing large volumes of records and running the WF manually; if I run it scheduled, it has better luck at processing all requested items. Another thing that may help is to add a Commit Pending Changes action at the bottom of your loop. This would be preferable to the pause, because the smallest pause interval is 5 min. Multiply that across 5000 items and your workflow will spend more time paused than it will processing the items.
I know it seems weird relying on luck - but remember this is SharePoint by Microsoft. Seriously though, there does seem to be some workflow processing limitation in either buffering the workflow actions or in the processing.
Good luck and post any details you might uncover.
I will try this. The Commit Pending Changes is a good idea. I am still running the first test with a 50-record threshold from last night. I will also try a scheduled WF to see if it runs OK.
Hi Gerard Rodriguez,
I tried running it manually with the Commit action, and it failed. I just set up a schedule that should kick off in a few minutes, to see whether being scheduled makes a difference. Will keep you posted.
OK Gerard Rodriguez... it is still running, several hours in, as a scheduled WF instead of a manual trigger and without the pause. This is encouraging.
Thanks for the tip!