Hello!
I have a lengthy For Each-based workflow that loops over web-based InfoPath forms in a library and calculates how old they are, so that email reminders can be sent as appropriate. This is to remind people that they have created forms which they have not yet finalised.
The action of the workflow is to send a reminder X days after the form has been created, every Y days thereafter, and then finally delete it after Z days. X, Y and Z are parameters that I look up, as they are different for each form template.
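To make the X/Y/Z rule concrete, here is a minimal sketch of the decision made for each form, assuming the workflow runs once a day and X, Y and Z are whole days (the function and variable names are just illustrative, not actual workflow actions):

```python
from datetime import date

def action_for_form(created: date, today: date, x: int, y: int, z: int) -> str:
    """Decide what to do with one form, given its creation date and the X/Y/Z parameters."""
    age = (today - created).days
    if age >= z:
        return "delete"                     # Z days old: remove the form
    if age >= x and (age - x) % y == 0:
        return "send reminder"              # first reminder at X days, then every Y days
    return "no action"

# Example with X=7, Y=3, Z=30: reminders on days 7, 10, 13, ... and deletion at day 30.
print(action_for_form(date(2016, 1, 1), date(2016, 1, 8), 7, 3, 30))   # send reminder
print(action_for_form(date(2016, 1, 1), date(2016, 2, 1), 7, 3, 30))   # delete
```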
I have attached a screenshot (sorry for the size!)
Logically, I think I now have a workflow that is correct, but I hit the following error, which suspends the workflow:
In a particular library I have over 400 forms that need to be processed every day. I am aware that, as I am working within the O365 / SharePoint Online environment, I have no control over the server-side timings, settings etc., so I believe my only avenue is to optimise the workflow further. However, I am now stuck, as I do not know where I can make further savings.
I am therefore looking for some general advice...
Many thanks for reading!
Andrew
Perhaps you could make your workflow process 100 items and mark them as "Analysed", then run another workflow instance for another 100... until all items are "Analysed".
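Roughly this idea, sketched in Python rather than workflow actions (the item list and the "analysed" flag are stand-ins for the real library and status column, and in practice each pass around the loop would be a separate workflow instance):

```python
def process_in_batches(items, batch_size=100):
    """Process unflagged items 100 at a time until everything has been flagged."""
    while True:
        pending = [item for item in items if not item.get("analysed")]
        if not pending:
            break                        # nothing left to do
        for item in pending[:batch_size]:
            # ... send reminder / delete as appropriate ...
            item["analysed"] = True      # flag it so the next pass skips it

forms = [{"id": n} for n in range(450)]
process_in_batches(forms)
print(sum(1 for f in forms if f["analysed"]))   # 450
```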
Hello thank you for checking this thread.
The workflow already does this, I'm afraid: it processes 5 batches of 100 forms, marking each one with a processed date.
Andrew - I think the problem is the Loop N times action.
I would recommend using a loop with condition action instead, if possible, as there is no getting around the 5,000 loop limit.
Hello, thanks for checking.
The Loop N actually only loops 5 times, so that the library is queried for 100 forms at a time (it was timing out otherwise).
Have you tried using a For Each instead of a Loop N?
I would suggest adding a Pause for Duration action in each loop. It could be that you are just asking too much of the system in too short a time. A workflow history comment inside the loop will let you know how many times the loop turns before it fails. I have found that I've needed to add these pauses into Nintex workflows, and it does seem to pace out the workflow and stop the errors.
Hi All,
I have been able to improve performance by removing all my Query XML actions bar one. I'm not sure if this will fix my ultimate 5000 request problem but I am hopeful that querying less times each iteration will help.
I use Query XML to get values from the returned Fields XML. Originally I was doing a separate Query XML action for each metadata key, which was quite slow and resource-intensive. My new method is to query the XML once with a wildcard:
And put the results into a Collection variable. Then you can pop off each item using Get Item From Collection, remembering that both the key AND the value will be in there.
So you have to Get Item From Collection using specific indexes (the odd numbers, basically); the sketch below shows the idea.
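Here is a rough Python illustration of the whole approach (the sample XML, element names and wildcard path are made up, and the indexes assume a 0-based collection; the real Fields XML and the actual Query XML expression will look different):

```python
import xml.etree.ElementTree as ET

# Made-up stand-in for the Fields XML returned by the library query.
fields_xml = """<Fields>
  <Field Name="Title">Travel request</Field>
  <Field Name="Created">2016-01-01</Field>
  <Field Name="Author">Andrew</Field>
</Fields>"""

root = ET.fromstring(fields_xml)

# One wildcard query over all fields, instead of one Query XML action per metadata key.
collection = []
for field in root.findall(".//Field"):
    collection.append(field.get("Name"))   # the key...
    collection.append(field.text)          # ...immediately followed by its value

print(collection)
# ['Title', 'Travel request', 'Created', '2016-01-01', 'Author', 'Andrew']

# Because keys and values alternate, the values sit at the odd (0-based) indexes,
# which is what the repeated Get Item From Collection calls pick out.
title_value   = collection[1]
created_value = collection[3]
author_value  = collection[5]
print(title_value, created_value, author_value)
```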
Thanks
If you've found your own solution, Andrew Whitmore, feel free to mark your own answer as "correct."