
I have built a site workflow that takes rows from a list (an overtime claim list) and builds a CSV file for the payroll team to process onwards for reporting. I tackled this by querying the list and passing each of the fields needed into collection variables. Then it pauses for 1 minute to dehydrate the job to the SharePoint timer service. Next, it passes into a for-each loop, where each of the collections is cycled through in turn to produce the variables. This is done with 12 parallel actions. Some of the parallel actions need some regex on the output variables to remove commas (we are saving to a CSV at the end). Within the loop, I am building a string value to collect all the row item data into a chunk.

Once the loop exits, another Build String action produces the headers and then appends the chunk, with the appropriate commas as separators. Finally, an email goes off to the payroll team with the CSV attached.
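For anyone trying to picture it, the per-row logic is roughly the following sketch (the field names and sample data are made up for illustration; in the workflow itself this is done with Build String and regular expression actions rather than script):

# Illustrative sketch only - field names and sample data are placeholders.
$claimItems = @(
    [PSCustomObject]@{ Employee = "Smith, John"; ClaimDate = "2023-01-05"; Hours = 2.5 }
)

$headers = "Employee,ClaimDate,Hours"
$chunk   = ""

foreach ($item in $claimItems) {
    # strip commas out of each field so they don't break the CSV columns
    $employee = $item.Employee -replace ",", ""
    $date     = $item.ClaimDate -replace ",", ""
    $hours    = "$($item.Hours)" -replace ",", ""

    # append this row to the running chunk
    $chunk += "$employee,$date,$hours`n"
}

# headers first, then the chunk of row data
$csv = "$headers`n$chunk"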

This worked in testing, when I only had a few hundred items in the list, but that has now risen to nearly 2,500 items, which is not large in SharePoint terms, but I am not sure whether the workflow can handle the volume of process actions. With this number of items, the workflow starts quite quickly, but then waits for a lot longer than one minute at the 1-minute pause. I can see the status shows as In Progress.

 


After about 20 minutes, the workflow history shows that there is an error. Scrolling to the bottom of the history (a long way down), the error simply states: Error System Account Overtime Payroll Extract failed to run. Am I going about this in the best way, as it seems that I am running into a timeout? When I look at the history, I can see it has recorded a large number of actions and then seems to stop mid-flow during one of the loops.

Is there a way to streamline this or improve the efficiency? If there is a timeout, can it be adjusted, and if so, to what value?

Hopefully someone can help me find a solution to this. Thanks in advance.

It looks like my problem is definitely to do with processing a large number of list items. I tried reducing the date span from an entire month down to just 6 days, and the workflow seemed to work through without issue. I also found some posts on this forum about batching the workflow into chunks of a more manageable size.



So, I added a Run If action set to trigger every 200 loops, which sets the loop counter variable back to zero and pauses for 5 minutes. This should allow the workflow to handle the larger list.
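In script terms, the batching pattern I am trying to reproduce looks roughly like this (the item collection, batch size and pause length are illustrative only; in the workflow it is built from Run If, Set Variable and Pause actions):

$items   = 1..2500            # placeholder for the queried list items
$counter = 0

foreach ($item in $items) {
    # per-item work would go here: set variables, strip commas, append to the CSV chunk

    $counter++
    if ($counter -ge 200) {
        $counter = 0              # reset the loop counter, as the Run If does
        Start-Sleep -Seconds 300  # pause 5 minutes so the workflow engine can catch up
    }
}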



Here is the modification of the lead-in and first part of the For-Each loop:







Then the other side of the parallel actions is the lead-out of the loop, with these actions. (I am creating a manual history list to capture each item as it loops through.)





 



What is happening, though, is not quite what I was hoping for. The workflow ran overnight and it seems that the pauses are taking much longer than the configured values, followed by errors, then continuing. Here is the log output:





Have I got something wrong, or is there a chance that there is an environmental problem?



Getting somewhat desperate to find a solution to these issues.



Thanks in advance.


SmashedMonkey, I’ve run into the same issue, but with 45000 items in the list. My only success has been reducing the For-Each chunks to around 100 with a 5-minute pause. Anything over that and it will choke pretty quick. I’ve been able to process about 5000 items at a time using that size.

I am also seeing, though, that a 5-minute Pause turns out to be more like 8 or 11 minutes, and it varies for some reason. It surely has to do with Workflow Manager and the batching process, but I haven’t been able to identify the issue.

Any insight Nintex support could provide would be helpful.

TIA



gman, you are right, this is such a monumental pain. There has been nothing helpful from Nintex, sadly. Since I posted the original item, we have migrated all our forms and workflows to Nintex Cloud, and I now have the migrated and modified version of my workflow in Nintex Automation Cloud. Guess what: it still doesn’t handle these large lists. The loop function has a 1,000-item limit. There is also a size limit on variables (100 MB), which is impossible to manage up front. The other problem is that since NAC and our SharePoint instance are on different tenants, there is significant latency when doing lots of list item updates. It is almost unusable now.


SmashedMonkey, that doesn’t surprise me. Sounds like a difficult way to do business though. Perhaps you can negotiate with Microsoft to improve the throttling and other limits on your tenants. I suspect they keep them low until someone really complains in order to maximize server resources.

Our solution is to drop any workflow automation that relies on Microsoft/SharePoint and find a BPA tool that has its own workflow engine (K2 is a really good candidate for this).

Good luck.


Hi @SmashedMonkey, @gman 

Have you considered using PowerShell PnP to query/read the data from SPO and generate text/CSV/Excel as the output? 

This would purely be system batch processing.


Garrett, not sure I fully understand your question. Are you suggesting PnP for outputting to Excel or are you asking how we approach it?

Yes, in fact we use PowerShell heavily to extend all of our workflow activity (on-premises). However, we still need a business-user front end for our back-end PowerShell activities, hence the move from SharePoint’s workflow engines to someone else’s. Also, our interface with SPO is typically one-way, from our intranet out to SPO for read-only sharing. We have no long-term plans to move to the cloud.

Regards



Hi @gman 

This is to suggest a possible solution to @SmashedMonkey's initial query.

 

You can use PnP PowerShell to connect to SharePoint Online.
As I have stated, this is purely for batch processing, so there is no user interactivity.
PowerShell has an extension to read and write Excel files.

 

PowerShell is great at reading and extracting data from large SP/SPO lists (even > 45,000 rows) with faster execution than NAC; see the sketch below.
NAC is excellent at workflow automation that requires manager approval.
NAC is excellent when forms are required (friendly UI).
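As a rough sketch of the idea (the site URL, list name and field names are placeholders, and the exact authentication parameters depend on your tenant and PnP.PowerShell version):

# Placeholder URL, list and field names - adjust for the real overtime claim list.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/Payroll" -Interactive

# page through the list so even very large lists come back reliably
$items = Get-PnPListItem -List "Overtime Claims" -PageSize 2000

$rows = foreach ($item in $items) {
    [PSCustomObject]@{
        Employee  = $item["Employee"]
        ClaimDate = $item["ClaimDate"]
        Hours     = $item["Hours"]
    }
}

# Export-Csv quotes values itself, so embedded commas no longer need a regex
$rows | Export-Csv -Path "C:\Exports\OvertimeExtract.csv" -NoTypeInformation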



Thanks Garrett,

PnP PowerShell is not really going to work for us. We need a solution that non-technical Finance people can easily start from a formatted front-end. I am actually now looking at using Power Automate to do what I need.


It might be useful to look at what the throttling limit is in SharePoint and also the query settings on your SQL Server. Both can limit the number of items that are processed and/or queried.
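For an on-premises farm you can check (and, with care, raise) the list view threshold from the SharePoint Management Shell; a rough sketch with a placeholder URL (in SharePoint Online the 5,000-item threshold is fixed by Microsoft and cannot be changed):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$wa = Get-SPWebApplication "http://intranet.contoso.com"

# current list view threshold for normal users
$wa.MaxItemsPerThrottledOperation

# raising it increases SQL load across the whole web application, so test first
$wa.MaxItemsPerThrottledOperation = 10000
$wa.Update()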

 


The solution (of sorts) was basically to limit the number of items to a maximum of 200 via the start event form. The workflow now processes those 200 items without error. The only downside is that it needs to be re-run multiple times to process the total number of items, and each run of 200 items can take up to 2 hours to complete, so you can see that this is really not a practical solution in this application. We took the decision to use Power Automate for this instead, and it works against thousands of items in a sensible amount of time (about 40 minutes).

On the subject of throttling: Microsoft swear blind that they do not throttle client tenants, but in this and other applications and solutions around our organisation that rely on MS Cloud technologies, we see similarly poor performance when larger datasets are being processed. I guess we will never know for sure!

Thanks for all the help on this :)

