Solved

Workflow: [Update an item] actions very slow

  • July 11, 2024
  • 1 reply
  • 95 views

SmashedMonkey

I built some workflows toward the end of last year, many of them of medium complexity, and I’m looking for guidance on how to speed some of them up. The underperforming workflows are mostly those that make numerous updates to SharePoint Online lists (status columns, etc.).

The type of column does not seem to affect the time taken, but each update takes around 30 seconds to a minute. The connector uses a list/library connection with an identity that has full site collection access. One workflow, which simply sets a flag on each item in a list view (filtered by date constraints), takes more than two hours to run through its for-each loop, and the view in that instance contained only about 210 items.

On the previous on-premises version (Nintex for SharePoint 2016), the same workflow completed in around 10 minutes.

Is there a more efficient way to update list columns in a loop, one that takes a fraction of the current time? My concern is that the list view could eventually hold between 2,000 and 2,500 items.

Here is the configuration of the action that updates a date column and a yes/no column on the SharePoint site. It runs on each iteration of the loop.

[screenshot of the Update an item action configuration]
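This sits outside the Nintex designer itself, but as a rough illustration of the batching idea often suggested for this problem: SharePoint Online's REST API exposes a `$batch` endpoint that wraps many item updates in a single HTTP request, avoiding one round trip per item. Below is a minimal Python sketch that only builds the multipart request body in the standard OData batch format; authentication and the actual POST are omitted, and the function name and list/site names are illustrative assumptions, not a documented Nintex action.

```python
import json
import uuid

def build_batch_body(site_url, list_title, updates):
    """Build a multipart/mixed body for the SharePoint REST $batch endpoint.

    `updates` is a list of (item_id, fields_dict) pairs. All updates are
    wrapped in one changeset so they travel in a single round trip
    instead of one HTTP request per item.
    """
    batch_id = f"batch_{uuid.uuid4()}"
    changeset_id = f"changeset_{uuid.uuid4()}"
    lines = [
        f"--{batch_id}",
        f"Content-Type: multipart/mixed; boundary={changeset_id}",
        "",
    ]
    for item_id, fields in updates:
        url = (f"{site_url}/_api/web/lists/getbytitle('{list_title}')"
               f"/items({item_id})")
        lines += [
            f"--{changeset_id}",
            "Content-Type: application/http",
            "Content-Transfer-Encoding: binary",
            "",
            f"MERGE {url} HTTP/1.1",          # MERGE = partial item update
            "Content-Type: application/json;odata=verbose",
            "IF-MATCH: *",                     # skip version check
            "",
            json.dumps(fields),
            "",
        ]
    lines += [f"--{changeset_id}--", f"--{batch_id}--", ""]
    return batch_id, "\r\n".join(lines)
```

The resulting body would be POSTed to `{site_url}/_api/$batch` with the header `Content-Type: multipart/mixed; boundary=<batch_id>`, e.g. from a single web-request step rather than one Update an item action per loop iteration.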


1 reply

SmashedMonkey
  • Author
  • Scholar
  • 39 replies
  • Answer
  • August 9, 2024

It seems this may be related to some sort of throttling on our SharePoint tenant. Microsoft say they are not applying any throttling, but we are seeing other processes and applications (not Nintex) suffer fairly poor performance once the number of items being processed climbs into the thousands.

I have since built an alternative solution in Power Automate, which processes these items in minutes rather than hours. So for this particular application, I don’t think Nintex workflows will work on such large volumes of data. It’s not a fault of the product; we were simply asking too much of it in this case. All our other workflows perform very well, since they are all single-item-level flows.
