We have a large number of folders being copied into a SharePoint Online document library, circa 9,000 per day, each containing an average of 10 files. This currently breaks the 5,000-item list view threshold when converting these folders and files into document sets. I'm considering creating multiple document libraries and round-robining the incoming folders between them, so that the data in each library stays small enough for list queries to work correctly. That means creating circa 20 document libraries to accommodate the daily incoming data. Does anyone have experience of building a round-robin workflow on a data-drop library, i.e. move the 1st folder to library 1, the 2nd folder to library 2, ... and the 21st folder back to library 1?
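For reference, the round-robin mapping described in the question can be sketched as follows (the library names and the count of 20 are illustrative, not taken from an actual tenant):

```python
# Sketch of the round-robin idea: the Nth incoming folder goes to
# library ((N - 1) mod 20) + 1. Library names here are placeholders.
NUM_LIBRARIES = 20

def target_library(folder_index: int) -> str:
    """Map the Nth incoming folder (1-based) to a destination library name."""
    return f"Library{(folder_index - 1) % NUM_LIBRARIES + 1}"

print(target_library(1))   # → Library1
print(target_library(2))   # → Library2
print(target_library(21))  # → Library1 (wraps around after 20)
```

Note this keeps the libraries evenly filled only if nothing is ever deleted; the fewest-items approach discussed below the question self-balances instead.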
Very interesting challenge! Do you care which library each document set ends up in, or just that they are roughly evenly distributed?
I shared this one with the team to get some ideas, and I think this is the most elegant way we could achieve it. I'm obviously not sure about other aspects of the document storage and what you're doing with it, but from a workflow perspective I think this will work.
The trick will be to do two things: first, query the site to find the bucket library that currently has the fewest items; second, move each incoming folder into that library.
For the web request, you should be able to do effectively the following:
This will select the lists whose titles start with 'Bucket', order them by item count so that the bucket with the fewest items is first, then pick the top one and return its title and item count.
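A sketch of what that web request might look like, together with a local simulation of its selection logic. The site URL and 'Bucket' library names are placeholders; the query parameters follow the standard SharePoint REST `/_api/web/lists` endpoint, which exposes `Title` and `ItemCount` on each list:

```python
# Build the SharePoint REST query described above, and simulate locally
# what the service does with it. The site URL is a placeholder.

def bucket_query_url(site_url: str) -> str:
    """REST URL that returns the least-full library whose title starts with 'Bucket'."""
    return (f"{site_url}/_api/web/lists"
            "?$select=Title,ItemCount"
            "&$filter=startswith(Title,'Bucket')"
            "&$orderby=ItemCount%20asc"
            "&$top=1")

def pick_bucket(lists: list[dict]) -> dict:
    """Local equivalent of the query: the 'Bucket*' list with the fewest items."""
    buckets = [l for l in lists if l["Title"].startswith("Bucket")]
    return min(buckets, key=lambda l: l["ItemCount"])

# Illustrative data: Bucket2 has the fewest items, so it is chosen.
sample = [
    {"Title": "Bucket1", "ItemCount": 480},
    {"Title": "Bucket2", "ItemCount": 450},
    {"Title": "DropLibrary", "ItemCount": 9000},
]
print(pick_bucket(sample)["Title"])  # → Bucket2
```

In the workflow, the returned title would then be stored in a variable and used as the destination for the file-move step.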
One nice thing about this solution is that if you need more capacity, you can just add more buckets and they will automatically start being filled before the others, since they will have fewer items.
The restriction with this solution is that, since the Copy Document action cannot accept a variable as the destination library, you would need to use the more flexible Office 365 Download File action to move each file instead. If I think of a better approach I'll let you know.
Thanks for the time spent looking into this. I'll incorporate the above and let you know how I get on. This certainly looks like a valid approach.