It's very likely that an attempt to update 50k+ items in one go will fail, either due to SharePoint resource limits or due to a timeout.
You will have to break the whole dataset up into smaller chunks and make the updates chunk by chunk.
One possible approach, as Eden suggested, is to update items one by one. But as he describes it, this will still fail for such a huge dataset: you will have to make sure that a "Commit pending changes" action is called within the loop at regular intervals, after a reasonable number of items has been processed.
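The loop logic described above can be sketched as follows. This is only an illustration: in Nintex the loop is built visually, and `update_item` and `commit_pending_changes` here are hypothetical stand-ins for the "Update item" and "Commit pending changes" actions; the commit interval of 500 is an assumed value, not a documented one.

```python
def process_in_loop(item_ids, update_item, commit_pending_changes,
                    commit_every=500):  # assumed interval, tune for your farm
    """Update items one by one, committing pending changes at intervals."""
    for i, item_id in enumerate(item_ids, start=1):
        update_item(item_id)
        if i % commit_every == 0:
            commit_pending_changes()  # flush before resource limits are hit
    if len(item_ids) % commit_every != 0:
        commit_pending_changes()      # flush the remaining items
```

The point is simply that pending updates are flushed periodically inside the loop rather than accumulating across all 50k+ items.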
The other approach is to split the dataset into chunks by some flag field directly in the list, and then run the updates ("Update multiple items") only for the items having the same flag value in one go. You can read more on that approach here: Big Data and Nintex: Batch Processing Large Datasets in Parallel Using a Nintex Site Workflow - Shar...
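A minimal sketch of the flag-field idea, assuming a hypothetical helper that assigns each item a chunk flag (in practice you would write this flag into a list column once, and then filter "Update multiple items" on one flag value per run):

```python
def assign_chunk_flags(item_ids, chunk_size=1000):
    """Map each item to a chunk flag (0, 1, 2, ...).

    A workflow can then process one flag value at a time with a single
    filtered 'Update multiple items' action per chunk.
    """
    return {item_id: index // chunk_size
            for index, item_id in enumerate(item_ids)}
```

The chunk size of 1000 is an assumption; pick whatever your environment handles reliably in a single update.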
That sounds reasonable.
The UpdateListItems method of the Lists.asmx web service, for example, has a documented limit of 160 items updated per single batch.
However, I haven't seen any documented limit for Nintex's "Update multiple items" action, nor do I know how it is implemented internally, so I can't say whether this limit applies to it or not.
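If you do call UpdateListItems yourself, the safe play is to split the work into batches no larger than the documented limit. A small sketch (the batching helper is illustrative, not part of any SharePoint API):

```python
BATCH_LIMIT = 160  # documented UpdateListItems per-batch limit

def split_into_batches(items, limit=BATCH_LIMIT):
    """Yield consecutive slices of the item list, each within the limit."""
    for start in range(0, len(items), limit):
        yield items[start:start + limit]
```

Each yielded slice would then become one SOAP Batch element sent to UpdateListItems.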
I have the same problem as Greg. I have a site workflow scheduled to run at 4 am every day, but it doesn't run for all items. It is supposed to set a field to the current date; however, it doesn't do this for all of the items. I started the workflow manually to test my workflow logic and the workflow works fine, but the schedule is not triggering the workflow automatically, and it sent the email notification "Workflow Schedule failed".
I also followed the instructions in the article "Scheduled workflows and the Nintex Workflow Scheduler timer job".
My system admin ran:
1. NWAdmin.exe -o UninstallTimerJob -job ScheduledWorkflows
2. NWAdmin.exe -o InstallTimerJob -job ScheduledWorkflows -url http://mywebappurl
But the workflow still hasn't run automatically. Did I miss anything? Any suggestions?