
Has anyone started using the Data Set options in RPA Central for their bots?  I cannot find a guide on it anywhere and was wondering how to program the bot to read the data set.  I thought it would be a matter of setting a variable to input, and then, once the job and data set are created, the bot would read from there and set the variable in RPA.  That does not seem to be the case.  Any ideas on how to get the bot to read the data set from Central?

 

Kyle

Kyle,



At this time, the data set has to be defined in both RPA Central and the RPA botflow.  Once the data set is defined, you can create a Job in RPA Central and assign it the botflow, one or multiple bots, and the data set imported into RPA Central.  We are planning to enhance this in the near future so that you can define the data set in RPA and RPA Central will identify the definition from the botflow.  @chubbardsr1 recently used the RPA Central data set function for an SBA PPP use case.



 



Sasan


@Sasan,

That first statement is the problem. How do we define it in an RPA botflow? I could not see an action to define it other than the standard data import. @chubbardsr1, could you provide your example, please?

Kyle

When you are building your script, you will use the data that you load into RPA.  I used my CSV file to import into RPA and then built the script.  The fields will be the same in that CSV/Access database version as they will be in the RPA Central data set.  Once you have the script built and tested, you will load your CSV into the data set job, and all of the fields you referenced will be there.  If you add a field during import, you need to make sure it is also added to the CSV for the data set job.  The only fields that are not included in the data set are LOGTEXT, RECORDNMBR, and ADDED; all of your other fields will be there.
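If it helps, here is a rough sketch (plain Python, nothing Nintex-specific) of the kind of pre-flight check I mean: confirm that the CSV you are about to load into the data set job still has every column the botflow was built against. The file name and field names (LoanNumber, BorrowerName, etc.) are just placeholders for whatever your own botflow references.

```python
import csv

# Placeholder list of the columns your botflow references; substitute your own.
EXPECTED_FIELDS = ["LoanNumber", "BorrowerName", "Amount", "LogMessage"]

def check_dataset_csv(path):
    """Warn if the CSV you plan to upload to the data set job is missing
    any column the botflow was built against."""
    with open(path, newline="", encoding="utf-8") as f:
        headers = next(csv.reader(f))
    missing = [name for name in EXPECTED_FIELDS if name not in headers]
    extra = [name for name in headers if name not in EXPECTED_FIELDS]
    if missing:
        print(f"Missing columns (the botflow will not find these): {missing}")
    if extra:
        print(f"Extra columns (not referenced by the botflow): {extra}")
    return not missing

if __name__ == "__main__":
    check_dataset_csv("ppp_loans.csv")  # placeholder file name
```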


So is the point of the data set to give you another place where you can see which records were updated on each run?

Kyle,



Hope I got your question right... The Data Set functionality in RPA Central gives you the capability to assign multiple bots to the same file so the file is processed faster.



Sasan


Once the data set is processed, you can download the file to see what has been done.  With the current data set, I ended up creating a field in my CSV called LogMessage so I could put a message in there.  But the biggest feature I used was the WriteLog action in RPA, so I could see which records processed, failed, errored, etc.  The data set is not like running a single Access file where you have the final output with the "A" (Added) column, but you can work around it by having a field in your data set that you write to, and then downloading the file when it's done.
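As a rough illustration of what I do with the downloaded file (again plain Python, not a Nintex API; the file and column names are placeholders), I just tally the LogMessage column to see how many rows succeeded, failed, or were never touched:

```python
import csv
from collections import Counter

def summarize_results(path, log_field="LogMessage"):
    """Tally the messages the botflow wrote into the LogMessage column
    so you can see at a glance how many rows processed, failed, etc."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            message = (row.get(log_field) or "").strip()
            counts[message or "(blank - not processed)"] += 1
    for message, count in counts.most_common():
        print(f"{count:>6}  {message}")

if __name__ == "__main__":
    summarize_results("downloaded_dataset.csv")  # placeholder file name
```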
I used a data set when I needed to board 2,000+ SBA loans into a core system.  This way I could assign multiple bot machines to handle the workload, since it took about 3 minutes to board one loan.  I did have to work around the data set not tracking what was done when it was re-run outside of RPA, but this method worked great, allowing me to board 200 loans/hour (10 bots with 1 data set). @kbarton
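For anyone curious where the 200 loans/hour figure comes from, the arithmetic is just the numbers above:

```python
minutes_per_loan = 3        # ~3 minutes to board one loan
bots = 10                   # bot machines assigned to the one data set
total_loans = 2000          # size of the SBA loan file

loans_per_bot_per_hour = 60 / minutes_per_loan    # 20 loans per bot per hour
loans_per_hour = bots * loans_per_bot_per_hour    # 200 loans/hour across 10 bots
hours_to_finish = total_loans / loans_per_hour    # ~10 hours for the whole file

print(loans_per_hour, hours_to_finish)
```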

