@Sankarlal Had not seen a reply. Did you see the technical documentation on work queue capabilities?
Thank you for your response, @Sasan
After reviewing the link you provided, it appears that the Kryon task queue is responsible for hosting tasks and implementing the task-based calendar and other related settings.
However, my inquiry is focused on the work queue function, which is designed to support a robust RPA solution that can handle large volumes of transactional items and deploy multiple bots against the same process in order to meet SLA requirements. The work queue serves as a transaction database that stores all item-level transactional data and updates each item's data and status accordingly.
I would like to know if the Nintex Kryon RPA platform offers such capabilities. If not, what is the recommended approach in Nintex RPA for automating large volumes of transactional data with multiple automation bots in order to meet SLA requirements?
Hey @Sankarlal,
Thanks for the question.
To process large volumes of transactional data faster (for example, 10,000 records), we recommend splitting the input into several smaller files (for example, five files of 2,000 records each) and running each file with a different robot.
We recommend dividing the file rather than having multiple robots work on the same file.
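The split-and-distribute approach described above can be sketched in a few lines of Python. This is an illustrative example only, not part of the Nintex Kryon product; the file names and CSV layout are assumptions, and each output file would then be assigned to a separate robot:

```python
import csv
import math

def split_csv(path, num_parts):
    """Split one large transaction CSV into num_parts smaller files,
    preserving the header row in each, so every robot can process
    its own file independently (avoids contention on a shared file)."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)       # keep the header for every chunk
        rows = list(reader)

    chunk_size = math.ceil(len(rows) / num_parts)
    out_paths = []
    for i in range(num_parts):
        chunk = rows[i * chunk_size:(i + 1) * chunk_size]
        if not chunk:
            break                   # fewer rows than parts requested
        # Hypothetical naming scheme: data.csv -> data_part1.csv, ...
        out_path = f"{path.rsplit('.', 1)[0]}_part{i + 1}.csv"
        with open(out_path, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(header)
            writer.writerows(chunk)
        out_paths.append(out_path)
    return out_paths
```

For the 10,000-record example, `split_csv("transactions.csv", 5)` would produce five files of 2,000 records each, one per robot.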