Using Nintex Automated Cloud and SFTP/FTP to store a file
Hi Team,
Just wondering if people can help. I need to create a .csv file as the output for some forms and use SFTP (ideally) or FTP to store the .csv in a desired network location.
Have other people come across a way to store the output as a .csv? If so, how?
Also, can you save that file, or any other file, from within Nintex Automated Cloud using SFTP or FTP?
Cheers!
Hi @Viatarel
As far as I know there is no out-of-the-box connector to create CSVs; it has become a bit of a dated format now. There may be some third-party connectors that support it, like the Google Sheets connector perhaps, but I have not tested that.
It may be possible to just build the CSV string in the correct format yourself, since all the content of a CSV is comma-separated values with \n for new lines, but actually getting that content out as a file might be tricky.
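For example, the text a string builder would need to produce is just something like this (a rough Python sketch to show the shape of it; the column names and values are made up for illustration):

```python
# Illustrative only: build a CSV string by joining values with commas
# and rows with \n, the same thing a string builder action would do.
rows = [
    ["Name", "Department", "Approved"],   # header row
    ["Jane Smith", "Finance", "Yes"],
    ["John Doe", "Operations", "No"],
]

def to_csv(rows):
    # Quote any value containing a comma or quote so columns stay aligned.
    def escape(value):
        value = str(value)
        if "," in value or '"' in value:
            value = '"' + value.replace('"', '""') + '"'
        return value
    return "\n".join(",".join(escape(v) for v in row) for row in rows)

print(to_csv(rows))
# Name,Department,Approved
# Jane Smith,Finance,Yes
# John Doe,Operations,No
```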
Also, SFTP and FTP are not API-friendly technologies and require middleware to interact with. If no alternative solution is possible, then a custom service that converts the typical JSON output of a form/workflow to CSV and then uses an appropriate library to perform the SFTP transfer would be the only option.
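To make that concrete, a minimal sketch of such a middleware service might look like the below (the hostname, credentials, remote path, field names, and the choice of the paramiko library are all assumptions for the example, not anything Nintex provides):

```python
# Minimal sketch: receive the workflow's JSON output, convert it to CSV,
# then write the file to a network location over SFTP.
import csv
import io
import json

import paramiko  # third-party SSH/SFTP library


def json_to_csv(payload: str) -> str:
    """Convert a JSON array of form submissions into CSV text."""
    records = json.loads(payload)
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=records[0].keys())
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()


def upload_via_sftp(csv_text: str, remote_path: str) -> None:
    """Write the CSV text to the SFTP server (placeholder host/credentials)."""
    transport = paramiko.Transport(("sftp.example.internal", 22))
    transport.connect(username="workflow-svc", password="********")
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        with sftp.open(remote_path, "w") as remote_file:
            remote_file.write(csv_text)
    finally:
        sftp.close()
        transport.close()


if __name__ == "__main__":
    sample = '[{"Name": "Jane Smith", "Department": "Finance", "Approved": "Yes"}]'
    upload_via_sftp(json_to_csv(sample), "/exports/form-output.csv")
```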
Alternatively, have you thought about other methods of consuming Nintex workflow data? All data contained within a workflow can be exposed to an OData feed via our Insights platform, so whatever destination you have for the data might be better off pulling the information it needs from there. It requires minimal coding and gives you regular updates and real-time data.
Perhaps it's worth exploring the possibilities. Is there any hard limitation preventing the way the data is consumed from changing?
Jake
@Jake
Thanks for the information. I haven't dealt with the form data in JSON format yet. What workflow actions can you suggest for gathering the JSON output of the form so I can make a start?
Is there any relevant information you can point me towards for using the OData feed? I have looked at that for reporting, but it could be a simple solution here. When I previously searched, the information on connecting to OData feeds was very minimal.
Cheers
Hi @Viatarel
Any collection or object variable will already be JSON; it is our standard method of storing data, as it is generally the de facto data format for web APIs.
You can view the JSON by simply placing an object or collection variable in a Log to instance action, or you can use it as part of a string builder to store it in a text variable.
To export information from a workflow into the OData feed, it's important to get access to the Insights portal first; after that you can use beacons to export workflow data into the OData table. There are three types:
Standalone beacon - Used to store data to the feed at that point in the workflow; great for when values are known to change, or at the beginning of the workflow to track start data.
Beginning beacon - Used in a pair with an ending beacon to store data and track the time between beacons; best placed at the start of a chunk of the process considered to be a timed action, such as approvals or reviews.
Ending beacon - Used to store data and end the tracked time from a beginning beacon, and also to show how values have changed since the beginning beacon, such as an overall increase or decrease in a value.
Then, following this article, there are a number of options shown for how to access the data, one of them being Excel; perhaps you could build a single file template that, when refreshed, will update the information in the Excel workbook.
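If Excel isn't the right destination, the same feed can be read by any HTTP client using standard OData query options. Here is a rough sketch (the feed URL, column names, and auth token below are placeholders for illustration, not the real Insights endpoint):

```python
# Rough sketch of pulling workflow data from an OData feed over plain HTTP.
# The URL, column names, and token are placeholders; the real values come
# from your Insights / OData feed configuration.
import requests

FEED_URL = "https://insights.example.com/odata/WorkflowData"  # placeholder
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

# Standard OData query options: pick a few columns and filter by status.
params = {
    "$select": "InstanceId,StartDate,Status",
    "$filter": "Status eq 'Completed'",
    "$top": "100",
}

response = requests.get(FEED_URL, headers=HEADERS, params=params, timeout=30)
response.raise_for_status()

for row in response.json().get("value", []):
    print(row["InstanceId"], row["StartDate"], row["Status"])
```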