
Tech Blog


In this blog post I am going to demonstrate a production planning process powered by Nintex Workflow. Before we get into the production planning process, let us recap what we went through together in Part 1 of the GKK Compressor Industry series (i.e. the RFQ to Quotation process). In the Sales and Marketing division, the Product Catalog is used to create RFQs, which in turn generate quotations issued to customers. Every quotation issued to a customer updates the Sales Forecast with the increased number of compressors of each model to be delivered.


As a lean manufacturer, production planning is crucial to GKK Compressor Industry: the critical success factor is producing only what is needed, delivered on time. It keeps no unnecessary inventory, minimizing wasted inventory space. The tentative production plan always runs three months ahead of the current month, which helps Procurement keep material management efficient, knowing what material, and in what quantity, to order with advance knowledge of production requirements.


Key techniques in the production planning to be shared:

  1. Nintex Form Web Part embedded to the "Draft Production Plan" list/page
  2. "Loop" action alternative for immediate execution
  3. Formatting a monthly calendar with CSR


The Production Plan

The production plan is used in the production division to detail which compressor models, and in what quantities, are to be produced in a particular month. A production line has a daily production capacity that depends on machine and human resource capacity. If a production line can produce up to 60 compressors per calendar day at full capacity running three work shifts, two work shifts give a capacity of 40.


To get the actual production plan, we draft a production plan from the monthly sales forecast, which details the models and quantities to be delivered. The diagram below demonstrates how it is created, powered by Nintex Workflow. The sample plan shows how the quantity of each model to be produced is spread over different days, with a daily capacity of 30 pieces, for October 2017.

To produce a total of 71 compressors of the first model (i.e. SVT125-35), the work has to be spread over three days, with the third day producing the remaining 11 pieces required to reach 71. Since the third day produces only 11 pieces of that model, it has a remaining capacity of 19 to produce the next model in line (i.e. SHT277NA in the example).
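The spreading logic described above can be sketched in a few lines of JavaScript. This is a simplified model of the idea (it ignores weekends and uses plain day numbers); the model names and quantities are taken from the sample plan:

```javascript
// Sketch: spread each model's quantity over days, carrying the unused
// capacity of the last day forward to the next model in line.
// Simplified: days are numbered 1..n and weekends are ignored.
function spreadModels(models, dailyCapacity) {
    var plan = [];              // one entry per model: { model, perDay: {day: qty} }
    var day = 1;
    var free = dailyCapacity;   // capacity left on the current day
    models.forEach(function (m) {
        var left = m.qty;
        var perDay = {};
        while (left > 0) {
            var produce = Math.min(left, free);
            perDay[day] = produce;
            left -= produce;
            free -= produce;
            if (free === 0) { day += 1; free = dailyCapacity; }
        }
        plan.push({ model: m.model, perDay: perDay });
    });
    return plan;
}

var plan = spreadModels(
    [{ model: "SVT125-35", qty: 71 }, { model: "SHT277NA", qty: 19 }], 30);
// SVT125-35 -> {1: 30, 2: 30, 3: 11}; SHT277NA starts on day 3,
// using the remaining capacity of 19
```

The site workflow does the same arithmetic with calculation actions instead of code, but the carry-over of the last day's free capacity is the key idea in both.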


1. Nintex Form Web Part embedded to the "Draft Production Plan" list/page

I find the Nintex Form web part very useful, especially when I need to make a SharePoint page interactive, but it isn't discussed much. The Draft Production Plan is just one of the solutions where I use the Nintex Form web part to have the user specify the production plan parameters (i.e. month, year, capacity, and whether or not to include weekends as working days for the plan being drafted). To do that, I simply embed the "Create Production Plan" workflow's start form into the "Draft Production Plan" view/page, as shown below. Once the form is submitted (i.e. with the Start button in the example), the "Create Production Plan" site workflow is triggered, and the Draft Production Plan list is refreshed with the production plan drafted by the site workflow.


2. "Loop" action alternative for immediate execution

The algorithm I used for drafting a production plan is summarized below. I then realized it took hours for the workflow to complete a production plan, the reason being that the "Loop" action executes every 5 minutes by default, as it is driven by the SharePoint Timer Job. Even if I configured the Timer Job to run every 1 minute (the minimum interval allowed by SharePoint), it would still take a long time for the workflow to complete.

  Prepare an empty "Draft Production Plan" view (delete all existing list items)
  Query the "Sales Forecast" for models to be produced in the specified month
  For each model in the list
     Create a list item for the model to be produced
     Split the model's total quantity across the number of days required
     Loop through each calendar day of the production month
            Update the model's daily quantity on the list item (excluding or including weekends)
     End Loop
  End For each

We have no choice but to use the "For Each" iteration instead. Even a State Machine is still bound by the Timer Job interval constraint. To use "For Each", we need to calculate the number of loops required and create a collection of that size to iterate over. So, if the quantity to be produced is 71, dividing by the daily capacity of 30 gives 2.3667; we keep the whole number 2 and drop the fraction, since an index of 2 gives three iterations (i.e. the index starts at 0).

This holds as long as each model starts a day with the full daily capacity (e.g. 30), but we need to consider that the day a model starts on may already be partly used. In the sample production plan above, the quantity of 22 for model SNT207V takes two days to produce, because the day it starts has a capacity of only 1 (i.e. 30 - 29 = 1), as 29 was used to produce the model before it. To get the right iteration count, we add the quantity already produced that day for the previous model to the model's quantity before dividing by the capacity; the two calculation actions shown below give the formula for the required number of iterations.
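Assuming variables along the lines of the two calculation actions above, the arithmetic works out like this (a sketch; the Nintex calculation actions do the same computation without code):

```javascript
// Number of For Each iterations needed for a model. carryOver is the
// quantity the previous model already produced on the day this model starts.
// Math.ceil is the "whole number plus one" trick from the text, with the
// exact-multiple edge case handled correctly.
function iterationsNeeded(qty, dailyCapacity, carryOver) {
    return Math.ceil((qty + carryOver) / dailyCapacity);
}

iterationsNeeded(71, 30, 0);   // 3 days: 30 + 30 + 11
iterationsNeeded(22, 30, 29);  // 2 days: SNT207V starts on a day with only 1 left
```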


3. Formatting a monthly calendar with CSR

From the Draft Production Plan example above, I customized the "Draft Production Plan" custom list into a calendar look and feel by coloring the columns representing weekends grey. Client Side Rendering (CSR), a SharePoint feature, is used to get the calendar view.

SP.SOD.executeFunc("clienttemplates.js", "SPClientTemplates", function () {
    SPClientTemplates.TemplateManager.RegisterTemplateOverrides({
        OnPostRender: function (ctx) {
            var rows = ctx.ListData.Row;
            if (!rows || rows.length === 0) return;
            var month = rows[0]["Month"]; // 1-12
            var year = rows[0]["Year"];
            // Work out which days of the month fall on a weekend
            var weekends = [];
            var lastDay = new Date(year, month, 0).getDate(); // day 0 of the next month = last day of this one
            for (var d = 1; d <= lastDay; d++) {
                var date = new Date(year, month - 1, d);
                if (date.getDay() === 0 || date.getDay() === 6) {
                    weekends.push(d);
                }
            }
            // Grey out the weekend columns on every row
            for (var i = 0; i < rows.length; i++) {
                var tr = document.getElementById(GenerateIIDForListItem(ctx, rows[i]));
                for (var j = 0, len = weekends.length; j < len; j++) {
                    var td = tr.cells[weekends[j] + 2]; // +2 offsets the leading non-day columns
                    td.style.backgroundColor = "#eeeeee";
                }
            }
        }
    });
});

I have included the script as a JSLink file, as shown below.


Author: Palesa Sikwane

Do you need to scan through a SharePoint site's contents and apply some logic based on the existence of a content type? Or maybe even go as far as scheduling this? Then this post is for you!

  • Nintex® Workflow 2010
  • Nintex® Workflow 2013
  • or Nintex® Workflow 2016


And the applicable SharePoint® version for the Nintex® Workflow versions above.


Additional Information


Product used

Nintex® Workflow 2016 | Version: - International


Greetings from the warm and sunny side of Johannesburg, South Africa! It's finally springtime on this side of the world!


First things first, I would like to thank Michelle Barker for coming up with this question and requirement!



As you guys would know, reusable workflows have the following properties:

  • Are associated with SharePoint® Content Types
  • Because they are linked to a content type, they can be made available wherever those content types are used in a site collection

Some of the advantages of using a reusable workflow are:

  • Having one central place to make changes, publish a workflow, and push it down to every list or document library
  • Making use of site templates that have content types with workflows linked to them


Some of the disadvantages of a reusable workflow:

  • It cannot be scheduled out of the box
  • It cannot be started by a site workflow, which means you cannot really schedule it that way either


What if then you were faced with a requirement that needs you to run a scheduled workflow based on a content type which could span across different lists and libraries? Maybe you have a requirement to send out reminder(s) at a certain frequency for certain document types? 


How would you go about it?

Well there is a way that involves:


1. Finding all of your lists on your site

2. Getting all linked content types for each list

3. Per content type, Switch and apply your logic.


I built this using the following site workflow:


1. Call web service (Get List Collection - Internal Names)

  • This step involves calling the GetListCollection SharePoint webservice:
    • This allows us to get a collection of all the Lists and Libraries in our current site. The collection will store the InternalName of our lists, or what some of you might refer to as the GUID.
  • I've configured this workflow action as follows:

    Configuring Get List Collection Webservice to get the InternalName


Note: I've configured this to actually return the InternalName of my lists or the GUIDs in a collection variable called ListCollection. You will see how and where we use this later...

2. Collection Operation

Here I simply count how many lists are in my collection and store this in a variable called No Lists:



Note that I also log my result within the same action.

3. For Each

This is my first For Each where I basically loop through my ListCollection I retrieved in Step 1, and store the InternalName or the GUID of each list in a variable called ListGUID. The configuration is as follows:



3.1 Log in history list

Here I basically log the current list's index in my collection (ListCollection) as well as the ListGUID:


3.2 Call web service (Get List Content Types)

Next, I call another SharePoint Web Service called GetListContentTypes based on the current ListGUID that my For Each in Step 3 is looking at.


I then store the results in a collection variable called List Content Types. This step is configured as follows:


Note: The results returned here are in XML, and one of the nodes in the XML contains the Name of the content type, which is what we want.


3.3 Query XML

Seeing that Step 3.2 returned some XML, I simply query it in this step to get the display names of my content types and store them in a collection variable called Content Type Names. I've configured this step as follows:
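For orientation, the XML returned by GetListContentTypes looks roughly like the fragment below. This is an illustrative sketch, not a captured response: the IDs and content type names are made up, and the real response is wrapped in SOAP envelope elements and namespaces, which the Query XML configuration has to account for:

```xml
<ContentTypes>
  <ContentType ID="0x0101..." Name="Document" />
  <ContentType ID="0x0120..." Name="Folder" />
</ContentTypes>
```

An XPath expression along the lines of `//ContentType/@Name` is then enough to pull each content type's name into the collection.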


3.4 For Each

This is my second For Each; which allows me to iterate through each content types name stored in my collection Content Type Names:


3.5 Log in history list

Here I simply log the name of my content type:


3.6 Switch (per content type)

Note that I haven't put this in my workflow, but this is where the magic happens for you guys. 


So because we've found content types for any list or library across our site with the previous steps, we can use a conditional action such as a Switch to put in some workflow logic as required per content type! 


Running this workflow will give you the following result in your workflow history:

So as you can see my Workflow scans the current site, and for each list and library returns the associated content types. 

Feel free to download this and play around with it, and do what you will with it. Maybe even make it a User Defined Action?


Let us know how you guys used this!


I hope this helps someone out there! 



Several community members came close, but didn't quite get this one, so I'm extending it through the month of October!


This month, instead of a special errand, we're going to reward you for what you do anyway. Think of it as a bonus.

For every fifth time your answer is marked correct in the month of September, you'll get a bonus 300 points on top of the usual points you get for having your answer marked correct.

Get ten answers marked correct, that's an extra 600 points!

There's only one catch: you have to tell me which ones you provided, by posting links to your correct answers below.

Anyone who gets five correct answers also gets this Mission Possible badge.



Good luck!



The not-so-fine-print:  It doesn't matter when you posted an answer or when a question was asked. It just matters that the answer was marked correct (not by yourself) in the month of September, 2017.  So if a question was posed in August of 2015, and you provided an answer, but the person who asked it didn't mark your answer until Sept 5, 2017, then it counts toward your total this month. But if you ask a question, answer it and mark your own answer, it doesn't count. 

RFQ to Quotation

We all know we are here because we work on something related to Nintex Workflow or Forms, and the reason we use them is that they make our lives easier. Things have changed a lot: the trend is towards the cloud, and hybrid environments have become so common that most of us work on an on-premises platform and in the cloud at the same time. Regardless of which platform you use, you will find Nintex helps.


In Part 1 of the GKK Compressor Industry blog series, I am going to share exactly how I use my hybrid environment to save effort for my recent investment: GKK Compressor Industry. GKK Compressor Industry is a lean manufacturer producing world-class compressors. Moving towards being a Six Sigma company, simplifying processes and reducing errors falls under its Lean Six Sigma project mission to turn the company into a highly effective and efficient one.


RFQ (i.e. Request for Quotation) is one of the key Sales and Marketing processes involving customers. The RFQ to Quotation figure shown above demonstrates how customers or internal sales fill in an RFQ form powered by Nintex Forms. The outputs of RFQs are the Sales Forecast (i.e. used for production planning) and Quotations (i.e. issued to customers). The process is simplified at GKK Compressor Industry: Nintex Workflow automates the RFQ process by getting the unit prices for the requested compressor models, auto-generating a Quotation in Excel format, and finally updating the Sales Forecast with the quoted compressor models. The RFQ process not only simplifies the process with fewer steps, it also eliminates potential human errors by auto-generating the required quotations.


The quotation generation is done by simply calling a Nintex Workflow Cloud workflow from the RFQ process running in the SharePoint environment. It's worth taking a trip to Nintex Workflow Cloud for quote generation, as we realized it supports OpenAPI (i.e. Swagger) through its Xtensions framework. We used Xtensions to include Microsoft Graph API connectors in Nintex Workflow Cloud, helping us create quotations based on our pre-designed Excel quotation template; as we only needed a few functions to create the Excel quotation, we brought in only the few Excel-related endpoints of the Microsoft Graph API that we needed.


Microsoft Graph API - Excel

Based on the connectors defined and shown under the Microsoft Graph API - Excel action group, you will notice there is no connector to create or copy an Excel file. This is because I used Nintex Workflow Cloud's default SharePoint connector to copy the Excel quotation template to a new quotation with the name I provided. The SharePoint "Copy a file" connector's configuration is shown below.

SharePoint connector - Copy a file

Once the new file is created/copied from the pre-designed template, what I need is basically:

  1. "Add table rows" for quotation items
  2. "Delete table rows" for unwanted rows in the excel table
  3. "Update a nameditem" for its value
  4. And so on…
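As a sketch of what the "Add table rows" connector ends up calling, the Microsoft Graph workbook API exposes a rows/add endpoint. The snippet below only builds the request URL and JSON body; it does not authenticate or send anything, and the item ID and table name passed in are hypothetical placeholders:

```javascript
// Sketch: build a Graph "add table rows" request for a quotation line.
// itemId and tableName are placeholders, not real identifiers.
function buildAddRowsRequest(itemId, tableName, rows) {
    return {
        method: "POST",
        url: "https://graph.microsoft.com/beta/me/drive/items/" + itemId +
             "/workbook/tables/" + tableName + "/rows/add",
        body: JSON.stringify({ values: rows })
    };
}

var req = buildAddRowsRequest("ITEM-ID", "QuotationItems",
    [["SVT125-35", 71, 1250.00]]);
// req.url targets the "beta" endpoint, per the note below about beta vs 1.0
```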


I have attached my swagger file for the Microsoft Graph API - Excel connectors. A few notes if you want to implement the Graph API for Excel using the swagger file shared in this blog:

  • To enable the connector, you will need to create an Azure Active Directory application (here is my previous blog on how to create one)
  • The Excel-related operations of the Microsoft Graph API seem to work only with the "beta" version (i.e. not the "1.0" version) for files residing in a SharePoint library (I am not sure why, but I only managed to get it working with the "beta" version)
  • There are two ways to create the Azure Active Directory app: via the new Azure Portal, or via the old portal (I only got it working with an app created in the old Azure portal)
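For orientation, a Swagger 2.0 definition for one such Xtension boils down to something like the fragment below. This is not the attached swagger file, just an illustrative sketch trimmed to one path, with security definitions and response schemas left out, and the path pattern assumed from the Graph "add rows" endpoint:

```json
{
  "swagger": "2.0",
  "info": { "title": "Microsoft Graph API - Excel", "version": "beta" },
  "host": "graph.microsoft.com",
  "basePath": "/beta",
  "schemes": ["https"],
  "paths": {
    "/me/drive/items/{itemId}/workbook/tables/{tableName}/rows/add": {
      "post": {
        "summary": "Add table rows",
        "operationId": "AddTableRows",
        "parameters": [
          { "name": "itemId", "in": "path", "required": true, "type": "string" },
          { "name": "tableName", "in": "path", "required": true, "type": "string" },
          { "name": "body", "in": "body", "required": true,
            "schema": { "type": "object" } }
        ],
        "responses": { "200": { "description": "Rows added" } }
      }
    }
  }
}
```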


It's our birthday!


The community just passed three years of answering questions, finding answers and sharing knowledge.  And the monthly mission, as usual, gives the gift to you.


It's a bit involved, and I'm asking a lot... but hopefully you'll spread the Connect birthday love.

Here's what we ask of you in a SINGLE REPLY BELOW. And DO NOT reply until you're ready to post EVERYTHING.

In your reply below post the following:

  1. A link to a UserVoice suggestion you've voted on
  2. Visit and click "actions" > follow on three sets of release notes
  3. Update your status
  4. Update your avatar (doesn't HAVE to be a real photo of you, but if it is - secret bonus points!)
  5. Tag (using "@" then their name) three people who've helped you in the community.
  6. And then post a screenshot showing you on Twitter holding a cupcake, cake or treat of your choosing, saying: "Happy birthday #NintexConnect! Three years of connection! #Nintex"


If you do those things and post the images to prove it below, you will win a present!

No. Not a real present. But you'll win this present:

And it comes with 200 very real points in the community to boost your street cred!


Thank you for three years of questions, answers and sharing! May we see many more!

Greetings!!


Finally back in South Africa from Namibia!


And I'm excited to bring you part 3 of my Time sheet blog series. If you guys remember, in part 2 I didn't get to cover the Time Sheet form below:



And so in this part I've included a breakdown of how the Time sheet form has been put together, as suggested by David Heineman.


I've made a roughly 29-minute video explaining the form in greater detail, and also shared some tips and tricks that should help you on your Nintex journey.




Hope you guys enjoy this 


Have a great week everyone!



Author: Palesa Sikwane

An update on Nintex® Workflow Microsoft® Dynamics Live Actions and Native CRM Actions

Affected Product(s)
  • Nintex® Workflow 2013
  • Nintex® Workflow 2016
  • Nintex® Workflow for Office 365 
Release Notes

Nintex Workflow 2013 - Release Notes 

Nintex Workflow 2016 - Release Notes 

Nintex Workflow for Office 365 - Release Notes 

Screenshots

Nintex® Workflow 2013





Nintex® Workflow 2016



Nintex® Workflow for Office 365 ®



Greetings everyone,


I'm pleased to announce an update on both the Microsoft® Dynamics CRM Connector as well as the Nintex® Live Microsoft® Dynamics CRM Actions  


So how do you get this new update?


On premise (2013/2016)
Download and install the latest setup files and follow the update guides below:


Updating Nintex - What you need to know

Product Update Process


Office 365

The update is deployed automatically to our Nintex® Workflow App


What's special about this new update?


Everything stays the same and works the same; the key difference is that you can now connect to CRM 2016 on premises as well as CRM 2016 Online!




Just two months after we let the world know we passed 5 million views of correct answers, the number has jumped up over SIX million!


More than six million times, someone has viewed an answered question, so the community continues to speed ahead!


That means they may have avoided starting a support ticket. Or they learned something that saved time. Or they passed it on to a colleague and saved that person some time or spared them some frustration.  That's what community is all about!


You can see in this chart the increase in views over the last third of the community's existence.


6 million views of answered questions


That's due in large part to two things:  

1. When community members remember to click "mark correct" on replies to their questions.  Sure, they could just read replies that pop into their email. But taking the time to return and mark an answer correct is so helpful for other community members.  Read more on that here: Remember To "Mark Correct" 


2. The efforts of the Blue Ribbon Group members of this community. They VOLUNTEER their time to help out the community by answering a lot of questions and by marking answers that they know are correct. It's a big help to everyone. Thank you, Blue Ribbon Group!



To learn more about how to get the most out of the community, including using search, finding your way around and earning points, check out the following posts:


Get Involved in the Nintex Community

Making the Most of Community Features

Quick Tip: Get the Most From Your Newsfeed

New Look To Connect!  More Bling For Your Blog!

One central function of a community is to provide useful information for others to resolve their own questions.  In Nintex Connect, a lot of people come seeking answers to questions they have about Nintex Products.  To be a helpful community member,  remember to click "mark correct" on replies to your questions when you get them.


What difference does it make? 


A lot.  Allow me to peel back the curtain a bit...


For starters, marking an answer correct helps that question show up higher in search results. That's right: a question with an accepted answer gets treated with greater value by the community's search algorithm and shows up higher in search results.

Take a look:

search results


Not even Google gives you that value.  Here's the same search from Google. Sure, it tells you there's community content, but it doesn't tell you what's the most valuable content!

search results w google only


Which is why I ALWAYS recommend that after you land in the community, use the search function for your question and see what results pop up. Chances are someone else has asked/answered a similar question, and you could find the info you seek.


And then, as a visual cue, marking a correct answer changes the blue question icon to a green checkmark. That is a quick, easy way to see at a glance that there's a correct answer! See:

correct answer pic


So, even though it's convenient to read an email from the community, see a suggestion to try and then go about your business, please take a moment to click the link to the discussion and mark an answer correct if you find one.



For those of us who have tried, or are in the process of evaluating, integrating Nintex Workflow Cloud into a solution, you will eventually get to the question "Why should I integrate Nintex Workflow Cloud?", especially if you code your solution from scratch on a platform such as .NET or Java, where writing code is the default way to support any business logic or process in the solution you are building. Here is my two cents' worth on that question.


Before we get into the why, let us look at another example. Take another solution platform for the same question: SharePoint, which is considered a COTS (i.e. commercial off-the-shelf) solution for intranet/collaboration portals. We do not expect a lot of coding or customization; out of the box, SharePoint supports document management with its Document Library content types, and supports creating custom records using Custom Lists. When it comes to automating a document or record created in SharePoint, however, it is not going to be easy, especially when we expect it to remain COTS. Nintex Workflow is just the right fit to complement that weakness, making process automation possible on SharePoint.


So now the question is: when you create your solution from scratch on a platform such as .NET or Java, you will by default code everything yourself, which gives you a lot of flexibility and power. Similarly, if you are using Mendix for your solution, it has what it calls Microflows to support the logic behind defined objects, events, and so on. Why would you consider integrating Nintex Workflow Cloud when the platform itself supports business logic?


Reasons to integrate Nintex Workflow Cloud to your solution(s)

Taking Nintex Workflow Cloud to replace some of the building blocks of your .NET or Java solution saves you the effort of reinventing the wheel. The building blocks of a custom-built solution usually consist of different modules to handle different function groups, for example:

  • Security/Login module
  • User Management
  • UI (Portal Pages, Forms, Views, etc.)
  • Report Management
  • Business Process/Workflows
  • Document Generation/Processing
  • Electronic Signature

Example: solution architecture diagram of a custom-developed solution


1. Workflow Module

In the solution architecture of a custom-developed application, we often modularize it into different modules or design blocks. One of the benefits of modular design is that modules can be reused in another solution, and some modules can be replaced with ready-to-go solutions. The example solution architecture diagram above illustrates the need for a workflow engine, which is better handled by a ready-to-go solution such as Nintex Workflow Cloud; that makes a lot more sense than developing all the workflow functionality and management from scratch.


2. Document Generation

The scenario above tells us that we would need to build all of these modules; of course, if you have built one before you can reuse it, but if not you will need to spend tremendous effort writing one. Nintex Workflow Cloud comes with some niche and unique features, such as Document Generation. Automated document processes are common in today's business: the majority of solutions require documents to be created manually from, for instance, a Word template, and these processes can be improved by the Document Generation feature of Nintex Workflow Cloud. If document creation is to be automated, you will find it challenging, as there are not many API options for doing so, and it becomes more challenging still if you need to keep the solution up to date with ever-evolving document APIs. This is a good opportunity to pass the job to Nintex Workflow Cloud: once it has generated the document, it saves it to a specified shared drive, where the initiating program can pick it up.


3. System Integration

One of the strengths of Nintex Workflow Cloud is its rich set of connectors and actions supporting integration with other systems. Furthermore, with its compliance with the Swagger standard, it is easy to add connectors to systems that are not covered by the default connectors. The capability to integrate with external systems is always a huge area in a solution, and it usually requires huge effort to build and maintain; leveraging Nintex Workflow Cloud for it not only saves the effort of building one, but also provides flexibility when it comes to extending connectors to other systems.


Again, this is just my two cents' worth on the question, with just a few examples. I believe there are many more reasons to explore, such as public web forms: if your solution is an intranet-based deployment, most of the time you are not going to expose it to the public-facing internet, and leveraging Nintex Workflow Cloud's anonymous web forms could be a quick answer to such requirements.


The concept of modular design, with Nintex Workflow Cloud integrated as the workflow module, not only saves you the effort of creating one from scratch; it also helps when you hand the solution over to a customer after the development project is done, clearing away the hassle of maintaining or troubleshooting the workflow module. IT platforms are patched and updated at a fast pace today, and those patches and updates introduce a huge maintenance burden. Getting Nintex Workflow Cloud to handle some of this functionality minimizes the risk and the need to keep maintaining the code.

In Part 1 we explored the new Nintex SharePoint Connectors for our Domestic Animal Registration example and successfully published a Public Web Form and added our responses to SharePoint Online which triggered an internal review and approval process. 


Part 2 is all about our backend process, to save our users even more time and effort. Remember those paper and emailed forms our Chris City Council workers received? Well, once they've approved them internally, they need to update a central CRM (Salesforce) by adding the record. They either do this by typing directly from the paper form or by copying and pasting from the emailed PDF. We also need to inform the member of the public who submitted the form and provide them with a Proof of Registration, a council-generated document that we also produce manually. So let's get started.


I'm going to create a backend process workflow in Nintex Workflow Cloud with a Start event of External start. This allows me to initiate the workflow from my Office 365 internal process. It's a lot easier to connect directly to the number of SaaS applications I'm working with here than from within my Office 365 tenant (however, Nintex does offer Salesforce and Dynamics CRM connectors as part of the Office 365 product). I've also replicated the variables from part 1 so I can reuse the same info throughout the process.


I'm going to introduce a little smarts here in case I were processing this in batch using Nintex Workflow Cloud's Scheduled Start (for demo purposes I want this to kick off as an entire process, but some organisations might want it batched). So I'll make sure that only those items in my list that have gone through the approval process are picked up for processing, using the SharePoint Online Query a list action, and because this may return more than one item, I'll need a Collection variable declared to iterate through.


The next group of actions I need to do are all related to whether an Application has been marked as Approved. I need the process to - 

  • Retrieve application details
  • Generate a unique certificate number
  • Generate a document and store in an electronic file share ( e.g. Box, Dropbox, OneDrive, Google Drive)
  • Create a Salesforce record
  • Contact application via SMS and email certificate
  • Update my SharePoint Online item to Completed


For Each 'Approved' application let's retrieve those details from SharePoint Online and Generate a Certificate Number.


Now we need to generate that certificate. It's my pleasure to introduce you to Nintex Workflow Cloud Document Generation: platform-agnostic cloud document generation, complete with Nintex Document Tagger capabilities. Essentially, the Nintex Document Tagger opens up all the workflow and item property references that I can merge into my cloud-stored document at execution time. Really clever stuff. I just copy the variable tag I want and paste it into the template. And the awesome news: Document Generation is now Generally Available!


OK, unique document generated, simples. Next we need to put this into our structured data repository, Salesforce CRM. Could equally be Dynamics CRM as Nintex have connectors for that. I've created a custom object in Salesforce for my Domestic Animal reg records so we can even find that via our Salesforce connectors. 


We want to attach that generated document to an email for our applicant's records, and we need to check back in with SharePoint Online to update the status of the application now that everything is 'Completed'.



Not long to go. I just need to let my applicant know that their animal has been registered with the council and provide them the certificate of registration. Given the heavy reliance on mobile phones and instant notifications, we'll SMS a note to say the animal has been registered using Twilio, and we'll email the certificate to the applicant.


My very last stage is to publish this workflow and grab that External Start URL so I can initiate it from my Office 365 SharePoint Online Nintex workflow, and then we're done.



Just to recap, across parts 1 & 2 we've covered a considerable amount of Nintex Workflow Cloud functionality to fully digitize my Chris City Council external application process -

  • Public Anonymous Forms
  • Create an Item in Office 365 SharePoint Online
  • Initiate a Nintex Workflow Cloud workflow from Office 365 SharePoint Online
  • Referenced that Office 365 SharePoint Online list
  • Generated a document from a cloud stored template with my item data and sent that to our applicant
  • Created a Salesforce record for a custom object
  • Sent SMS and email confirmation
  • Updated an item in an Office 365 SharePoint Online list


Workflows do fail. External forces or unforeseen conflicts arise and force us into troubleshooting mode. Deciphering the root cause of workflow errors can be time-consuming. When delivering materials to a support team such as Nintex Support, we want the most accurate information possible, and we want to get it out the door quickly.


To better respond to my clients' needs for rapid workflow error remediation, I've come up with a PowerShell script which will process the ULS logs of an on-premises SharePoint farm, locate Nintex Workflow-specific correlation tokens, and output a series of ULS log files directly aligned to all workflow instances over the given time period, all automatically, with minimal input from you, the SharePoint/Nintex administrator. Have a look!





Purpose


To output one ULS log file for every Nintex Workflow instance reported in the trace log by SharePoint over a given period of time.



Requirements


SharePoint Management Shell/SharePoint PowerShell cmdlets.



Parameters


  1. StartTime - passed to Merge-SPLogFile as the date and time at which to start searching
  2. EndTime - passed to Merge-SPLogFile as the date and time at which to end the search
  3. Version - the Nintex Workflow version number, as logged by SharePoint in the trace log
  4. OutputFolder - where you want the ULS log files saved



Output Files


  1. NWTokens-[Guid].txt: all Nintex Workflow events which occur between StartTime and EndTime
  2. NWTrace-[Guid].txt: one file for each Nintex Workflow instance that appears in the trace log


PowerShell Script


Param (
    [DateTime] $StartTime,
    [DateTime] $EndTime,
    [string] $Version,
    [string] $OutputFolder
)
Write-Host "Locating Nintex Workflow $Version instances between $StartTime and $EndTime"
$guid = [System.Guid]::NewGuid()
$guidString = $guid.ToString()
$path = "$OutputFolder\NWTokens-$guidString.txt"
Merge-SPLogFile -StartTime $StartTime -EndTime $EndTime -Area "Nintex Workflow $Version" -Path $path
$content = Get-Content $path
if($content -eq $null) {
    Write-Host "No Nintex Workflow $Version records found in the ULS logs"
} else {
    $counter = 1
    $table = @{}
    do {
        # Each merged log line ends with a 36-character correlation token (a GUID)
        $token = $content[$counter].Substring($content[$counter].Length - 36)
        if([System.String]::IsNullOrEmpty($token) -ne $true) {
            if($table.ContainsKey($token) -ne $true) {
                $table.Add($token, $token)
            }
        }
        $counter = $counter + 1
    } while( $counter -lt $content.Length )
    foreach( $token in $table.Keys ) {
        Write-Host "Exporting Nintex Workflow $Version Log for Correlation: $token"
        $tpath = "$OutputFolder\NWTrace-$token.txt"
        Merge-SPLogFile -StartTime $StartTime -EndTime $EndTime -Correlation $token -Path $tpath
    }
}


This is a Rev.1 and as such could be greatly improved. Comments/suggestions welcome!
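Under the hood, the token-harvesting loop is simple: take the trailing 36 characters (a GUID) from each merged log line and dedupe. Here is that core idea as a small, language-neutral Python sketch; the log lines below are made up purely for illustration:

```python
def extract_tokens(lines):
    """Dedupe trailing 36-character correlation GUIDs, mirroring the script's do/while loop."""
    tokens = []
    for line in lines[1:]:  # skip the header line, like the script's counter = 1
        token = line[-36:]
        if token and token not in tokens:
            tokens.append(token)
    return tokens

# Hypothetical merged-log lines ending in correlation tokens
logs = [
    "Timestamp Process TID Area Category EventID Level Message Correlation",
    "... workflow started ... aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
    "... action executed ... aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
    "... workflow started ... 11111111-2222-3333-4444-555555555555",
]
print(extract_tokens(logs))
```

Each unique token then drives one Merge-SPLogFile export in the real script.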

Only recently have I started using the "Document Generation" action instead of the "Update Word Document" action.





I didn't realize what I had been missing out on. Suddenly, we can insert tables and images easily, as long as we have our populated variables and templates set up correctly. And the photos can be auto-resized! AND CAPTIONED! Craziness. This is going to save coworkers a LOT of time!


I've taken care of the latter part (the template) for you, and it's attached to this post. The rest of this blog will teach you how to put it into use!


Let's focus on the idea of having an inspector out in the field, taking photos, and writing captions using a repeating section, all via Nintex Mobile - maybe on an iPad. I'll walk you through how we bring those pieces together, and if you need more detail on anything, there's a "further reading" section at the bottom.



Things of Note: 



  • Doc Gen allows 10 images per docx template
  • Doc Gen is subscription based - if you are on prem you may need to have it activated; if you are on O365 it should be there (I believe with 50 trial Doc Gens but I am not sure)
  • Resizing images requires placeholders - which is why this is offered in a template
  • This article will not cover how to build the images & captions template itself - but - that will be in a future post!
  • This article will also not cover how to deal with an unknown amount of pages - but - also, another future post. Probably part of the same one, actually. Who knows!





Before the Workflow


Before jumping right in, you'll need to do a bit of preparation: 


  • Download the template attached here and upload it into your SharePoint site
  • Download the placeholder0.jpg image and upload it into your SharePoint site
  • Set-up an image library for your photos to go into, and ensure that you have a column for a unique Form ID





Create The Workflow


Our workflow will have 4 main elements:


  1. Send the image attachments to a library;
  2. Go through the image URLs;
  3. Go through the captions;
  4. Build the document via DocGen


1 - Sending the Image Attachments to a Library


  • Use "Copy to SharePoint" action
  • Ensure that you have "Copy Item Metadata" checked
  • Select your image library as the place to copy to



2 - Going Through the Image URLs


  • Set up 10 single line of text variables called Image1, Image2, etc., all the way up to Image10 - and make sure you set each variable's "default value" to the URL of placeholder0.jpg (from our Before the Workflow steps). If you don't, the workflow will error!
  • Set up 1 single line of text variable called Image0 - this is just where we'll move the URLs to & from
  • Set up 1 collection variable for your image URLs
  • Set up 1 num variable for your Index


  • Use the "Query List" action and point it to your Photo library - filter by the unique Form ID, sort by ID, and drop the "Encoded Absolute URL" field into your collection



  • Now that we have our collection, it's time to party! Initialize your Index by setting it to 1 using the "Set Variable" action
  • Create a "For Each" action and set it as follows:
    • Target Collection: Image URLS collection
    • Store Result In: Image0 variable
  • Create a "Switch" action and base it on the Index variable you created. Create a branch for 1 through 10, and an "Other"
  • In each branch, use a "Set Variable" action to set the appropriate numbered variable (e.g., Image3 in path 3). Image0 will carry the data we need into each path, so the action in path 2 should read: Image2 is equal to Workflow Data: Image0
  • Use a "Calculate" action to add 1 to the index variable.
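If the For Each / Switch / Calculate pattern feels abstract, here is the same logic as a small Python sketch: every slot defaults to the placeholder URL (matching the variables' default values), and each URL from the collection lands in its numbered slot. The URLs and placeholder path below are hypothetical:

```python
PLACEHOLDER_URL = "https://yoursite/SiteAssets/placeholder0.jpg"  # hypothetical path

def fill_image_slots(urls, slots=10, placeholder=PLACEHOLDER_URL):
    # Every slot starts at the placeholder, then the For Each + Switch
    # equivalent copies the Nth URL into ImageN.
    images = {f"Image{i}": placeholder for i in range(1, slots + 1)}
    for index, url in enumerate(urls, start=1):
        if index > slots:
            break  # the template caps out at 10 images
        images[f"Image{index}"] = url
    return images

result = fill_image_slots(["https://site/photos/1.jpg", "https://site/photos/2.jpg"])
print(result["Image1"], result["Image3"])
```

Unused slots keeping the placeholder is exactly why the default values matter: Doc Gen still resolves Image3 through Image10 even when only two photos were attached.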




3 - Go Through the Captions


This is basically the same as the Image URLs bit, with one main difference... we're dealing with a repeating section now!


  • Create multi line text variables for Cap1 - Cap10.
  • Create 1 multi line text variable called Cap0 (this will be our holding variable)
  • Create 1 num variable to store a count of the items (the children) in the repeating section
  • Re-initialize your num Indexing variable (from before) to 1
  • Use the Repeating Sections UDA to get captions from the repeating section of your form. (Detailed articles on this topic are at the bottom of this post.) Be sure to get the count of caption children into your count variable!
  • Create a "Loop" action and configure it as follows:



  • Add a "Query XML" action and Query your repeating section data (from the UDA)
  • Use //Items/Item[{WorkflowVariable:numIndex}]/NameOfYourCaptionEntryField to get items from your data, and, store it in Cap0
  • Again, just as before, use a switch with paths 1 - 10 and "Other", then set each unique caption variable with the "Cap0" data being passed through.
  • Use the "Calculate" action to add 1 to the Index after each pass
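If you haven't used positional XPath before, the Query XML step above can be sketched outside Nintex. A minimal Python illustration follows; the XML shape mimics typical repeating-section output, and CaptionField stands in for NameOfYourCaptionEntryField (both are hypothetical here):

```python
import xml.etree.ElementTree as ET

# Hypothetical repeating-section XML, shaped like the UDA output
xml_data = """<Items>
  <Item><CaptionField>Crack in north wall</CaptionField></Item>
  <Item><CaptionField>Water damage near vent</CaptionField></Item>
</Items>"""

root = ET.fromstring(xml_data)
count = len(root.findall("Item"))
captions = []
for index in range(1, count + 1):
    # Same positional-predicate XPath as the Query XML action, with the index substituted in
    captions.append(root.find(f"Item[{index}]/CaptionField").text)
print(captions)
```

Note that XPath positions are 1-based, which is why the workflow's Index variable starts at 1 rather than 0.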


4 - Build the Document via Doc Gen


  • Add the Document Generation action to your workflow
  • Choose the Generation Type of "Original File Types"
  • Click on "Add Document Template" and browse to the template you uploaded earlier
  • Use a workflow data variable for your Output File Name - I suggest using the FormID!
  • Choose an Output location for your document to go to.


Since the template I created for you is already tagged with Cap1 - Cap10 and Image1 - Image10, you shouldn't need (I think) to make any changes to see it work with your own form attachments. ...You'll probably want to remove my logos, though.


Save & Publish!




Congratulations, you now have a tidy little document with 2 resized pictures per page alongside their captions.  This simple workflow could save your colleagues up to 30 minutes per document generated. Check it out!



Further Reading

That was basically the need of one of my customers, who works heavily with PHP and wants to automate some things by integrating online services (including O365... and now that the SharePoint Online connector is available, it opens up a whole new range of capabilities!)


Of course, you can kick off an external start workflow from anywhere - any language, any platform - there is no problem with that. But the syntax may not be obvious to everyone in every language.


PHP... mmm, it's been years since I'd written any PHP code. But it's like riding a bike: you never forget!


You don't need any specific libraries like Zend or cURL. You can do it with native PHP.

After a few tries...I got there.

So, here is how you call an NWC external start workflow, from any PHP page.




// $url and $token come from your published workflow's External Start settings
// (the values below are placeholders)
$url = "https://yourtenant.workflowcloud.com/api/external-start";
$token = "?token=YOUR-TOKEN";

$data = array(
    "se_param2" => "I said a hip hop the hippie the hippie to the hip hip hop, a you dont stop");

$dataOpt = array(
    "callbackUrl" => "");

$masterData = array(
    "startData" => $data,
    "options" => $dataOpt);

$options = array('http' => array(
    'method' => 'POST',
    'header' => "Content-Type: application/json",
    'content' => json_encode($masterData)
));

$context = stream_context_create($options);
$result = file_get_contents($url.$token, false, $context);


AND MAGIC HAPPENS... Your NWC workflow will start...
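And since external start is just HTTPS plus JSON, the same call ports to any language. As a rough sketch, here is an equivalent in Python using only the standard library; the URL and token values are placeholders you'd copy from your published workflow's External Start settings:

```python
import json
import urllib.request

def build_external_start_request(url, token, start_data, callback_url=""):
    """Build the POST that kicks off an NWC external start workflow."""
    body = json.dumps({
        "startData": start_data,
        "options": {"callbackUrl": callback_url},
    }).encode("utf-8")
    return urllib.request.Request(
        url + token,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_external_start_request(
    "https://yourtenant.workflowcloud.com/api/external-start",  # placeholder URL
    "?token=YOUR-TOKEN",                                        # placeholder token
    {"se_param2": "I said a hip hop the hippie the hippie to the hip hip hop"},
)
# urllib.request.urlopen(req) would actually fire the workflow
print(req.get_method(), req.get_header("Content-type"))
```

The request body has the same startData/options shape as the PHP version above.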

Over the past few weeks I've had this question come up a few times, and whilst it's not a complicated process and there is information available in the Help, I thought I'd quickly put together a blog post that goes through connecting your on-prem SharePoint environment up to Nintex Hawkeye for the first time.


This post assumes you already have your Nintex Hawkeye tenant provisioned, whether that's a trial or part of your subscription. If you need a trial, head to Try Nintex for 30 Days - Free!

  1. Firstly, log into Nintex Hawkeye and navigate to the "Nintex data sources". 
    1. Select "Add Nintex data source" - in this post I'm going to add a SharePoint 2013 source
  2. Download the installer package and, if you aren't on the SharePoint server, copy it there; the installer must be run on the SharePoint server. 
  3. Extract the zip file. Inside there will be 3 files: 
    1. Deployment Guide
    2. Uninstall Script
    3. Executable
  4. Run the executable. (Note: if you already have the Nintex Hawkeye collector installed, you can just run the installer and it will update the installed version.)
  5. The installer will ask for your token and discovery URL. You should only need to provide the token, which needs to be copied from the Nintex Hawkeye portal (as you can see above)
  6. Once filled in, select Next. It will prompt for an IIS reset, which you can choose to do then or later. 
  7. Assuming you let it reset IIS, it will then finish the install and open Central Administration
  8. Click on the "Nintex Hawkeye Management" link in the menu 
  9. Click Manage database and, on the Manage Database page, specify the required database settings, and click Create.
  10. In Central Administration, click "Application Management", then "Manage services on server" under "Service Applications". In the Services column, locate the "Nintex Hawkeye Service" and click Start.

  11. In Nintex Hawkeye, click Nintex data sources on the navigation bar. You should now see your environment displayed with an "Authorize" button. Click the Authorize button to connect the data source.

  12. Activate the feature on the web apps you wish to collect data from. This activates the Nintex Hawkeye Beacons, which allow you to collect specific data within your processes. But we will talk about that in a whole other post. 
  13. In Central Administration, select System Settings, then Manage Farm Features. Locate the Nintex Hawkeye feature and click Activate.


    Lastly, activate the "Nintex Hawkeye custom workflow actions". This needs to be done on the relevant web apps. Reference this link for instructions.


And that's it. In your Nintex Hawkeye tenant, if you take a look at the "Activity Log" in the top menu, you will see the status of the data import.


You are now ready to start creating your Lenses. But we will leave that for another post.


If you haven't already signed up for the webinar on Wednesday, June 14, 10:00-10:45 am PST, which will go into a little more detail, head to Become a Nintex Ninja.


Until next time