Tech Blog

One of the challenges shared with me recently was to look at how Microsoft Flow could be integrated with Nintex Workflow Cloud, the reason being that Nintex Workflow Cloud provides enterprise-level workflow capability and allows extending that capability with custom workflow connectors via OpenAPI/Swagger definitions.

 

In this article I am going to share how we can integrate Nintex Workflow Cloud with a Microsoft Flow, leveraging Nintex Workflow Cloud's Xtensions framework to include a Google Calendar connector for syncing a Microsoft Outlook calendar with Google Calendar.

 

Calling Nintex Workflow Cloud from Microsoft Flow

1. For the purpose of calling Nintex Workflow Cloud from Microsoft Flow, I have created a Nintex Workflow Cloud workflow with an external start event as shown in the diagram below. I have also included the parameters I want to bring over from the Outlook event to sync with the Google Calendar event (i.e. event title, location, ID, start date-time, and end date-time in my example).

 

2. Once the workflow is published, it gives us the details on how the workflow can be triggered from external systems. What we need from this published workflow is the URL shown in the diagram below:

 

3. I have created a blank Microsoft Flow with only two steps added in my example. The first step is the Outlook trigger "When a new event is created (V1)"; the second step is the "HTTP + Swagger" action as shown below.

 

4. Paste the URL we obtained from the published Nintex Workflow Cloud workflow in step 2 above into the "SWAGGER ENDPOINT URL" field as shown below:

 

5. The "HTTP + Swagger" action will be refreshed with the required parameters as we have defined them in Nintex Workflow Cloud. We can now supply the values to pass from the Outlook Calendar event to Nintex Workflow Cloud as shown in the diagram below.
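Under the hood, the external start event simply accepts an HTTP POST with a JSON body whose keys match the start variables defined in the workflow. As a rough sketch (the property names below are hypothetical; the real names come from the published workflow's Swagger definition), the mapping Flow performs for us looks something like this:

```javascript
// Build the JSON body for the Nintex Workflow Cloud external start call.
// NOTE: the property names below are made up for illustration; use the
// parameter names generated in your own workflow's Swagger definition.
function buildStartPayload(outlookEvent) {
  return {
    eventTitle: outlookEvent.subject,
    location: outlookEvent.location,
    eventId: outlookEvent.id,
    startDateTime: outlookEvent.start,
    endDateTime: outlookEvent.end
  };
}

// Example usage with a sample Outlook event:
var payload = buildStartPayload({
  subject: "Project sync",
  location: "Room 12",
  id: "AAMkAD-sample-id",
  start: "2017-10-02T09:00:00Z",
  end: "2017-10-02T10:00:00Z"
});
```

The "HTTP + Swagger" action builds this body for us from the fields we map in the designer; the sketch is only to show what travels over the wire.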

 

Extend Nintex Workflow Cloud with Google Calendar connectors

At the time this article was written, Nintex Workflow Cloud did not provide Google Calendar connectors out of the box. However, Nintex Workflow Cloud Xtensions provide the flexibility to include any connectors we need, as long as they comply with the OpenAPI/Swagger standard.

 

Here are the steps I followed to include connectors for the Google Calendar APIs.

 

1. Identify the Google Calendar APIs.

Google provides rich APIs for its applications and services, including the Google Calendar API. The Google Calendar API reference provides all the details we need for this purpose, such as the endpoint URL, the HTTP request method, and the parameters for the call.
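For example, inserting an event is an HTTP POST to /calendar/v3/calendars/{calendarId}/events with a JSON event body. A minimal sketch of such a body (field names per the Google Calendar API v3 reference; the sample values are my own):

```javascript
// Minimal event body for the Google Calendar events.insert call.
// Field names (summary, location, start.dateTime, start.timeZone)
// follow the Google Calendar API v3 reference.
function buildGoogleEvent(title, location, startIso, endIso, timeZone) {
  return {
    summary: title,
    location: location,
    start: { dateTime: startIso, timeZone: timeZone },
    end: { dateTime: endIso, timeZone: timeZone }
  };
}

var event = buildGoogleEvent(
  "Project sync", "Room 12",
  "2017-10-02T09:00:00+08:00", "2017-10-02T10:00:00+08:00",
  "Asia/Kuala_Lumpur"
);
// POST https://www.googleapis.com/calendar/v3/calendars/{calendarId}/events
```

This is exactly the shape the Swagger file below describes, so the connector generated from it presents these fields in the workflow designer.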

 

 

2. The Swagger file we are going to create requires us to specify the API scope as well; this is also provided in the reference documentation. For the Google Calendar example, the scope we need is shown in the diagram below.

 

3. Prepare the Swagger file and save it with a .json extension for importing into Nintex Workflow Cloud Xtensions.

{
    "swagger": "2.0",
    "info": {
        "version": "1.0.0",
        "title": "Google Calendar API",
        "description": "Google Calendar API"
    },
    "host": "www.googleapis.com",
    "basePath": "/calendar/v3",
    "schemes": [
        "https"
    ],
    "produces": [
        "application/json"
    ],
    "paths": {
        "/calendars/{calendarId}/events": {
            "post": {
                "tags": [
                    "Insert new event"
                ],
                "summary": "Insert Event",
                "description": "Insert a new event",
                "operationId": "insert",
                "parameters": [
                    {
                        "in": "body",
                        "name": "body",
                        "schema": {
                            "$ref": "#/definitions/Event"
                        }
                    },
                    {
                        "name": "calendarId",
                        "type": "string",
                        "in": "path",
                        "description": "Google Calendar ID",
                        "required": true
                    }
                ],
                "responses": {
                    "200": {
                        "description": "OK",
                        "schema": {
                            "$ref": "#/definitions/Event"
                        }
                    }
                },
                "security": [
                    {
                        "Oauth2": [
                            "https://www.googleapis.com/auth/calendar"
                        ]
                    }
                ]
            }
        }
    },
    "definitions": {
        "Event": {
            "type": "object",
            "properties": {
                "start": {
                    "description": "The (inclusive) start time of the event. For a recurring event, this is the start time of the first instance.",
                    "type": "object",
                    "properties": {
                        "date": {
                            "type": "string",
                            "format": "date"
                        },
                        "datetime": {
                            "type": "string",
                            "format": "date-time"
                        },
                        "timezone": {
                            "type": "string"
                        }
                    }
                },
                "end": {
                    "description": "The (inclusive) end time of the event. For a recurring event, this is the end time of the first instance.",
                    "type": "object",
                    "properties": {
                        "date": {
                            "type": "string",
                            "format": "date"
                        },
                        "datetime": {
                            "type": "string",
                            "format": "date-time"
                        },
                        "timezone": {
                            "type": "string"
                        }
                    }
                },
                "location": {
                    "description": "location of event. Optional.",
                    "type": "string"
                },
                "summary": {
                    "description": "Event title",
                    "type": "string"
                },
                "description": {
                    "description": "Description of the event. Optional.",
                    "type": "string"
                }
            }
        }
    },
    "securityDefinitions": {
        "Oauth2": {
            "authorizationUrl": "https://accounts.google.com/o/oauth2/auth",
            "description": "Oauth 2.0 authentication",
            "flow": "implicit",
            "scopes": {
                "https://www.googleapis.com/auth/calendar": "Read and Write access to Calendars",
                "https://www.googleapis.com/auth/calendar.readonly": "Read access to Calendars"
            },
            "type": "oauth2"
        }
    }
}

 

4. Once we have the required Swagger file, we can add it to the Xtensions from the Xtensions page of the Nintex Workflow Cloud dashboard as shown in the diagram below.

 

5. As the Swagger file includes security definitions using OAuth, we will need to provide the required security details as shown in the diagram below. Take note that in our example here, we select "Google" as the Security value on this page. I have shared how to get the Client ID and Client Secret in the "Obtain OAuth 2.0 credentials from the Google API Console" section of my previous blog post, Using OAuth 2.0 to access other cloud services from NWC.

 

6. Once the required Security, Client ID, and Client Secret values have been entered, click the Next button to continue, where we will need to specify an icon, a name, and a description for the connector. I have used "Google Calendar" and "Google Calendar API" respectively for the Name and Description values in my example as shown below.

 

7. The new Xtension will be added as shown below.

 

8. We may now edit the Nintex Workflow Cloud workflow to include the newly added Google Calendar connector for adding an event to Google Calendar. Take note that we need to add a connection and grant Nintex Workflow Cloud access to the Google Calendar, as a connection must be specified in the connector actions.

 

With the same approach, we may include all the required Google Calendar API endpoints in Nintex Workflow Cloud.

A three-month Tentative Production Plan helps the procurement division plan which materials to acquire to support compressor production at the production site. There are two types of material movements to a production plant:

  1. Movement of material from one plant to the destination production plant/storage
  2. Delivery order directly from the supplier to the plant/storage

 

The Nintex Mobile application is used at the plant to support goods receipt at the point where materials are received. This minimizes the need for personnel at the plant to record received goods on paper and then return to an office desktop with the SAP console installed to update the goods receipts. Updates of goods movements are now done instantly at the point of receipt using Nintex Mobile, and the data is updated immediately in the SAP system, powered by Nintex Workflow with workflow connectors provided by Theobald Software.

 

For this purpose, a "Goods Receipt" form was created using Nintex Forms as shown in the diagram below. Nintex Forms features bar code and QR code scanning, eliminating the potential human error of typing in the long serial numbers of goods received. A Nintex form can easily be prepared to surface on different devices without much effort. In this article, as our focus is the integration of Nintex and SAP, we are going to keep the form explanation simple.

 

 

Once the data is captured and submitted to the Goods Receipt list, the associated workflow is triggered to process the data by posting it to a remote SAP system. This was done simply by using the ERPConnect Services connectors provided by Theobald Software. ERPConnect comes with a set of ready-to-use connectors as shown in the diagram below.

 

For this requirement, I have made use of the "Call SAP function" action; it provides full capability to integrate with SAP by calling any of the available SAP functions, plus the "Z" (custom) functions. One thing that made this easy is that Theobald Software has complete documentation and tutorials available online (i.e. OnlineHelp - Theobald Software GmbH) that helped me do what I needed for the integration project. The diagram below shows the "Call SAP Function" action configuration, followed by a table with the values I supplied to the action.

 

The table below shows the values I passed to the required parameters of the "Call SAP function" action. Take note that in my scenario I have fixed some of the values to simplify the demo. In an actual scenario, we would substitute values reflecting our own form/list design.

 

  Parameter                 Value
  GOODSMVT_CODE             GM_CODE = 05
  GOODSMVT_HEADER           PSTNG_DATE = fn-FormatDate(Current Date, yyyyMMdd)
                            DOC_DATE
  Tables: GOODSMVT_ITEM     MATERIAL = R-B209
                            PLANT = 1100
                            STGE_LOC = 0001
                            MOVE_TYPE = 501
                            ENTRY_QNT
  Output                    output.GOODSMVT_HEADRET.MAT_DOC -> MAT_DOC

 

This works great so far if we only update one material at a time through the "Call SAP Function" action. In our form, however, we have a repeating section where more than one material model can be entered at a time. Luckily, the action has taken this into consideration as well; this is where we need to use the "Additional XML table input" parameter of the table section in the "Call SAP Function" action. The Theobald Software help scenario provides a very good example of how this can be configured: OnlineHelp - Theobald Software GmbH.

 

Following the example provided in OnlineHelp - Theobald Software GmbH, I have added a Query XML action to wrap the XML table in the required format, as below.

 

The final XML content to be passed to the "Additional XML table input" should look similar to the below (i.e. substitute the variables or Item properties with what reflects your design). 

 <?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<TABLES><TABLE name="GOODSMVT_ITEM">
<GOODSMVT_ITEM>
     <MATERIAL>{WorkflowVariable:Material}</MATERIAL>
     <PLANT>{ItemProperty:Plant}</PLANT>
     <STGE_LOC>{ItemProperty:Storage_x0020_location}</STGE_LOC>
     <MOVE_TYPE>{ItemProperty:Movement_x0020_type}</MOVE_TYPE>
     <ENTRY_QNT>{WorkflowVariable:Qty}</ENTRY_QNT>
  </GOODSMVT_ITEM>
  <GOODSMVT_ITEM>
     <MATERIAL>{WorkflowVariable:Material}</MATERIAL>
     <PLANT>{ItemProperty:Plant}</PLANT>
     <STGE_LOC>{ItemProperty:Storage_x0020_location}</STGE_LOC>
     <MOVE_TYPE>{ItemProperty:Movement_x0020_type}</MOVE_TYPE>
     <ENTRY_QNT>{WorkflowVariable:Qty}</ENTRY_QNT>
  </GOODSMVT_ITEM>
</TABLE></TABLES>

 

Here is how the final "Call SAP function" action looks. The formatted XML content is set to the variable "XMLInputGoodsmvt_Item", which is assigned to the "Additional XML table input" field of the action.
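To see the shape the "Additional XML table input" expects, the table can be generated from the repeating-section rows. This is a sketch in plain JavaScript, purely for illustration; inside Nintex Workflow the same result is produced with the Query XML and string-building actions using item properties:

```javascript
// Build the GOODSMVT_ITEM XML table from an array of material rows.
// Illustration only: field names match the SAP structure used above;
// the input object shape (material, plant, ...) is my own.
function buildGoodsmvtXml(items) {
  var rows = items.map(function (it) {
    return "  <GOODSMVT_ITEM>\n" +
      "    <MATERIAL>" + it.material + "</MATERIAL>\n" +
      "    <PLANT>" + it.plant + "</PLANT>\n" +
      "    <STGE_LOC>" + it.storageLocation + "</STGE_LOC>\n" +
      "    <MOVE_TYPE>" + it.movementType + "</MOVE_TYPE>\n" +
      "    <ENTRY_QNT>" + it.qty + "</ENTRY_QNT>\n" +
      "  </GOODSMVT_ITEM>";
  }).join("\n");
  return '<?xml version="1.0" encoding="utf-8" standalone="yes" ?>\n' +
    '<TABLES><TABLE name="GOODSMVT_ITEM">\n' + rows + "\n</TABLE></TABLES>";
}
```

One GOODSMVT_ITEM element is emitted per row of the repeating section, which is exactly what the hand-written XML sample earlier shows for two rows.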

On October 1, 2018, Nintex will no longer support Microsoft Account authentication for Nintex Mobile.

Why are we making this change? 

In June 2016, Nintex released the Nintex Mobile Multiple Account Profile feature, which provides a means of aggregating forms and tasks from multiple sources without the need for Microsoft Account authentication. As a result, Nintex will end-of-life Microsoft Account authentication.

As part of this transition, Nintex is notifying existing partners and customers to help prepare for this upcoming change.  

Impacted customers are organizations that currently use Microsoft Account authentication with:

  • end users using Nintex Mobile application; or
  • existing Nintex partners and customers that have built and deployed Nintex App Studio mobile applications enabling Microsoft Account.

If your customer solutions do not use Microsoft Account in any of these scenarios and do not use Microsoft Account for Nintex Mobile account authentication, no further action is required.

How does this impact me?

This change impacts Nintex partners and customers that are currently using Microsoft Account to support the following scenarios:

 

  • Enterprises using the Microsoft Account technology to extend access to their on-premises SharePoint environment to Nintex Mobile;
  • Nintex Mobile users using a Microsoft Account as a means of account aggregation; or
  • Nintex partners and customers who have deployed and currently maintain Nintex App Studio applications that enable end users to authenticate using Microsoft Account.

How can I prepare for this change?

Ahead of the October 1, 2018 end of support for Microsoft Account, Nintex has created Knowledge Base articles that offer alternative approaches to using Microsoft Account:

Nintex is working to communicate these changes through a variety of channels.  These communications will target system administrators and designers identified as potentially using Microsoft Account. 

These planned changes include:

  • As of September 1st 2017, Nintex Mobile users looking to add a Microsoft Account are notified in app that they cannot authenticate using this method. This feature has been removed for new Microsoft Accounts.
  • In early 2018, a release of Nintex Forms for SharePoint 2010/2013/2016 will include messaging in Central Administration. In October 2018, Nintex will remove the Microsoft Account feature from Central Administration.
  • In calendar Q3 2018, Nintex is planning to apply in-app messaging targeting existing Nintex Mobile users. System Administrators should direct their end users to take the appropriate action detailed in the Knowledge Base article.
  • Also in calendar Q3 2018, Nintex is planning to apply in-app messaging targeting existing Nintex App Studio users. System Administrators should direct their end users to take the appropriate action detailed in the Knowledge Base article.

Please email Nintex Support or visit https://support.nintex.com/ with any questions. 

Greetings everyone  

 

You might remember my previous post where I illustrated how you can dynamically build a list of approvers and assign tasks to them in a loop via a Nintex® Workflow. In this post I've included an example.

 

 

Author: Palesa Sikwane

Long Description: Want to create a list of approvers dynamically using Nintex Workflow? This is the post for you!
Dependencies

Nintex® Workflow for Office 365®, SharePoint Online® (Office 365®)

Support Info

Palesa Sikwane

Additional Information

N/A

 

My previous post Assigning Tasks Dynamically Using Start a Task Process in NWF0365 can be summarised as: 

 

1. Querying a List of Approvers

 

2. Assigning Tasks for Each Brand Dynamically

 

Step 1: Querying the list of approvers

I have configured a list containing my approval matrix per brand name (I went with cereal brands as an example):

 

  Brand Name    Approver Name             Deputy Name
  Weetbix       Palesa Emmanuel Sikwane   Richard Roe
  Kellogs       Richard Roe               Palesa Emmanuel Sikwane
  Jungle Oats   Jane Doe                  Jane Doe

 

 

This is the list I query in the workflow using the Query List action, coupled with a few other actions. So I've built a workflow containing a State Machine with two states:

 

1. Initial; and

2. Assign Tasks.

 

In the Initial State I perform the following steps:

 

Action | Description

Here I set my workflow status column to : 
"Workflow Started Successfully"

Here I query the Approval Matrix list using the following configuration:

 

 

 

 

 

Note:

Here I simply query all items in my Approval Matrix list (returning 100 items) and output the following data into collection variables:

 

- ID of all my brands (into collBrandID)

- Brand Names (into collBrandName)

 

I also output the number of results returned into a variable called numBrands

Here I log to my history list how many Brands are returned in numBrands
Here I used a Set Next State action to jump to the next state of my workflow called Assign Tasks.

 

Step 2: Assigning Tasks for each Brand Dynamically:

 

This is the second and final state of my workflow and it has the following configuration:

 

Action | Description

Here I set my workflow status column to:

" Assigning Tasks "

 

 

 

 

 

Action | Description

The next step is a For Each, which allows us to iterate through the information returned by the Query List step in our Initial State.

 

This step is configured as follows:

 

Note:

 

- Here I iterate through my collBrandID collection, which stores all of the IDs of my brands in the Approval Matrix list.

- I then set my Output value to a variable called BrandID to store the current brand I am iterating through in the loop. 

- I keep track of each Brand ID in a variable called BrandIndex (which automatically changes when the For Each iterates to the next BrandID)

 

Just a couple of notes on collections: 

- Collections ALWAYS have an index beginning at 0

- Collections store multiple values in a single collection variable. When I think of collections, I think of data being stored in a block. For example, a collection containing 3 brands would look like:

 

 

And if we put the index next to each brand name (applying the rule that indexes begin at position 0):

 

Weetbix is at position 0

Kellogs is at position 1
Jungle Oats is at position 2

So we can at any time pull out a specific value (using a Get Item from Collection action), or iterate through all the data using a For Each action, which automatically increments the index for us as it reads each value out of the collection.
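In plain code terms, a collection behaves like a zero-indexed array. A small sketch of the two access patterns described above:

```javascript
// A collection of brand names behaves like a zero-indexed array.
var collBrandName = ["Weetbix", "Kellogs", "Jungle Oats"];

// "Get Item from Collection": pull one specific value by index.
var second = collBrandName[1]; // "Kellogs"

// "For Each": visit every value; the index increments automatically.
var visited = [];
for (var i = 0; i < collBrandName.length; i++) {
  visited.push(i + ":" + collBrandName[i]);
}
// visited is ["0:Weetbix", "1:Kellogs", "2:Jungle Oats"]
```

The For Each action does the index bookkeeping for you, which is why the BrandIndex variable in the workflow updates on its own each iteration.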

 

Note:

 

- In this step I get the ID of each brand because it is unique, meaning you'll never get two items with the same ID in the SAME list.

 

- Collections store multiple pieces of information, usually separated by semicolons or, in some cases, commas (i.e. ";" or ",") and wrapped in square brackets (i.e. "[" and "]"). This makes it easier for Nintex Workflow to separate each piece of data when we iterate through it or pull out a specific value.

In this step, based on the index, I read each item out of my collection and get the corresponding brand name. Remember, in my For Each I get the Brand IDs and iterate through those.

 

Note:

This runs in the loop and will repeat x number of times, where x is the number of items we find.

In this step for record purposes, I log the current Brand Name and the Brand ID that the workflow has picked up

Here I query my Approval Matrix list using the current Brand ID, and return in collection variables:

 

- Approver Name for the specific Brand

- The name of the Deputy for that specific Brand

 

Note:

We're using the Brand ID as a filter; because it is unique, it ensures that for both the Approver Name and the Deputy (although both are stored in collection variables) there will be exactly one value each, and because we expect one value, that value will always have an index of 0.

I then log the following to my workflow history (for record purposes):

 

- Brand Name

- Approver

- Deputy

1. I use a Parallel Block to run the next set of steps at the same time. In this example this is perfect, as the SAME steps need to occur for both the Approver and the Deputy picked up when we query our Approval Matrix.

 

Note:

Parallel blocks are great for structuring your workflow and for ensuring that things happen at the same time. In my example I used one for that purpose, but also to make the workflow easier to explain.

 

 

2. I then (for both the Approver and the Deputy) take the collection variable(s) returned and set them to their corresponding text variables (I use this data later on in my regular expressions below).

 

3. In this first regular expression I remove the square brackets that wrap the collection text for both the Approver Name and the Deputy, replacing them with blank text. I use the following configuration:

 

 

 

 

4. In this second regular expression I take the result returned above and replace the commas with semicolons. I do this because when you specify multiple email addresses, they are usually separated by semicolons; and, if you remember, in this post I want to assign tasks to multiple people dynamically. To do this I use my loops to determine who should get the tasks and build the list of task participants. I use the following configuration:
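The two regular expression steps can be sketched as plain string replacements: strip the collection's square brackets, then swap commas for semicolons so the result is a valid multi-recipient assignee string (the function and variable names here are illustrative, not Nintex syntax):

```javascript
// A collection rendered as text looks like
// "[user1@contoso.com, user2@contoso.com]".
function toAssigneeString(collectionText) {
  // Regex 1: remove the square brackets that wrap the collection.
  var noBrackets = collectionText.replace(/[\[\]]/g, "");
  // Regex 2: replace commas (and trailing spaces) with semicolons,
  // the separator expected for multiple email recipients.
  return noBrackets.replace(/,\s*/g, ";");
}

var assignees = toAssigneeString("[user1@contoso.com, user2@contoso.com]");
// "user1@contoso.com;user2@contoso.com"
```

The same two patterns can be pasted into the Regular Expression actions' "Pattern" fields, with the replacement text being blank for the first and a semicolon for the second.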

 

The last thing I do in the loop is use a Task Process to assign my tasks dynamically to the email addresses I've determined in the previous step. The action is configured as follows:

 

I set the Assign options as follows:

 

 

Note:

 

As you may remember from my previous post, the Start a Task Process workflow action is aimed at assigning a task to a group of users, and allows workflow designers to specify task assignment criteria (assign tasks all at once or in serial) as well as completion criteria (wait for all responses, wait for first response, wait for a specific response, or wait for a percentage of responses).

 

This functionality allows us to control how we would like to Assign the tasks to Approver and the Deputy as required, i.e.

 

- In series (one after the other); this option works really well if you assign tasks to users in a SharePoint group, as discussed in this post here;

- In parallel (all at once) 

 

The advantage of this is that it allows you to be dynamic. So if you have a complex process that requires you to cater for both scenarios without making changes to the workflow, this can be utilised really well.

 

Also, now that we have the scheduled workflow feature available on Office 365, Brad Orluk has a good blog post on this new feature titled Tee Up Your Work - Scheduled Workflows Are Available for Office 365!; be sure to check it out.

 

 

In any case, I'll probably cover this in a follow-up post to show you how you can use this to your advantage.

 

 


I hope this helps someone out there

 

Cheers

I am going to demonstrate a Production Planning process powered by Nintex Workflow in this blog post. Before we get further into the production planning process, let us recap what we went through in Part 1 of the GKK Compressor Industry series (i.e. the RFQ to Quotation process). In the Sales and Marketing division, the Product Catalog is used to create RFQs, which in turn generate quotations issued to customers. Every quotation issued to a customer updates the Sales Forecast with the increased number of compressors of each model to be delivered.

 

As a lean manufacturer, production planning is crucial to GKK Compressor Industry; the critical success factor is to produce only what is needed and deliver it on time. It keeps no unnecessary inventory, minimizing the waste of inventory space. The Tentative Production Plan is always three months ahead of the current month, helping Procurement keep material management efficient by knowing which materials, and in what quantities, to order with advance knowledge of production requirements.

 

Key techniques in the production planning to be shared:

  1. Nintex Form Web Part embedded to the "Draft Production Plan" list/page
  2. "Loop" action alternative for immediate execution
  3. Formatting a monthly calendar with CSR

 

The Production Plan

The Production Plan is the plan used in the production division detailing which compressor models, and in what quantities, are to be produced in a particular month. A production line has a daily production capacity that depends on machine and human resource capacity. If a production line can produce up to 60 compressors per calendar day at full capacity running three work shifts, two work shifts give a capacity of 40 pieces.

 

To get the actual Production Plan, we draft a production plan from the monthly sales forecast, detailing which models and quantities are to be delivered. The diagram below demonstrates how it is created, powered by Nintex Workflow. The sample plan shows how the quantity of each model to be produced is spread over different days with a daily capacity of 30 pieces per day for October 2017.

To produce a total of 71 compressors of the first model (i.e. SVT125-35), the work has to be spread over three days, with the third day producing the remaining 11 pieces. Since only 11 pieces of that model are produced on the third day, the remaining capacity of 19 is used to produce the next model in line (i.e. SHT277NA in the example).
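The spreading logic in the sample plan can be sketched as follows (a simplified model under my own assumptions: a single fixed daily capacity and no weekend handling, which the real workflow adds):

```javascript
// Spread each model's quantity across calendar days with a fixed daily
// capacity, carrying leftover capacity of a day into the next model.
function draftPlan(models, dailyCapacity) {
  var plan = [];                 // one entry per model: { model, perDay }
  var day = 1;
  var freeToday = dailyCapacity; // capacity left on the current day
  models.forEach(function (m) {
    var remaining = m.qty;
    var perDay = {};             // day number -> quantity produced that day
    while (remaining > 0) {
      var produced = Math.min(remaining, freeToday);
      perDay[day] = produced;
      remaining -= produced;
      freeToday -= produced;
      if (freeToday === 0) { day++; freeToday = dailyCapacity; }
    }
    plan.push({ model: m.model, perDay: perDay });
  });
  return plan;
}

var plan = draftPlan(
  [{ model: "SVT125-35", qty: 71 }, { model: "SHT277NA", qty: 30 }], 30);
// SVT125-35 -> day 1: 30, day 2: 30, day 3: 11
// SHT277NA  -> day 3: 19, day 4: 11
```

This reproduces the behaviour described above: 71 pieces spread as 30 + 30 + 11, with the third day's leftover capacity of 19 going to the next model.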

 

1. Nintex Form Web Part embedded to the "Draft Production Plan" list/page

I find the Nintex Form web part very useful, especially when I need to make a SharePoint page interactive, but it isn't discussed much. The Draft Production Plan is just one of the solutions where I use the Nintex Form web part to let users specify Production Plan parameters (i.e. month, year, capacity, and whether or not to include weekends as working days for the plan being drafted). To do this, I simply embed the "Create Production Plan" workflow's start form in the "Draft Production Plan" view/page as shown below. Once the form is submitted (i.e. with the Start button in the example), the "Create Production Plan" site workflow is triggered, and the Draft Production Plan list is refreshed with the production plan drafted by the site workflow.

 

2. "Loop" action alternative for immediate execution

The algorithm I used for drafting a production plan is summarized below. I then realized it took hours for the workflow to complete a production plan, the reason being that the "Loop" action is executed every 5 minutes by default, as it is driven by the SharePoint Timer Job. Even if I configured the Timer Job to run every 1 minute (i.e. the minimum interval allowed by SharePoint), it still took a long time for the workflow to complete.

  Prepare an empty "Draft Production Plan" view (delete all the existing list items)
  Query the "Sales Forecast" for models to be produced for the specified month
  For-each models in the list
     Create a list item for the model to be produced
     Split the total quantity for the model into the number of days required
     Loop through each calendar day of the production month
            Update the daily quantity for the model to the list item (i.e. exclude/include week end)
     End Loop
  End For-each

We have no choice but to use the "For Each" iteration instead, which executes immediately. (Even with a State Machine, we would still be bound by the Timer Job interval constraint.) To do that, we need to calculate the number of loops required, and create a collection of that size for the For Each iteration to walk.

So, if the quantity to be produced is 71, then 71 divided by a daily capacity of 30 equals 2.3667. We take the whole number 2, leaving behind the fraction, and use it as the highest collection index, which gives three iterations (i.e. the index starts at 0). This holds if we always start a model on a fresh day with the full daily capacity (e.g. 30), but we must also consider the case where the day's capacity is partly used. In the sample production plan above, the quantity of 22 for model SNT207V takes 2 days to produce, because the capacity remaining on the day its production started was 1 (i.e. 30 - 29 = 1), as 29 had already been used to produce the previous model. To get the right iteration count, we therefore add the quantity already produced that day for the previous model before dividing by the capacity; the two calculation actions shown below give the right formula for the required number of iterations.
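The calculation above can be expressed as an equivalent ceiling formula; this is a sketch of the arithmetic the two calculation actions perform, where carry is the quantity the previous model already consumed on the start day:

```javascript
// Number of For Each iterations (production days) needed for one model.
// carry = units already produced on the start day by the previous model
// (0 when the model starts on a fresh day).
function requiredDays(qty, carry, dailyCapacity) {
  return Math.ceil((carry + qty) / dailyCapacity);
}
```

For example, requiredDays(71, 0, 30) gives the 3 days used for SVT125-35 above, and requiredDays(22, 29, 30) gives the 2 days needed for SNT207V, whose start day had only 1 piece of capacity left.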

 

3. Formatting a monthly calendar with CSR

From the Draft Production Plan example above, I have customized the "Draft Production Plan" custom list to have a calendar look and feel, by coloring the columns representing weekends grey. Client-Side Rendering (i.e. CSR), a SharePoint feature, is used to build the calendar view.

SP.SOD.executeFunc("clienttemplates.js", "SPClientTemplates", function () {
     SPClientTemplates.TemplateManager.RegisterTemplateOverrides({
          OnPostRender: function (ctx) {
               var rows = ctx.ListData.Row;
               // Month is expected to be 1-based (e.g. 10 = October);
               // new Date(year, month, 0) then gives the last day of that month.
               var month = rows[0]["Month"];
               var year = rows[0]["Year"];
               var weekends = [];
               var date = new Date(year, month, 0);
               var lastDay = date.getDate();
               // Collect the day numbers that fall on a Saturday or Sunday.
               for (var d = 1; d <= lastDay; d++) {
                    date.setDate(d);
                    if (date.getDay() == 0 || date.getDay() == 6) {
                         weekends.push(d);
                    }
               }
               // Grey out the weekend cells of every rendered row.
               // The +2 offset skips the non-day columns at the start of the row.
               for (var i = 0; i < rows.length; i++) {
                    var rowElementId = GenerateIIDForListItem(ctx, rows[i]);
                    var tr = document.getElementById(rowElementId);
                    for (var j = 0, len = weekends.length; j < len; j++) {
                         var td = tr.cells[weekends[j] + 2];
                         td.style.backgroundColor = "#eeeeee";
                    }
               }
          }
     });
});

I have included the script as a JSLink as shown below.

 

Author: Palesa Sikwane
Long Description

Do you need to scan through a SharePoint Sites contents and apply some logic based on the existence of a content type? Or maybe even go as far as scheduling this? Then this post is for you!

Dependencies
  • Nintex® Workflow 2010
  • Nintex® Workflow 2013
  • or Nintex® Workflow 2016

 

**And the applicable SharePoint® version for the Nintex® Workflow versions above. 

Support Info

Palesa Sikwane

Additional Information

N/A

Product used

Nintex® Workflow 2016 | Version: 4.2.0.10 - International

 

Greetings from a warm and sunny Johannesburg, South Africa! It's finally springtime on this side of the world!

 

First things first, I would like to thank Michelle Barker for coming up with this question and requirement!

 

 

As you guys would know, reusable workflows have the following properties:

  • Are associated with SharePoint® Content Types
  • Because they are linked to a content type, they can be made available wherever those content types are used in a site collection


Some of the advantages of using a reusable workflow are:

  • Having one central place to make changes, publish a workflow, and push it down to every list or document library
  • Making use of site templates that have content types with workflows linked to them

 

Some of the disadvantages of reusable workflow:

  • Inability to schedule a reusable workflow out of the box
  • Cannot be tied to a site workflow, meaning you cannot really schedule it using a site workflow

 

What if you were then faced with a requirement to run a scheduled workflow based on a content type that could span different lists and libraries? Maybe you have a requirement to send out reminders at a certain frequency for certain document types?

 

How would you go about it?

Well there is a way that involves:

 

1. Finding all of your lists on your site

2. Getting all linked content types for each list

3. Per content type, Switch and apply your logic.
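In plain code, the three steps above boil down to the following loop. This is a minimal sketch with stubbed data of my own invention; in the actual workflow the lists and content types come from the SharePoint web services, not these hypothetical objects:

```javascript
// Hypothetical stand-ins for what GetListCollection / GetListContentTypes
// return: each list name mapped to its linked content types.
var listContentTypes = {
    "Documents":  ["Document", "Contract"],
    "Site Pages": ["Wiki Page"]
};

// Steps 1-3: loop through every list, then through each list's content
// types, and "switch" on the content type name to apply per-type logic.
function scanSite(lists, handlers) {
    var actionsTaken = [];
    Object.keys(lists).forEach(function (listName) {
        lists[listName].forEach(function (contentType) {
            var handler = handlers[contentType];
            if (handler) {
                actionsTaken.push(listName + ": " + handler(contentType));
            }
        });
    });
    return actionsTaken;
}

var result = scanSite(listContentTypes, {
    "Contract": function () { return "send renewal reminder"; }
});
console.log(result); // [ 'Documents: send renewal reminder' ]
```

The Switch action in the workflow plays the role of the `handlers` lookup here: one branch per content type name you care about.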

 

I built this using the following site workflow:

 

Step # | Workflow Action | Description
1.

Call web service action (Get List Collection - Internal Names)

  • This step involves calling the GetListCollection SharePoint web service:
    • This allows us to get a collection of all the lists and libraries in our current site. The collection will store the InternalName of our lists, or what some of you might refer to as the GUID.
  • I've configured this workflow action as follows:

    Configuring Get List Collection Webservice to get the InternalName

 

Note: I've configured this to return the InternalName of my lists (the GUIDs) in a collection variable called ListCollection. You will see how and where we use this later...

2. Collection Operation

Here I simply count how many lists are in my collection and store this in a variable called No Lists:

 

 


Note: I also log my result within the same action.

3. For Each

This is my first For Each, where I basically loop through the ListCollection I retrieved in Step 1 and store the InternalName, or the GUID, of each list in a variable called ListGUID. The configuration is as follows:

 

3.1

Log in history list

Here I basically log the current list's index in my collection (ListCollection) as well as the ListGUID:

3.2

Call web service (Get List Content Types)

Next, I call another SharePoint Web Service called GetListContentTypes based on the current ListGUID that my For Each in Step 3 is looking at.

 

I then store the results in a collection variable called List Content Types. This step is configured as follows:

 

Note: The results returned here are in XML, and one of the nodes in the XML contains the Name of the content type, which is what we want.

3.3

Query XML

Seeing that Step 3.2 returns XML, I simply query that XML in this step to get the display name of my content types and store these in a collection variable called Content Type Names. I've configured this step as follows:
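For anyone curious what that Query XML step is doing under the hood, here is a rough JavaScript equivalent. The sample XML below only illustrates the shape of the elements; the real GetListContentTypes response wraps them in a SOAP result envelope:

```javascript
// Rough equivalent of the Query XML step: pull the Name attribute
// out of each <ContentType> element in the web service response.
function extractContentTypeNames(xml) {
    var names = [];
    var re = /<ContentType\b[^>]*\bName="([^"]*)"/g;
    var match;
    while ((match = re.exec(xml)) !== null) {
        names.push(match[1]);
    }
    return names;
}

// Illustrative sample of the element shape returned by GetListContentTypes.
var sample =
    '<ContentTypes>' +
    '<ContentType ID="0x0101" Name="Document" />' +
    '<ContentType ID="0x0120" Name="Folder" />' +
    '</ContentTypes>';

console.log(extractContentTypeNames(sample)); // [ 'Document', 'Folder' ]
```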

4

For each

This is my second For Each, which allows me to iterate through each content type's name stored in my collection Content Type Names:

4.1

Log in history list

Here I simply log the name of my content type:

4.2

Switch (per content type)

Note that I haven't put this in my workflow, but this is where the magic happens for you guys. 

 

So, because we've found the content types for every list and library across our site with the previous steps, we can use a conditional action such as a Switch to apply whatever workflow logic is required per content type!

 

Running this workflow will give you the following result in your workflow history:

So as you can see my Workflow scans the current site, and for each list and library returns the associated content types. 


Feel free to download this and play around with it, and do what you will with it   Maybe even make it a User Defined Action?  

 

Let us know how you guys used this!

 

I hope this helps someone out there! 


Cheers

 

Several community members came close, but didn't quite get this one, so I'm extending it through the month of October!

 

This month, instead of a special errand, we're going to reward you for what you do anyway. Think of it as a bonus.

For every fifth time your answer is marked correct in the month of September, you'll get a bonus 300 points on top of the usual points you get for having your answer marked correct.

Get ten answers marked correct, that's an extra 600 points!

There's only one catch: you have to tell me which ones you provided, by posting links to your correct answers below.

Anyone who gets five correct answers also gets this Mission Possible badge.

mission possible

 

Good luck!

 

 

The not-so-fine-print:  It doesn't matter when you posted an answer or when a question was asked. It just matters that the answer was marked correct (not by yourself) in the month of September, 2017.  So if a question was posed in August of 2015, and you provided an answer, but the person who asked it didn't mark your answer until Sept 5, 2017, then it counts toward your total this month. But if you ask a question, answer it and mark your own answer, it doesn't count. 

RFQ to Quotation

We all know we are here because we work on something related to Nintex Workflow or Forms, and the reason we use them is that they make our lives easier. Things have changed a lot these days: the trend is towards the cloud, and hybrid environments have become so common that most of us work on an on-premises platform and in the cloud at the same time. Regardless of which platform you use, you will find Nintex helps.

 

In Part 1 of the GKK Compressor Industry blog series, I am going to share exactly how I use my hybrid environment to save effort for my recent investment, GKK Compressor Industry. GKK Compressor Industry is a "Lean Manufacturing" company producing world-class compressors. Moving towards becoming a Six Sigma company, Simplifying Processes and Reducing Errors fall under its Lean Six Sigma project mission to turn it into a highly effective and efficient company.

 

RFQ (i.e. Request for Quotation) is one of the key Sales and Marketing processes involving its customers. The RFQ to Quotation figure shown above demonstrates how Nintex Forms is used to allow customers or internal sales to fill in an RFQ form. The outputs of RFQs are the Sales Forecast (i.e. used for Production Planning) and Quotations (i.e. issued to customers). The process is simplified at GKK Compressor Industry: Nintex Workflow automates the RFQ process by getting the unit price for the requested compressor models, auto-generates a Quotation in Excel format, and finally updates the Sales Forecast with the quoted compressor models. The RFQ process not only simplifies the process with fewer steps, it also eliminates potential human errors by auto-generating the required quotations.

 

The quotation generation is done by simply calling a Nintex Workflow Cloud workflow from the RFQ process running in its SharePoint environment. It's worth taking a trip to Nintex Workflow Cloud for quote generation, as we realized it supports OpenAPI (i.e. Swagger) via its Xtensions framework. We made use of Xtensions to include Microsoft Graph API connectors in Nintex Workflow Cloud, helping us create quotations based on our pre-designed Excel quotation template. As we only need a few functions to create the Excel quotation, we brought in only a few Excel-related endpoints of the Microsoft Graph API.

 

Microsoft Graph API - Excel

Based on the connectors defined and shown under the Microsoft Graph API - Excel action group, you will notice there is no connector to create or copy an Excel file. This is because I have made use of Nintex Workflow Cloud's default SharePoint connector to copy an Excel quotation template to a new quotation with the name I provided. The SharePoint "Copy a file" connector's configuration is shown below.

 SharePoint connector - Copy a file

Once the new file is created/copied from the pre-designed template, what I need is basically:

  1. "Add table rows" for quotation items
  2. "Delete table rows" for unwanted rows in the excel table
  3. "Update a nameditem" for its value
  4. And so on…
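As an illustration of what the "Add table rows" operation calls underneath, the Graph workbook API takes a POST against the table's `rows/add` endpoint. The drive-item ID and table name in this helper are placeholders, and the "beta" segment matches my observation (noted further down) that the "1.0" endpoint did not work for files stored in SharePoint:

```javascript
// Builds the Graph API request used to add rows to an Excel table.
// itemId and tableName are placeholders for your own file and table;
// "beta" reflects that only the beta endpoint worked for me with
// SharePoint-hosted files.
function buildAddRowsRequest(itemId, tableName, rows) {
    return {
        method: "POST",
        url: "https://graph.microsoft.com/beta/me/drive/items/" + itemId +
             "/workbook/tables/" + tableName + "/rows/add",
        // Graph expects the new rows as a 2D array under "values".
        body: { values: rows }
    };
}

var req = buildAddRowsRequest("ITEM-ID", "QuoteItems",
    [["SNT207V", 22, 1450.00]]);
console.log(req.url);
// https://graph.microsoft.com/beta/me/drive/items/ITEM-ID/workbook/tables/QuoteItems/rows/add
```

The Swagger file attached to this post describes the same operation, so the connector fills in the authentication and the HTTP call for you.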

 

I have attached my Swagger file for the Microsoft Graph API - Excel connectors. A few notes if you want to implement the Graph API for Excel using the Swagger file shared in this blog:

  • To enable the connector, you will need to create an Azure Active Directory application (here is my previous blog on how to create one: https://community.nintex.com/community/tech-blog/blog/2016/10/20/microsoft-graph-api-from-nwc)
  • Excel-related operations of the Microsoft Graph API seem to work only with the "beta" version (i.e. not the "1.0" version) for files residing in a SharePoint library (i.e. I am not sure why, or if this is correct, but I only managed to get it working with the "beta" version).
  • There are two ways to create the Azure Active Directory app: via the new Azure Portal, or via the old portal (i.e. I only got it working with an Active Directory app created in the old Azure Portal).

 

It's our birthday!

 

The community just passed three years of answering questions, finding answers and sharing knowledge.  And the monthly mission, as usual, gives the gift to you.

bday cake

It's a bit involved, and I'm asking a lot... but hopefully you'll spread the Connect birthday love.

Here's what we ask of you in a SINGLE REPLY BELOW. And DO NOT reply until you're ready to post EVERYTHING.

In your reply below post the following:

  1. A link to a UserVoice suggestion you've voted on
  2. Visit and click "actions" > follow on three sets of release notes
  3. Update your status
  4. Update your avatar (doesn't HAVE to be a real photo of you, but if it is - secret bonus points!)
  5. Tag (using "@" then their name) three people who've helped you in the community.
  6. And then post a snippet showing you on Twitter holding a cupcake, cake or treat of your choosing saying: "Happy birthday #NintexConnect! Three years of connection! #Nintex"

 

If you do those things and post the images to prove it below, you will win a present!

No. Not a real present. But you'll win this present:

And it comes with 200 very real points in the community to boost your street cred!

birthdaybadge

Thank you for three years of questions, answers and sharing! May we see many more!

Greetings!!

 

Finally back in South Africa from Namibia!

 

And I'm excited to bring you part 3 of my Time Sheet blog series. If you guys remember, in part 2 I didn't get to cover the Time Sheet form below:

 

 

And so in this part I've included a breakdown of how the Time Sheet form has been put together, as suggested by David Heineman.

 

I've done a roughly 29-minute-long video explaining the form in greater detail, and also shared some tips and tricks that should help you on your Nintex journey.

 

 

 

Hope you guys enjoy this 

 

Have a great week everyone!

 

Cheers

Author: Palesa Sikwane
Long Description

An update on Nintex® Workflow Microsoft® Dynamics Live Actions and Native CRM Actions

Affected Product(s)
  • Nintex® Workflow 2013
  • Nintex® Workflow 2016
  • Nintex® Workflow for Office 365 
Release Notes

Nintex Workflow 2013 - Release Notes 

Nintex Workflow 2016 - Release Notes 

Nintex Workflow for Office 365 - Release Notes 

Screenshots

Nintex® Workflow 2013

 

 

 

 

Nintex® Workflow 2016

 

 

Nintex® Workflow for Office 365 ®

 

 

Greetings everyone,

 

I'm pleased to announce an update on both the Microsoft® Dynamics CRM Connector as well as the Nintex® Live Microsoft® Dynamics CRM Actions  

 

So how do you get this new update?

 

On premise (2013/2016)
Download and install the latest setup files and follow the update guides below:

 

Updating Nintex - What you need to know

Product Update Process

 

Office 365

The update is deployed automatically to our Nintex® Workflow App

 

What's special about this new update?

 

Everything stays the same and works the same; the key difference is that you can now connect to CRM 2016 on-premises as well as CRM 2016 Online!

 

Enjoy!!   


Cheers

Just two months after we let the world know we passed 5 million views of correct answers, the number has jumped up over SIX million!

 

More than six million times, someone has viewed an answered question, so the community continues to speed ahead!

 

That means they may have avoided starting a support ticket. Or they learned something that saved time. Or they passed it on to a colleague and saved that person some time or spared them some frustration.  That's what community is all about!

 

You can see in this chart the increase in views over the last third of the community's existence.

 

6million views of answered questions

 

That's due in large part to two things:  

1. When community members remember to click "mark correct" on replies to their questions.  Sure, they could just read replies that pop into their email. But taking the time to return and mark an answer correct is so helpful for other community members.  Read more on that here: Remember To "Mark Correct" 

 

2. The efforts of the Blue Ribbon Group members of this community. They VOLUNTEER their time to help out the community by answering a lot of questions and by marking answers that they know are correct. It's a big help to everyone. Thank you, Blue Ribbon Group!

 

 

To learn more about how to get the most out of the community, including using search, finding your way around and earning points, check out the following posts:

 

Get Involved in the Nintex Community

Making the Most of Community Features

Quick Tip: Get the Most From Your Newsfeed

New Look To Connect!  More Bling For Your Blog!

One central function of a community is to provide useful information for others to resolve their own questions.  In Nintex Connect, a lot of people come seeking answers to questions they have about Nintex Products.  To be a helpful community member,  remember to click "mark correct" on replies to your questions when you get them.

 

What difference does it make? 

 

A lot.  Allow me to peel back the curtain a bit...

 

For starters, marking an answer correct helps that question show up higher in search results. That's right: a question with an accepted answer is treated with greater value by the community's search algorithm and shows up higher in search results.

Take a look:

search results

 

Not even Google gives you that value. Here's the same search on Google. Sure, it tells you there's community content, but it doesn't tell you which is the most valuable content!

search results w google only

 

Which is why I ALWAYS recommend that after you land in the community, you use the search function for your question and see what results pop up. Chances are someone else has asked or answered a similar question, and you could find the info you seek.

 

And then, as a visual cue, marking a correct answer changes the blue question icon to a green checkmark. That is a quick, easy way to see at a glance that there's a correct answer! See:

correct answer pic

 

So, even though it's convenient to read an email from the community, see a suggestion to try and then go about your business, please take a moment to click the link to the discussion and mark an answer correct if you find one.

 

Thanks!

For those of us who have tried, or are in the process of evaluating, integrating Nintex Workflow Cloud into our solutions, eventually we get to the question "Why should I integrate Nintex Workflow Cloud?". This is especially true if you code your solution from scratch on a platform such as .NET or Java, where writing code is the default way to support any business logic or process of the solution you are building. Here are my two cents on that question.

 

Before we get into the why, let us look at a few more examples. Take another solution platform for the same question: SharePoint, which is considered a COTS (i.e. commercial off-the-shelf) product for intranet/collaboration portals. We do not expect a lot of coding or customization; by default SharePoint supports document management with the content types of a Document Library, and supports creating custom records using a Custom List. When it comes to automating a document or record created in SharePoint, however, it is not going to be easy, especially when we expect it to stay COTS. Nintex Workflow is just the right fit to complement that weakness, making process automation possible on SharePoint.

 

So now the question is: when you create your solution from scratch on a platform such as .NET or Java, by default you code everything yourself, which gives you much more flexibility and power when it comes to coding. Similarly, if you are using Mendix for your solution, it has what it calls Microflows to support the logic behind defined objects, events, etc. in Mendix solutions. Why, then, would I consider integrating Nintex Workflow Cloud when the platform itself supports business logic?

 

Reasons to integrate Nintex Workflow Cloud to your solution(s)

Using Nintex Workflow Cloud to replace some of the building blocks of your .NET or Java solution helps you save the effort of reinventing the wheel. The building blocks of a custom-built solution usually consist of different modules to handle different function groups; for example, there will be modules such as:

  • Security/Login module
  • User Management
  • UI (Portal Pages, Forms, Views, etc.)
  • Report Management
  • Business Process/Workflows
  • Document Generation/Processing
  • Electronic Signature

e.g: Solution Architecture Diagram of a custom developed solution

 

1. Workflow Module

In the solution architecture of a custom-developed application, we often modularize it into different modules/design blocks. Among the benefits of modular design are that modules can be reused in another solution, and that some modules can be pieced together from ready-to-go solutions. The example solution architecture diagram above illustrates the need to include a workflow engine, which should be handled by a ready-to-go solution such as Nintex Workflow Cloud; this makes a lot more sense than having to develop all the workflow functionality and management from scratch.

  

2. Document Generation

The above scenario tells us that we will need to build all these modules; of course, if you have built one before you can reuse it, but if not you will need to spend tremendous effort writing one. Nintex Workflow Cloud comes with some niche and unique features, such as Document Generation. Automated document processes are common in today's businesses, yet the majority of solutions still require creating documents manually from, say, a Word template; these processes can be improved by the Document Generation feature of Nintex Workflow Cloud. If document creation is to be automated, you will find it challenging, as there are not many API options for doing so, and it becomes more challenging still if you need to keep the solution up to date with the never-ending evolution of document APIs. This is a good scenario, and a good opportunity, to pass the job to Nintex Workflow Cloud: the document generation can be handed off to it, and once it's done, it saves the generated document to a specified shared drive where the initiating program can pick it up.

 

3. System Integration

One of the strengths of Nintex Workflow Cloud is its capability and rich set of connectors and actions supporting integration with other systems. Furthermore, with its compliance with the Swagger standard, it is easy to extend it with connectors to systems that are not already covered by the default connectors. The capability to integrate with external systems is always a huge area in a solution, and it usually requires huge effort to build and manage; leveraging Nintex Workflow Cloud for this not only saves the effort of building it, but also provides flexibility when it comes to extending connectors to other systems.

 

Again, these are just my two cents on the question, with just a few examples. I believe there are many more reasons to explore, such as public web forms: if your solution is an intranet-based deployment, most of the time you are not going to expose it to the public-facing internet, so leveraging Nintex Workflow Cloud's anonymous web forms could be a quick answer to that requirement.

 

The concept of modular design, with Nintex Workflow Cloud integrated as the required workflow module, does not just save you the effort of creating one from scratch; it also helps when you hand the solution over to a customer once the development project is done, removing the hassle for your customers of maintaining or troubleshooting the workflow module. IT platforms are being patched and updated at a fast pace today, and patches and updates introduce a huge maintenance burden. Getting Nintex Workflow Cloud to handle some of this functionality minimizes the risk, and the need, of continuing to maintain the code.

In Part 1 we explored the new Nintex SharePoint Connectors for our Domestic Animal Registration example and successfully published a Public Web Form and added our responses to SharePoint Online which triggered an internal review and approval process. 

 

Part 2 is all about our backend process, to save our users even more time and effort. Remember those paper and emailed forms our Chris City Council workers received? Well, once they've approved them internally, they need to update a central CRM (Salesforce) by adding the record. They do this either by typing directly from the paper form or by copying and pasting from the emailed PDF. We also need to inform the member of the public who submitted the form and provide them with a Proof of Registration, a council-generated document that we also produce manually. So let's get started.

 

I'm going to create a backend process workflow in Nintex Workflow Cloud with a Start event of External start. This allows me to initiate the workflow from my Office 365 internal process. It's a lot easier to connect directly to the number of SaaS applications I'm working with here than from within my Office 365 tenant (however, Nintex does offer Salesforce and Dynamics CRM connectors as part of the Office 365 product). I've also replicated the variables from part 1 so I can reuse the same info throughout the process.

 

I'm going to introduce a little smarts here in case I was processing this in batch using Nintex Workflow Cloud's Scheduled start (for demo purposes I want this to kick off as an entire process, but some organisations might want it batched). So I'll make sure that only those items in my list that have gone through the approval process are picked up for processing, using the SharePoint Online Query a list action; and because this may be more than one item, I'll need a Collection variable declared to iterate through.

 

The next group of actions are all related to whether an application has been marked as Approved. I need the process to:

  • Retrieve application details
  • Generate a unique certificate number
  • Generate a document and store in an electronic file share ( e.g. Box, Dropbox, OneDrive, Google Drive)
  • Create a Salesforce record
  • Contact the applicant via SMS and email the certificate
  • Update my SharePoint Online item to Completed

 

For each 'Approved' application, let's retrieve those details from SharePoint Online and generate a certificate number.
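The certificate-number scheme itself isn't shown here, and any unique, human-readable format will do. As one hypothetical sketch, you could combine a council prefix, the registration date, and a padded sequence number:

```javascript
// Hypothetical certificate-number format: PREFIX-YYYYMMDD-SEQ.
// The prefix and sequence source are placeholders of my own choosing.
function certificateNumber(prefix, date, sequence) {
    // Date portion as YYYYMMDD (UTC).
    var ymd = date.toISOString().slice(0, 10).replace(/-/g, "");
    // Zero-pad the sequence to four digits.
    var seq = String(sequence).padStart(4, "0");
    return prefix + "-" + ymd + "-" + seq;
}

console.log(certificateNumber("CCC", new Date("2017-09-05"), 42));
// CCC-20170905-0042
```

In the workflow itself this would be a couple of string-manipulation actions feeding a text variable, rather than code.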

 

Now we need to generate that certificate. It's my pleasure to introduce you to Nintex Workflow Cloud Document Generation: platform-agnostic cloud document generation, complete with Nintex Document Tagger capabilities. Essentially, the Nintex Document Tagger opens up all the workflow and item property references that I can merge into my cloud-stored document at execution time. Really clever stuff. I just copy the variable tag I want and paste it into the template. And the awesome news: Document Generation is now Generally Available!

 

OK, unique document generated, simples. Next we need to put this into our structured data repository, Salesforce CRM. It could equally be Dynamics CRM, as Nintex has connectors for that. I've created a custom object in Salesforce for my Domestic Animal registration records, so we can even find that via our Salesforce connectors.

 

We want to be able to attach that generated document to an email for our applicant's records, and we need to check back in with SharePoint Online to update the status of the application now that everything is 'Completed'.

 

 

Not long to go. I just need to let my applicant know that their animal has been registered with the council and provide them with the certificate of registration. With such a heavy reliance on mobile phones and instant notifications, what we'll do is SMS a note to say the animal has been registered, using Twilio, and we'll email the certificate to the applicant.

 

My very last stage is to publish this workflow, grab that External Start URL so I can initiate it from my Office 365 SharePoint Online Nintex Workflow and then we're done.

 

 

Just to recap: across parts 1 & 2 we've covered a considerable amount of Nintex Workflow Cloud functionality to fully digitize my Chris City Council external application process:

  • Public Anonymous Forms
  • Create an Item in Office 365 SharePoint Online
  • Initiate a Nintex Workflow Cloud workflow from Office 365 SharePoint Online
  • Referenced that Office 365 SharePoint Online list
  • Generated a document from a cloud stored template with my item data and sent that to our applicant
  • Created a Salesforce record for a custom object
  • Sent SMS and email confirmation
  • Updated an item in an Office 365 SharePoint Online list