
Nintex For Office 365


This quarter's Nintex Workflow Hero is IMS Electronics Recycling. Later today, at this month's Workflow Heroes live webinar, we will showcase their Nintex for O365 success story, hosted by yours truly. See how they were able to adapt to change and growing compliance requirements, saving their organization over 5,000 sheets of paper AND 1,256 hours PER YEAR!

 

Be sure to check out their case study right here | IMS Electronics Recycling Nintex Workflow Hero Story

 

Not able to make today's webinar? No worries! You will be able to access the full Nintex Workflow Hero series. Just visit the Nintex Workflow Heroes landing page for all webinars past and present, and while you are there, register for upcoming live webinars for Nintex Workflow Cloud and Nintex App Studio. Probably the only binge watching you will do all week that could turn you into a hero!

 

How to become your organization's workflow hero | Nintex Workflow Heroes Live Webinar Series

 

I recently had a requirement come up while implementing an On-Boarding Workflow. The requirement from my client was:

 

"we want to have multiple list item attachments for the new employee process but when the workflow starts I need to have different attachments emailed to different employees".

 

I started to think about this because there is no action within Office 365 to get list item attachments, although there is a web service we can call to get them. I wanted to document the steps I took in case others need to do something similar.

 

Step 1) Build a Dictionary to set the request headers.
Drag and drop the Build Dictionary action onto the canvas and add the following items. Below is what the action will look like when you are done.

  1. Key: Content-type
    • Type: Text
    • Value: application/json;odata=verbose
  2. Key: Accept
    • Type: Text
    • Value: application/json;odata=verbose
  3. Output: RequestHeader of type Dictionary

Step 2) Build the URL string
Use the Build String action to build your web service call URL and output it to a string variable (I used varWebServiceURL). The tokens in braces make this URL dynamic, passing in the right information at runtime.

 

REST web service URL that returns only the list item attachments:

{Workflow Context:Current site URL}/_api/web/lists/getbytitle('{Workflow Context:List Name}')/items({Current Item:ID})/AttachmentFiles

 

Break down:

  1. I use the current site URL token to make the workflow dynamic instead of hard-coding the URL to the site
  2. /_api/web/lists/getbytitle('LIST NAME')
    • gets to the list you want to pull attachments from
  3. /items(CURRENT ITEM:ID)
    • we want to make sure we are working with the right list item, so we tell the web service which item ID to use
  4. AttachmentFiles
    • this acts more or less like a filter, because I only want attachment-related properties returned from the web service call
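For readers who prefer to sanity-check the URL outside the designer, here is a small JavaScript sketch of the same pattern. The function name and inputs are illustrative, not part of the workflow; the workflow fills the placeholders from context tokens at runtime.

```javascript
// Illustrative helper that assembles the same REST URL the Build String
// action produces. siteUrl, listName and itemId stand in for the
// {Workflow Context} and {Current Item} tokens.
function buildAttachmentsUrl(siteUrl, listName, itemId) {
    return siteUrl.replace(/\/$/, "") +
        "/_api/web/lists/getbytitle('" + encodeURIComponent(listName) + "')" +
        "/items(" + itemId + ")/AttachmentFiles";
}

// Example:
// buildAttachmentsUrl("https://contoso.sharepoint.com/sites/hr", "New User Request", 7)
// → "https://contoso.sharepoint.com/sites/hr/_api/web/lists/getbytitle('New%20User%20Request')/items(7)/AttachmentFiles"
```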

In the end this is what your action should look like:

Step 3) Use the Call HTTP Web Service Action

  • Address: varWebServiceURL
  • Request Type: HTTP GET
  • Request Headers: use your request header variable (in my case it was called RequestHeader, of type Dictionary)
  • Request Content: Leave Blank
  • Response Content: Create a new Variable called ResponseContent of type Dictionary
  • Response Header: Create a new Variable called ResponseHeaders of type Dictionary
  • Response Status Code: Create a new Variable called ResponseCode of type Text

 

Configured Action

Step 4) Next we need to get the Response Content returned from the web service.

 

We will use the Get an Item from a Dictionary action to get our response from the web service. Drag and drop the action and double-click to configure it.

 

Dictionary: ResponseContent (var used from the action above)

 

Item Name or Path: because the response is returned as a JSON object, we need to get to the right part of it. We want the results path, so enter: d/results

 

Output: I created a new variable called tempDictionary to store my results

 

Configured Action

Step 5) We need to get a count of the items so we know how many times we need to loop through the results to get the data we need. Use the Count Items in a Dictionary action and double-click to configure.

  • Dictionary Variable: tempDictionary
  • Output: New variable ResponseCount of type Integer

Configured Action

 

Step 6) We also need to create a new variable called LoopCounter of type Integer and use the Set Variable action to set it to 0.

 

Step 7) Now we need to loop through the returned contents. Use the Loop n Times action and configure it with the ResponseCount variable you set a couple of steps ago, so the action knows how many times to loop.

 

Configured Action

Step 8) Now that we are looping, we can start to get the individual attachments from the response. We will use the Get an Item from a Dictionary action and configure it as follows.

  • Dictionary: ResponseContent
  • Item Name or Path: d/results({Variable:LoopCounter})/FileName (we use the loop counter as our index to get the file name, so I can build the URL later)
  • Output: varTitle (a new string variable to store the attachment title we will use later)

Remember to increment LoopCounter (LoopCounter = LoopCounter + 1) at the end of each loop iteration; otherwise every pass reads the same file name.
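To make the counting and indexing concrete, here is a sketch of Steps 5-8 in plain JavaScript against a mock response body. The response shape is assumed from the d/results paths used above; the file names are examples.

```javascript
// Sketch of Steps 5-8 against a mock response body. The shape
// (d.results[].FileName) is assumed from the d/results paths above.
var responseContent = {
    d: {
        results: [
            { FileName: "Resume.docx" },
            { FileName: "EmpAgreement.pdf" }
        ]
    }
};

var results = responseContent.d.results;   // Step 4: d/results
var responseCount = results.length;        // Step 5: ResponseCount
var fileNames = [];
for (var loopCounter = 0; loopCounter < responseCount; loopCounter++) {
    // Step 8: d/results({LoopCounter})/FileName
    fileNames.push(results[loopCounter].FileName);
}
// fileNames → ["Resume.docx", "EmpAgreement.pdf"]
```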

Configured Action

Step 9) Repeat this step as many times as you need. You might handle this next part differently depending on your desired outcome, but our new employees will always have the same documents uploaded, so I created a string variable for each one, for example varResumeURL, varEmpAgreementURL, and so forth. I did this because, as of this post, you cannot attach list items to an email.

(Disclaimer: I tried to use the External Email and attachment, but the attachment failed.)

Within my workflow I have a few Run If actions, and I configure them to check whether the contents of varTitle contain (ignoring case) key words from my file names, for example Resume, Agreement, etc.
Here is an example configuration of one of my Run If actions:

 

If I get a match for what I am looking for, I use the Set Workflow Variable action and build out my URL to the attachment. Each list item has a unique URL you can use to get to its attachments. I set my varResumeURL to the following URL within the action. The key here is that the URL follows the pattern Lists/ListName/Attachments/ID/FileName.

 

{Workflow Context:Current site URL}/Lists/NewUserRequest/Attachments/{Current Item:ID}/{Variable:varTitle}
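A quick sketch of that URL pattern as an illustrative helper; the site URL, list name, item ID, and file name come from the workflow context and varTitle at runtime:

```javascript
// Illustrative helper for the attachment URL pattern above.
function buildAttachmentLink(siteUrl, listName, itemId, fileName) {
    return siteUrl.replace(/\/$/, "") +
        "/Lists/" + listName + "/Attachments/" + itemId + "/" + fileName;
}

// buildAttachmentLink("https://contoso.sharepoint.com/sites/hr",
//                     "NewUserRequest", 12, "Resume.docx")
// → "https://contoso.sharepoint.com/sites/hr/Lists/NewUserRequest/Attachments/12/Resume.docx"
```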

The next step is to build out my Send an Email action. I drag and drop the Send an Email action onto the canvas, and then in the body of the email I create a section called Attachments and use the Hyperlink Builder to create a link to each document. I am not really attaching the document, which saves on email bloat; I am just linking to it.
Click Insert Link and configure as follows:

 

That's it. This would also make a really good utility workflow, because you could pass in the list name as a parameter.

Here is what the full workflow looks like. If you have questions, post them, as I am sure more than one person has the same question.

     

When designing a Nintex form, it's common that part of the form should only display to certain users. These sections are usually contained in their own panels within the form. It could be IT administrative details, financial information, or actions for HR to complete. In Nintex on-premises, it is easy to hide such a panel based on the current user by using a rule and the inline function fn-IsMemberOfGroup. Unfortunately, that function is not available yet in Nintex Forms for Office 365.
As it turns out, the function does work in O365 Forms; it just does not appear in the Runtime Function list. If you manually type it in, it will work. Here is a list of all supported inline functions.

 

To accomplish this task, I created a hidden text field called "txt_HiddenIsUserAnAdmin" in the form and assigned it the JavaScript variable named jsvar_HiddenIsUserAnAdmin. I then created a rule on the panel that would hide it whenever txt_HiddenIsUserAnAdmin does not equal "Yes". To obtain the groups that the user belongs to without using the inline function, JavaScript must be used. Luckily, the hard work of that development has already been accomplished by Swetha Sankaran in Part 2 of her post on checking if a user belongs to a group. That took me a long way but I had to do a little more to get it to do what I needed.

 

var pollSP;
NWF.FormFiller.Events.RegisterAfterReady(function () {
    pollSP = setInterval(checkSPLoad, 500);
});

function checkSPLoad() {
    // Wait until the SharePoint client context is available.
    if (typeof clientContext !== "undefined" && clientContext) {
        window.clearInterval(pollSP);
        onSPLoad();
    }
}

function onSPLoad() {
    var spGroups = clientContext.get_web().get_currentUser().get_groups();
    clientContext.load(spGroups);
    clientContext.executeQueryAsync(
        Function.createDelegate(this, function () { OnSuccess(spGroups); }),
        Function.createDelegate(this, OnFail)
    );
}

function OnSuccess(spGroups) {
    try {
        var groupsEnumerator = spGroups.getEnumerator();
        while (groupsEnumerator.moveNext()) {
            var currentGroup = groupsEnumerator.get_current();
            if (currentGroup.get_title() === "Team Site Owners") {
                // jsvar_HiddenIsUserAnAdmin is the JavaScript variable
                // assigned to the hidden text field in the form designer.
                var triggerEventNow = NWF$("#" + jsvar_HiddenIsUserAnAdmin);
                triggerEventNow.val("Yes");
                triggerEventNow.trigger("blur");
            }
        }
    } catch (err) { alert(err); }
}

function OnFail() {
    alert("Failed to load groups");
}

 

You'll notice above that I set the hidden text field (via jsvar_HiddenIsUserAnAdmin) to "Yes". Then I trigger the blur event, which causes the rule on the panel to execute, allowing the panel to be seen. Without the trigger, the field will update but the rule will not execute, leaving the panel hidden regardless of who is logged in.

 

Another thing to note is that this only works for SharePoint groups, not AD groups. Unfortunately, checking AD group membership is not supported using JavaScript here, but you could try querying AD group membership some other way as a workaround.

Question: 

 

I’m trying to make task due dates dependent on the day the task notice is sent, so that if a completed task approval leads next to an additional task being created for someone else, and that 2nd person has 3 days to complete it, I’m expecting to find a way to make the Due Date = (Task Creation Date) + 3. Is there a way to program that?

 

Answer:

 

In the Approval Branch for the initial task, add an “Add Time to Date” action and set it to 3 days with the date set at ‘Use date when action is executed.’ From there you can configure the task due date in the next Task to use the workflow variable you created in the previous action.
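The date arithmetic that "Add Time to Date" performs can be sketched in JavaScript like this (illustrative only; the action does this for you):

```javascript
// Due date = the date the action executes, plus a number of days.
function addDays(date, days) {
    var due = new Date(date.getTime());
    due.setDate(due.getDate() + days);  // rolls over months/years automatically
    return due;
}

// Example: a task created on 30 Jan 2017 with a 3-day window
// is due on 2 Feb 2017.
var dueDate = addDays(new Date(2017, 0, 30), 3);
```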

 


Good Day To all,

I have been testing the waters with Nintex for Office 365, and one of the things I was interested in solving is the workflow history log, for audit purposes. Here is a solution that could be helpful.

 

 

1. I will build the example using a simple list called "Approval" with the following columns: Title (single line of text), Approver (person or group), Status (a Choice column with the values Waiting for Approval, Approved, Rejected), and InstanceID (single line of text).

 

2. The workflow is pretty straightforward (I will be uploading the workflow here).

 

3. Once the list and the workflow are working accordingly, the next step is to add some jQuery script to pop up the workflow history log related to a specific item.

 

 

 

 

The script should be added using a Content Editor web part:

 

Find the script and workflow attached (remember to change the URL to point at your history log):

 

Thanks

Walter

With Nintex workflows, one of the challenges many developers have with LazyApproval on Office 365 is capturing the comments provided by approvers in the email back to the original submitter. A task email is sent to an approver, and that approver has the option to reply with only the approval/rejection keywords; any other comments in the email will be disregarded by the system. There is also no option to CC the original submitter on the initial task email, which means the only way for an approver to email comments back to the original submitter is by manually copying them into the approval/rejection email. This would not allow the comments to be captured along with the task. While comments cannot easily be captured if the approver replies by email, they can be captured if the approver responds through a customized Nintex task form. Simply put, the comments are captured in a workflow variable and added to the body of the email.

 

To accomplish this, start by creating your form and workflow. For this example, I am using the simple workflow below:

The first step is to create the text variable that will capture the comments from the custom Nintex task form. Click "Variables" from the workflow ribbon and then click "New" to create the text variable. I'm calling mine "TaskComments".

After that has been completed, go into your "Assign a task" action and click on the "Edit Task Form" button in the ribbon.

You will be presented with the standard Nintex form that you can customize. Next, create a panel at the bottom of the form and assign a rule to hide the panel at all times. The rule should have these settings:

Add a "Calculated Value" field to the panel. This field will assign the comment to the "TaskComments" variable we created earlier. Add the Named Control called "Comment" to the Formula and set the Connected to field to the variable "TaskComments". The settings should look like this:

Save everything. Your form should now look something like this:

Now all that is left is to add the variable to our email. Go into your "Send an Email" action on the "Rejected" branch and add the "TaskComments" variable to the body of the email. Of course, you can customize it further to include any of the other fields to make it more helpful to the user. It should look something like this:

Do the same with the Approved email and you're done. Publish the workflow and you're good to go!

This technique can be used to include other customizations that are more targeted to your organization. Hopefully this can help you address the needs for greater feedback collection in your approval workflows.

Question:

 

How do you keep Lazy Approval functional as we migrate to Microsoft Exchange on Office 365 while maintaining our on-premises servers?

 

We plan to manage a hybrid environment and our goal is to configure Microsoft Exchange for Office 365 with our SharePoint On-Prem servers.  If we were to move to Microsoft Exchange in Office 365 today, and decommission our Exchange servers, lazy approvals would fail to work.   

 

We would like to maintain continuity between Microsoft Exchange on Office 365 and SharePoint On-Premises.  How can we achieve this? And what are the steps we need to take to accomplish this? This is a high priority issue and your assistance on this will be greatly appreciated.

 

Answer: 

 

You will need to ensure that incoming email is functional for your SharePoint farm. In order to do so, you will need to configure a connector within O365 to route the messages appropriately to your on-premises environment.

 

The articles below should provide some more information on this.

 

Integrating SharePoint On Premises With BPOS and Exchange Online: Part 2 – Inbound

Configure mail flow using connectors in Office 365

This is open for everyone, and anyone to answer.

 

  1. I want to search for a specific item, and that should bring up all possible matching items as a dropdown list that a user can select from.

 

  2. Based on the selected item, the item name, customer name, and customer number should all automatically be displayed.

 

Answer for Questions 1 & 2

For questions one and two, you will need to use jQuery to modify the control to incorporate type-ahead functionality. From there you can use calculated value controls with the list lookup runtime function to populate the rest of the data. The links below provide some context around potential ways to incorporate this functionality.

 

https://community.nintex.com/thread/11807

https://community.nintex.com/thread/2546

 

 

  3. The same needs to work for the requestor's name. Based on the requestor's search selection, it should show their title, phone, etc.

 

Answer to Question 3

The userprofile runtime function (currently only available on-prem) can be used to populate the data via calculated value controls, in a similar way to the lookups described above. If you are on O365, this functionality is on the roadmap but has not been built at this stage.

 

  4. I would also like to enhance our Leave Request form. We update our IT department calendar frequently so we can keep the department up to date on who's working remotely, travelling, on PTO, sick, working a shift, attending conferences, etc.

 

Answer to Question 4

How to Create a Leave Request Workflow in 365

Task Escalation Comes to Office 365

Remember those days when you had an automated business process and it sat there waiting for someone to do something or approve a task, and they just wouldn't? Those days are over. Task escalation has made it to Nintex Workflow for Office 365.

How to configure, escalate, and auto-complete options in 365

 

Adding Vacation or Sick Time Hours

The answer depends on how you are capturing their hours. If they are creating a new item for each type of time, such as vacation time, sick time, etc., then you would need to query the list, grab all the items, and calculate the totals.

 

If they are creating one item and then inputting multiple hours, as one would with a time sheet, then a calculated column or two should handle this.

 

The goal is to know the structure of the data. If linear, meaning contained in one item, you can use a calculated column or a workflow running on that item to do the trick. If non-linear, meaning the data is spread across multiple calendar items and different columns, then a list/site workflow would need to run to do this.
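As a sketch of the non-linear case, here is how a site workflow (or any script) would total one type of time across several items. The item fields (type, hours) are illustrative, not a prescribed list schema:

```javascript
// Sketch of the non-linear case: one total per time type, where the
// hours live on several list items. The item fields are illustrative.
var entries = [
    { type: "Vacation", hours: 8 },
    { type: "Sick", hours: 4 },
    { type: "Vacation", hours: 6 }
];

function totalHoursByType(items, type) {
    return items
        .filter(function (it) { return it.type === type; })
        .reduce(function (sum, it) { return sum + it.hours; }, 0);
}

var vacationHours = totalHoursByType(entries, "Vacation"); // 14
```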

 

Booking a computer in a training room

The training will happen at one of our training centres, in two of the training rooms (Training Rooms 3 and 4); there are 16 computers per training room. There are also two sessions each day, with the possibility of adding a third. They want HR to be able to book people into these sessions per computer, but bookers should not be able to delete each other's bookings. They want to see this in a calendar view, and the solution was wanted later that afternoon (roughly 3 hours later).

 

All of the below are great resources for more information on building Nintex workflows and forms:

 

http://help.nintex.com

http://community.nintex.com

https://learning.nintex.com

If you have occasion to hide the Save button along the top of a Nintex Form in Office 365, below is some simple CSS that you can copy and paste into Form Settings -> Custom CSS to do so.

 

Before:

 

 

 

The CSS below hides the Save button (its label and icon):

#RibbonSaveButton {
    display: none;
}

 

After:

Retrieve User Profile Details on Nintex Form for Office 365

Unlike the Nintex on-premises version, the User Profile Lookup function isn't available yet in Office 365 Nintex forms (hopefully Nintex will add it in a future version). So here is a solution for fetching user profile properties using custom JavaScript code.

 

Please refer to my blog post, Retrieve User Profile Details on Nintex Form for Office 365 [Step-By-Step], to view the step-by-step process.

 

 

Common code (add this file to the Site Assets folder and name it "sp.userprofile.js"):

—————————————————————————————————————————————–

function SPUserProfile(strAccountName, OnComplete, OnError) {
    var currentObject = this;
    this.userprofile = null;

    this.getProperty = function (propertyName) {
        if (this.userprofile == null) {
            // Profile not initialized yet, or the fetch failed.
            return null;
        }
        var propertyValue = NWF$.grep(this.userprofile.d.UserProfileProperties.results, function (k) {
            return k.Key == propertyName;
        });
        if ((propertyValue == null) || (propertyValue.length == 0)) {
            return null;
        }
        return propertyValue;
    };

    this.getDisplayName = function () {
        return this.userprofile.d.DisplayName;
    };

    this.getDepartment = function () {
        var propVal = this.getProperty("Department");
        return (propVal == null) ? '' : propVal[0].Value;
    };

    this.getMobilePhone = function () {
        var propVal = this.getProperty("MobilePhone");
        return (propVal == null) ? '' : propVal[0].Value;
    };

    this.getTitle = function () {
        return this.userprofile.d.Title;
    };

    this.getEmail = function () {
        return this.userprofile.d.Email;
    };

    function execCrossDomainRequest() {
        // SP.RequestExecutor may not be loaded yet; poll until it is.
        if ((typeof SP == "undefined") || (SP.RequestExecutor == null)) {
            setTimeout(execCrossDomainRequest, 500);
        } else {
            var url = appweburl + "/_api/SP.UserProfiles.PeopleManager/GetPropertiesFor(@v)?@v='" +
                encodeURIComponent(strAccountName) + "'";
            var requestHeaders = {
                "Accept": "application/json;odata=verbose"
            };
            var executor = new SP.RequestExecutor(appweburl);
            executor.executeAsync({
                url: url,
                contentType: "application/json;odata=verbose",
                method: "GET",
                headers: requestHeaders,
                success: function (data) {
                    currentObject.userprofile = JSON.parse(data.body);
                    OnComplete(currentObject);
                },
                fail: function (error) {
                    currentObject.userprofile = null;
                    OnError(error);
                }
            });
        }
    }

    // getQueryStringParameter is assumed to be available on the page,
    // as in the standard SharePoint app-part boilerplate.
    var appweburl = decodeURIComponent(getQueryStringParameter("SPAppWebUrl"));
    NWF.FormFiller.Events.RegisterAfterReady(function () {
        execCrossDomainRequest();
    });
}

—————————————————————————————————————————————–

  • Inside the form's custom JavaScript, paste the code below

—————————————————————————————————————————————–

// EmployeeName, JobTitle and Department are the JavaScript variables
// assigned to the corresponding form controls in the designer.
var hosturl = decodeURIComponent(getQueryStringParameter("SPHostUrl"));
NWF$.getScript(hosturl + "/SiteAssets/sp.userprofile.js").then(function () {
    NWF$(document).ready(function () {
        NWF$('#' + EmployeeName).change(function () {
            OnEmployeeChange();
        });
        OnEmployeeChange();
    });
});

function OnEmployeeChange() {
    var selectedEmployee = NWF$('#' + EmployeeName).val();
    selectedEmployee = selectedEmployee.split(';');
    if (selectedEmployee[0] == "") {
        NWF$("#" + JobTitle).val('');
        NWF$("#" + Department).val('');
        return;
    }
    var userProfile = new SPUserProfile(selectedEmployee[0],
        function (data) {
            NWF$("#" + JobTitle).val(data.getTitle());
            NWF$("#" + Department).val(data.getDepartment());
        },
        function (error) {
            NWF$("#" + JobTitle).val('');
            NWF$("#" + Department).val('');
            alert('An error occurred while fetching userprofile data');
        });
}

—————————————————————————————————————————————–

Thanks !!

In Nintex 2010, 2013 and 2016 for SharePoint on-premises (even the Standard version), it was possible to use Excel Services to query and work with the data in .xlsx and .xls files. However, in SharePoint Online there is no such powerful mechanism. Moreover, the Nintex products for SharePoint Online (neither NWC nor Nintex for Office 365) offer any out-of-the-box actions that would fill that gap. So, in the end, there is no straightforward way to achieve it.

 

The most common workaround is to convert the XLSX file into a plain CSV file and then work with the data using collections (I will write about that in a second post).

 

Recently I realized that there is a set of Excel actions in Microsoft Flow! Everyone who has SharePoint Online also has the free version of Flow available.

Just be aware that the free version of Flow does not trigger itself the moment an event occurs. It just polls repeatedly, and if its check does not line up with the moment an event occurs, the flow might never get triggered.

For the paid version there is no such risk, as it works somewhat like a remote event receiver.

Anyway, I decided to give it a try.

 

Flow's boundaries in Excel actions

First things first: I want to let you know the boundaries of this solution. Flow is not as flexible in this area as I thought:

  1. To work with an Excel file's data at all, the data must be put in a table. So once you have a data set in the file, you must convert it into a table and name it (see "Rename an Excel table" on Office Support, or the "How to Name Excel Tables For Beginners" tutorial on YouTube):
    1. Select your data set;
    2. From "Tools" choose "Format as a table";
    3. Then go to "Design" tab;
    4. Set your table's name in the left top corner.

  2. If you put a variable in the action's "File name" configuration field, Flow will not let you automatically pick the table's name or use its columns later as variables. The same thing happens if you set "Table name" using a variable rather than a direct string:

  3. If you choose to upload the data from Excel into a SharePoint list, again you cannot use variables to set the list dynamically, because in that case you will not be able to bind the list columns to the Excel table columns:

However, you should still be able to choose your list dynamically using an "HTTP Request" action to get the list of its columns, and then another to insert the data.

Step-by-step solution

After you accept all the above boundaries and settle for a rather "fixed" Flow (still, you can create a Flow per list/Excel file, etc.), the working solution is built from the following components:

 

  1. SharePoint Import Library - this is where the Nintex workflow operates. In my case it takes the uploaded file and uploads it to a specific OneDrive for Business folder under a specific name (the one set as the source for the Flow). After it uploads the file, it calls the Flow using the "Web Request" action:

    1. "Authorizing user" - a parameter visible in the "OneDrive upload file" action - it expects a valid email address to which an email with the authorization request will be sent. The email looks like this:

      Once the user clicks "Provide OneDrive account credentials and authorize access" and then lets the app access their account info (on the next screen), the workflow will move on.

      According to the documentation (source: OneDrive upload file), the workflow will wait up to 7 days for the user's decision.

    2. "Body" - this parameter of the "Web Request" action needs to be a valid JSON string (Flow expects a JSON request body). As there is a known bug that prevents a straightforward JSON declaration (the opening and closing braces are somehow omitted), you need to do it the workaround way: declare a variable where the opening and closing braces are some specific tokens, then use a Regex action to replace those tokens with { and } accordingly.
  2. SharePoint List - the one into which the data from Excel is going to be imported. I added three columns - text and date.
  3. OneDrive for Business - there must be a specific folder that will be used to save the file and then be queried by the Flow.
  4. Microsoft Flow - the one that takes the Excel file, pulls out the data, and then for each row inserts it into the list from point 2:
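The bracket workaround from point 1.2 can be sketched like this. The token names "~open~" and "~close~" are illustrative; the replacement is what the Regex action performs:

```javascript
// Declare the JSON body using placeholder tokens for the braces,
// then replace the tokens the way the Regex action does.
function fixJsonBrackets(body) {
    return body.replace(/~open~/g, "{").replace(/~close~/g, "}");
}

var raw = '~open~"fileName":"import.xlsx"~close~';
var body = fixJsonBrackets(raw); // '{"fileName":"import.xlsx"}'
```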

 

Working example

  • I have created a simple Excel file:

 

  • I have uploaded it into the "Import Library"

If you don't plan to add an extra approval or other logic before the file gets queried and the data inserted into the list, you can simply upload the file to OneDrive straight away and change the Flow to be triggered once a new file is uploaded.

  • That action triggered my Nintex workflow. I then received an email with the request to authorize access to my OD4B. Once I did that, the file was uploaded:
  • Then I opened the Flow's status page and watched the rows being queried and uploaded to SharePoint:

Note that for a simple file with three columns and 10 rows, it took almost a minute to complete the query and import. For larger files this action can really run for hours.

  • And voilà! The list is filled with data:

Next steps

It all depends on your specific requirements. For example, you can now trigger a workflow on the list where the data was imported, so that each row requests an approval.

 

In a second post I will show how to import data from the Excel file when it is saved as CSV.

 

Thanks for reading!

 

 

Regards,

Tomasz

I've seen it quite often recently: questions about getting data from user profiles using the Office 365 Query User Profile action. In the end it is quite easy, but you must be familiar with some prerequisites.

 

SharePoint Online User Profile properties

Basically, there are two approaches:

  1. If you have an on-premises Active Directory, or an AD set up in Azure, you can synchronize properties from those locations into Azure Active Directory (AAD) using Azure AD Sync (AAD Sync, which replaces DirSync).
  2. If you are working exclusively in the Office 365 tenant, the properties can be managed, and their values set, using the https://[tenant_name]-admin.sharepoint.com/_layouts/15/tenantprofileadmin/MgrProperty.aspx?ProfileType=User SharePoint administration page.

 

With the first approach, adding new attributes into SharePoint Online is… quite hard. Using the AAD Connect tool you can synchronize your custom properties with AAD, but adding them to the SPO user profile is not that straightforward. The approach is described, for example, here: http://www.ericskaggs.net/blog/synchronizing-custom-active-directory-attributes-to-custom-user-profile-properties-in-sharepoint-online and I also found that this can be done using the Office Graph API 2.0, e.g.: https://worktogether.tech/2016/07/31/extension-attributes-in-azure-ad/.

 

The second approach is far easier, and I will focus on it. Just follow the link into the SPO admin center and go to the "User Profiles" --> "Manage User Properties" page; you will notice a list of currently used attributes. Click "New Property":

A form for creating the new property will be displayed. The most important setting here, to make sure the new property is available to the Office 365 Query User Profile action, is to set the "Default Privacy Setting" to "Everyone":

If you scroll down the form, you will notice a set of fields some of you might be familiar with from on-premises SharePoint: the mappings. These are not supported so far in SPO; the Online interface simply has remnants left over from the on-premises version.

 

Once you add the new property, go back to the "User Profiles" homepage and then to the "Manage User Profiles" page (https://[tenant_name]-admin.sharepoint.com/_layouts/15/tenantprofileadmin/ProfMngr.aspx?ConsoleView=Active&ProfileType=User). For test purposes, find your profile and click "Edit My Profile":

At the end of the form you should notice your custom field. Set its value and then "Save and Close" the form:

Do the same for any other profile, and note its account name.

 

Checking if the value is available

Before going to the workflow, check whether the value is already available to Everyone in the profile. Execute the following URLs (user profile REST API: https://msdn.microsoft.com/en-us/library/office/dn790354.aspx) in your browser (preferably Chrome) and check whether the new attribute is present:

  1. To get your profile: https://[tenant_name].sharepoint.com/_api/sp.userprofiles.peoplemanager/getmyproperties
  2. To get the other profile: https://[tenant_name].sharepoint.com/_api/sp.userprofiles.peoplemanager/getpropertiesfor(@v)?@v='i%3A0%23.f%7Cmembership%7C[login]%40[tenant_name].onmicrosoft.com'
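The tricky part of the second URL is encoding the claims-style account name. Here is an illustrative sketch (the function name and sample login are made up for the example):

```javascript
// Building the getpropertiesfor URL: the claims-encoded login must be
// URL-encoded, which is what turns "i:0#.f|membership|..." into the
// %3A/%23/%7C sequence shown in the URL above.
function buildProfileUrl(tenantUrl, accountName) {
    return tenantUrl +
        "/_api/sp.userprofiles.peoplemanager/getpropertiesfor(@v)?@v='" +
        encodeURIComponent(accountName) + "'";
}

var url = buildProfileUrl(
    "https://contoso.sharepoint.com",
    "i:0#.f|membership|jane@contoso.onmicrosoft.com");
```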

 

At the end of the listed XML data, you will notice your custom property, along with its type and value:

<d:element m:type="SP.KeyValue">
  <d:Key>MyCustomProperty</d:Key>
  <d:Value>Another Test Value</d:Value>
  <d:ValueType>Edm.String</d:ValueType>
</d:element>
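If you'd rather check the value in a script than in the browser, the SP.KeyValue elements can be picked out with PowerShell's XML support. A sketch, using a simplified stand-in for the full response envelope (the real feed wraps the elements in a few more layers):

```powershell
# Parse a SP.KeyValue element and read the value of a named property.
# The envelope below is a trimmed-down stand-in for the real REST response.
[xml]$response = @"
<entry xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"
       xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
  <d:element m:type="SP.KeyValue">
    <d:Key>MyCustomProperty</d:Key>
    <d:Value>Another Test Value</d:Value>
    <d:ValueType>Edm.String</d:ValueType>
  </d:element>
</entry>
"@

# Dot-notation matches on local element names, so the d: prefix can be ignored.
$prop = $response.entry.element | Where-Object { $_.Key -eq "MyCustomProperty" }
$prop.Value
```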

Obtaining property’s value using a workflow

Open the Nintex Workflow Designer. In the workflow, add the “Office 365 Query User Profile” action and configure it as follows:

  1. SharePoint Online URL – the “root” URL of your tenant.
  2. Username and Password – credentials of the account on whose behalf the profile query will be made. In this case it can be any account with at least “Read” access to the tenant, as the “User Profile” data is readable by everyone.
  3. Property – use the same “Name” as you defined for your property.
  4. Store property in – use a variable of the same datatype as the property.

 

If you log the values, you will see that the property's value is accessible:

How to access other properties?

If you have access to the SharePoint admin center, it is quite easy. Just go to the “User Profiles” --> “Manage User Properties” page and open the property whose value you’d like to obtain in your workflow. Check its “Name” and double-check that its “Default Privacy Setting” is set to “Everyone”. Note that for some predefined properties it is impossible to change this setting.

 

On the other hand, if you don’t have access to the SharePoint admin center, the easiest way is to open the URL https://[tenant_name].sharepoint.com/_api/sp.userprofiles.peoplemanager/getmyproperties and check the property names displayed there. To be even more sure that the property you are looking for is available to “Everyone”, use the second URL to open another user’s profile and check the properties there.

 

What about on-premises?

Frankly speaking, it’s even easier, because you can map custom attributes from AD DS: open SharePoint Central Administration, go to the “User Profile Service Application”, then “Manage User Properties”, and when creating the new property, map it to the Active Directory attribute. After that, trigger a “Full Profile Synchronization” so that the new property gets populated with data from your local AD (https://technet.microsoft.com/en-us/library/jj219646.aspx).

 

After doing that, you can simply use the “Query User Profile” action (http://help.nintex.com/en-us/nintex2016/current/sp2016/Workflow/ActionsCore/QueryUserProfile.htm), which provides a drop-down field to select the required property.

 

Good luck with Nintexing!

 

Regards,

Tomasz

 

 

Calling all Workflow Automation Innovators! 

 

The Nintex Customer Evidence Program would like to learn how you use and drive results with Nintex for Office 365. 

 

We're excited to produce compelling case studies, blog posts, customer interviews, and other collateral that amplifies how you're innovating digital business. 

 

If you would like to share or discuss your Nintex for Office 365 use case, please email farah.ahmed@nintex.com with your responses to the three questions below:

 

  1. How do you use Nintex for Office 365? 
  2. How does Nintex for Office 365 help you reach your business goals and success measurements? 
  3. What capabilities of Nintex for Office 365 are most impactful? 

 

Thank you so much for your time and support!

 

Cheers!

 

The Nintex Customer Evidence Program 

The last post showed how to get an overview of all the sites where the Nintex Workflow for Office 365 app has been activated. The next question is: which workflows exist in a given site, and which of these workflows were created using Nintex?

 

And again PowerShell is going to help out!

 

# $ctx, $web and $appPrincipalId are already set by the script from the previous post
Add-Type -AssemblyName System.Web
Add-Type -Path "d:\Microsoft.SharePoint.Client.WorkflowServices.dll"
$clientId = "client_id=" + [System.Web.HttpUtility]::UrlEncode($appPrincipalId)

$wfmgr = New-Object Microsoft.SharePoint.Client.WorkflowServices.WorkflowServicesManager($ctx, $web)
$wfdpl = $wfmgr.GetWorkflowDeploymentService()
$wfdefs = $wfdpl.EnumerateDefinitions($false)

$ctx.Load($wfdefs)
$ctx.Load($web)
$ctx.ExecuteQuery()

$webUrl = $web.Url
$items = @()
$wfdefs | % {
    # Deep link into the workflow gallery of the list the definition is scoped to
    $launchUrl = $webUrl + "/_layouts/15/appredirect.aspx?"
    $launchUrl += $clientId
    $launchUrl += "&redirect_uri=" + [System.Web.HttpUtility]::UrlEncode("https://workflowo365.nintex.com/Hub.aspx?{StandardTokens}&ListId={" + $_.RestrictToScope + "}&AppVersion=1.0.3.0")

    $items += [PSCustomObject]@{
        Name         = $_.DisplayName
        Region       = $_.Properties["NWConfig.Region"]
        Designer     = $_.Properties["NWConfig.Designer"]
        Entitlement  = $_.Properties["NWConfig.WorkflowEntitlementType"]
        Author       = $_.Properties["AppAuthor"]
        LastEditor   = $_.Properties["ModifiedBy"]
        LastModified = $_.Properties["SMLastModifiedDate"]
        Type         = $_.RestrictToType
        ScopeId      = $_.RestrictToScope
        Published    = $_.Published
        Link         = $launchUrl
    }
}
$items | Export-Csv -Path "workflow_inventory.csv" -NoTypeInformation -Delimiter ";" -Append

 

At first we need to include another DLL from the CSOM package in order to access workflow definitions. Let's have a closer look at the PowerShell script:

  1. In the previous post we already got the AppPrincipalId using the Execute-NintexAddinFinder function. We now need this Id in the $clientId line.
  2. Next we retrieve all workflow definitions using the WorkflowServicesManager.
  3. When we look at these definitions, we mostly care about the custom attributes added by Nintex. For example, one of those attributes holds the region to which the workflow has been published; we can also see the version of the workflow designer that was used.
  4. Finally we construct a URL which takes us directly to the workflow gallery of the list associated with the workflow.

This information can now be written to a CSV file, which can easily be opened in Excel. Excel can help filter the list further, for example to narrow it down to the workflows that were actually created with Nintex (e.g. workflows that have a designer version). To re-publish these workflows, just click the corresponding link.
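That Excel filter can also be sketched directly in PowerShell. The sample rows below are hypothetical stand-ins; on a real inventory you would instead load the file with Import-Csv -Path "workflow_inventory.csv" -Delimiter ";":

```powershell
# Keep only the rows that carry a Nintex designer version, i.e. workflows
# actually built with Nintex. Sample data stands in for the exported CSV.
$rows = @"
Name;Designer;Link
Approval flow;1.0.3.0;https://contoso.sharepoint.com/_layouts/15/appredirect.aspx?...
OOB flow;;https://contoso.sharepoint.com/_layouts/15/appredirect.aspx?...
"@ | ConvertFrom-Csv -Delimiter ";"

$nintex = $rows | Where-Object { $_.Designer }
$nintex | Select-Object Name, Designer
```

Rows with an empty Designer column (out-of-the-box SharePoint workflows) are filtered out, leaving only the workflows you need to re-publish.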

 

Final thoughts

Thanks to PowerShell it only took a couple of minutes to go through my 250 site collections, and as a result I got a list of all my workflows. I still needed to re-publish the workflows manually, but at least I had direct links to the corresponding workflow galleries!

 

UPDATE (2017-04-26):

You can find the complete script in my GitHub-Repo.

Recently I've been involved in a project where data and information created over the years in on-premises SharePoint had to be migrated to the SharePoint Online environment in Office 365. The migration was said to be simple, straightforward and easy thanks to Sharegate, but... the truth turned out to be much darker.

 

Before the real migration started I sat down and started reading about the process and possible obstacles. I can now divide them into 3 groups:

  1. Limitations of Sharegate
  2. Limitations of SharePoint Online
  3. Limitations of Nintex for Office 365.

 

First things first – I will guide you through each of them.

 

Limitations of Sharegate

Sharegate is a really great tool. Honestly. It saved me dozens of hours, though it also frustrated me at times. But it is still great. It makes non-straightforward migrations possible (like from SP2007 directly to SP2013, and so on), it is the only tool that allows automatic migration from on-premises to online, and it can migrate sites, users, managed metadata and much more.

Currently Sharegate supports migration of 30 actions that are present on-premises (source: Nintex FAQ – Sharegate):

 

On-premises action → Mapped action in Office 365:

Assign Flexi task → Start a Task Process
Calculate date → Add Time to Date
Change state → Change State
Check in item → Check In Item
Check out item → Check Out Item
Convert Value → Convert Value
Create item → Create List Item
Delete item → Delete Item WM Action
Discard check out → Discard Check Out Item
End workflow → Terminate Current Workflow
Filter → Filter
Log in history list → Log to History List
Loop → Loop with Condition
Math operation → Do Calculation
Pause for... → Pause For Duration
Pause until... → Pause Until
Query List → Query List
Regular expression → Regular Expression
Run if → Run If
Run parallel actions → Parallel Block
Send notification → Send an email
Set a condition → Conditional Branch
Set field value → Set field in current item
Set variable → Set Workflow Variable
Set workflow status → Set Workflow Status
State machine → State Machine
Switch → Switch
Twitter Tweet → Twitter Tweet
Update item → Update List Item
Yammer Message → Yammer Message 

 

To use Sharegate you really have to review each of your workflows and rebuild it so that it can be migrated. Note that:

  1. there are some crucial actions that are totally unsupported but can be worked around;
  2. there are actions that simply cannot be migrated and must be re-created after the migration.

 

What to do with them? I recommend the following approach:

 

BEFORE MIGRATION

  1. User defined actions (UDAs) – during migration they will be replaced with a blank placeholder, leaving you with nothing; all actions inside will be lost. I recommend replacing each UDA with a "Run If" action block (which is migrate-able), putting all the actions from the UDA inside it, and setting its condition to always evaluate to "true".
  2. Query list must have a defined number of rows to be returned; the field cannot be left empty. Also, by default, querying a list is always "recursive" in O365.
  3. Action sets, like UDAs, are not supported. See point no. 1 for how to resolve this.
  4. For each loops are likewise not supported, just like UDAs and action sets. See point no. 1 for how to resolve this.
  5. Workflow constants are not supported in Nintex for O365, so Sharegate cannot migrate them. Either remove them, turn them into workflow variables, or create a SP list to hold them (if you use them in many workflows) and query that list.
  6. Workflow context variables – some are not supported in Nintex for O365 (like the one with Approval Comments), so Sharegate cannot migrate them. Either replace them with a workflow variable or remove them, because they may cause trouble during migration.
  7. Inline functions are not supported in Nintex for O365, so Sharegate cannot migrate them. Either replace them with a workflow variable or remove them, because they may cause trouble during migration.
  8. Commit pending changes – this action does not exist in O365 because the architecture is different: actions are not grouped and executed in batches, everything runs synchronously. All such actions should be deleted beforehand.

 

AFTER MIGRATION

On-premises action → Office 365 counterpart (with remarks):

Query/ Update XML → Query/ Update XML. Not supported because of the architecture differences; Sharegate will migrate them as blank placeholders.
User Profile actions → Office 365 User Profile actions. Sharegate will migrate them as blank placeholders.
Flexi task → Start a task process / Assign a task (depending on how many approvers you have). There is no "To Do" task either. You cannot set mail priority, you cannot choose which of the "assigned" users receives the notification, you cannot delegate (workaround: https://community.nintex.com/community/build-your-own/blog/2015/02/26/delegation-in-o365), and you cannot calculate the time to send a reminder. Oh, and you cannot attach files to the notification message. Some workarounds are possible after the migration, some are not available at all.
Search query → Office 365 Search Query. Sharegate will migrate it as a blank placeholder.
Dynamics CRM → "Dynamics CRM" actions. Sharegate will migrate it as a blank placeholder.
Convert document → "Document Generation". Sharegate will migrate it as a blank placeholder.
Set item permissions → Office 365 update item permissions. Sharegate will migrate it as a blank placeholder.
Update document → no direct counterpart. Sharegate will migrate it as a blank placeholder.
Create item in another site → Office 365 Create List Item or Document Set. Sharegate will migrate it as a blank placeholder.
Update/ Delete multiple items → possibly an HTTP Request call. Sharegate will migrate it as a blank placeholder.
Build string → Build string. Not supported for migration because of the architecture differences; Sharegate will migrate it as a blank placeholder.
Collection operation → the appropriate Collection operation. The actions differ: there is a dedicated "Collection operation" action per operation, not one for all as on-premises. Sharegate will migrate them as blank placeholders.
Store/ Retrieve data → no direct counterpart. There is no way for workflows in O365 to "talk" to each other. Consider changing the workflow design.
Start workflow → Start workflow. Not supported for migration because of the architecture differences; Sharegate will migrate it as a blank placeholder.
Create site collection → Office 365 create site collection.
Delete site → Office 365 delete site.
Delegate workflow task → Assign a task plus custom work. For example: https://community.nintex.com/community/build-your-own/blog/2015/02/26/delegation-in-o365 
Request approval → Office 365 set approval status. Not supported for migration because of the architecture differences; Sharegate will migrate it as a blank placeholder.
And many, many more... Read the attachment.

 

The complete list of "non-migratable" actions can be found here: Nintex FAQ – Sharegate.

 

Limitations of SharePoint Online

One of the most irritating limitations are the thresholds (Software boundaries and limits for SharePoint 2013), and in Online you CANNOT CHANGE THEM. Be aware of them and don't get angry when migrating:

  1. A workflow file cannot exceed 5MB (source: https://community.nintex.com/message/57706-re-what-is-the-default-limitation-in-nintex-for-exporting-a-workflow?commentI…), which simply means it shouldn't have more than roughly 100 actions inside.
    HOWEVER, I had the "pleasure" of migrating workflows with as few as 5 actions without success, and so far I have no idea why Sharegate was reporting errors during migration.
  2. An InfoPath file cannot be larger than 5MB (be aware if you have forms with attachments).
  3. A list/library cannot return more than 5,000 items in a view (the list view threshold). If you exceed it during the migration, be prepared to observe abnormal behavior of your tenant.

 

Limitations of Nintex for Office 365

Those limits have already been described in the "Limitations of Sharegate" section. They are most often caused by differences in the SharePoint Online hosted environment, so some actions were simply impossible to re-create and had to be built from scratch, while others have a different architecture and cannot be directly migrated. And still, there is a sizeable gap in functionality in Nintex for Office 365:

 

  1. There is no conditional start for workflows; however, it can be replaced with the Filter action.
  2. There are no site scheduled workflows, but you can work around this either with a 3rd party tool (Plumsail Workflow Scheduler) or manually: Scheduled Workflows in Office 365.
  3. When comparing values, you are more often than not unable to verify whether a value "is empty". You can only compare whether it equals something or not (but you cannot set the comparison value to nothing).
  4. No "workflow constants".
  5. No "inline functions".
  6. No "UDA" actions.
  7. No action sets.
  8. No support for custom actions.

 

Summary

I just felt that this post couldn't end like that, so here is some hope. Both Sharegate and Nintex keep developing their products – Sharegate to support migration of new actions (I heard UDAs and Action Sets are on their roadmap); what Nintex is working on you can find here: 3 - Nintex Workflow for Office 365: Hot (300 ideas) – Customer Feedback for Nintex.

 

I wish you all the best in your migration projects!

If you have any questions or stories you'd like to share, feel free to leave them in the comments. I'm sure there are more issues I am not yet aware of.

 

Regards,

Tomasz