Last week I made a workflow mistake, and not my first. Part of it I'll attribute to a confusing setting in Nintex; the other part is purely mine. The setting involved the task field "% Complete", which I used the way I did in SharePoint. I was looking for all tasks NOT Complete, which I interpreted to mean less than 100. Nintex wasn't viewing it that way: it sees % Complete as a fraction of 1, with 1 being complete, so any decimal value under 1 counts as not complete. So for the first part of the error, I pulled in the complete tasks along with the incomplete ones.
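A minimal sketch of that gotcha, in Python rather than Nintex syntax (the field and list names here are illustrative): SharePoint stores "% Complete" internally as a fraction of 1, even though the UI shows a percentage, so a "less than 100" filter matches everything.

```python
# Illustration of the "% Complete" gotcha described above.
# The stored value is a fraction of 1 (1.0 = complete), even though
# the UI displays it as a percentage. Names here are illustrative.

tasks = [
    {"Title": "Draft spec",   "PercentComplete": 0.25},  # shown as 25%
    {"Title": "Review spec",  "PercentComplete": 1.0},   # shown as 100%
    {"Title": "Publish spec", "PercentComplete": 0.0},   # shown as 0%
]

# Wrong: comparing against 100 matches every task, including the
# completed one, because even 1.0 is less than 100.
wrong = [t for t in tasks if t["PercentComplete"] < 100]

# Right: compare against 1.0, the stored "complete" value.
right = [t for t in tasks if t["PercentComplete"] < 1.0]

print(len(wrong))  # 3 -- the completed task slipped in by mistake
print(len(right))  # 2 -- only the genuinely incomplete tasks
```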
The second part of the error was strictly bad programming: I was in a loop and forgot to initialize a variable that was holding text, so text got copied where it shouldn't have been. Mea culpa!
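That second bug is a classic carry-over error, sketched here in Python (again illustrative, not Nintex syntax): a text variable that isn't reset at the top of each loop iteration keeps its previous value and leaks into items that should have been blank.

```python
# Sketch of the loop bug described above: a text variable that isn't
# re-initialized each iteration carries its old value forward.

items = [{"id": 1, "note": "urgent"}, {"id": 2}, {"id": 3, "note": "low"}]

# Buggy version: `note_text` keeps its previous value when an item
# has no note, so item 2 wrongly inherits "urgent".
results_buggy = []
note_text = ""
for item in items:
    if "note" in item:
        note_text = item["note"]
    results_buggy.append(note_text)

# Fixed version: reset the variable at the top of every iteration.
results_fixed = []
for item in items:
    note_text = ""  # initialize before use
    if "note" in item:
        note_text = item["note"]
    results_fixed.append(note_text)

print(results_buggy)  # ['urgent', 'urgent', 'low']
print(results_fixed)  # ['urgent', '', 'low']
```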
Fortunately, I had thought to save the list as a template before I started the operation, so I was able to restore the data.
The question is: how often do you use a test environment?
We have one, but it is never set up the way we need it for testing. Most of the time I run workflows on production; sometimes, for complex workflows, I will use a test list on production.
I'm betting that the production environment is used 90% of the time, since Nintex is a tool for the average SharePoint user, who may not even know how to get to a test environment.
This is a very interesting question. Personally, I think there is no final answer. The effort you need to put into testing in a test environment depends heavily on the risk and impact of your workflow and the data it's working with. There are lots of simple tasks that make absolutely no sense to build in a testing environment: basic approvals, notifications, easy decision workflows, and so on.
But when it comes to sensitive data, loops, external data, or data that others depend on, I always prefer to build the scenario first in your staging/testing environment. A lot of our customers share your point that testing environments rarely reflect the production environment, making it hard to build valid test scenarios there. I think that's the first thing to change. You wouldn't learn to drive in a Bobby Car; you need a real car to know what you're doing. Testing only makes sense when you have a testing environment that holds as much of the production data as possible and gives you the same infrastructure to work in.
It's not that difficult to achieve, and I'd suggest you talk with whoever is responsible for fixing this.
I have never had the luxury of a test environment. However, I do typically set up a subsite as a testing space for workflows. I also build testing steps into a workflow so that I can test a production workflow without causing any issues.
Ah, the age-old developing-and-deploying-in-SharePoint discussion.
Like many have said, testing environments are only strictly worthwhile if they are kept in line with the live environment.
I've always taken the approach that if I'm developing a new application, I develop it in the live environment, because, well, nobody is using it until the app goes live. Once I've got the app to a point where I'd happily call it 1.0, I back up the site collection and restore it to my development environment.
With regard to testing, it's terrible practice to test your own work anyway; I'd be interested to know how many people get a dedicated test resource. I'm quite lucky in that I have a guy in my current place who's not a tester at all, but is very methodical and thorough and has a good go at breaking everything I do before I send it on to UAT.
I guess it just depends on who's available and how extensive the change is when determining how much testing it gets.
I agree: I'm the worst person to test my own code.
I'm writing a book on d3 and SharePoint and just found out today that something I thought was working on SharePoint is not. This had escaped me for two months.
I've converted this to a discussion, as I don't think there is any real "one true answer" to this!
For myself, I use a dev environment when working for clients on-prem. They do UAT in the dev environment, so we don't push everything to a second area.
For internal work on O365, there's no dev environment, so I test on dummy lists, dummy AD groups, and so forth.
Like you, it does depend on the client I'm working with at the time. Currently I have a standalone dev machine and a live environment. Not ideal, but they're in the process of freeing up resources to give me a preprod environment.
My MSDN subscription is vital, though, as it gives me an O365 tenant to develop on and Azure credits to host my SharePoint farm for my own development. This is great for showcasing products to potential clients without them having to install anything on their infrastructure.
Now... if only Nintex would provide some sort of consultant license for, effectively, selling their products to potential clients!
Hey Frank Field, didn't see you there... by the way, whilst you're here, any news on this?
Many people here have suggested the right approach: use a test environment for complex workflows and only then move to prod.
Unfortunately, my experience with this approach has been very frustrating, for the reasons below:
1) Nintex workflow export/import is a good feature, but most of the time I see workflow actions like "Query List" losing their settings. We need to reconfigure the action once a workflow is imported into the production environment.
2) Nintex has issues with double-byte characters. If a Nintex form (list-based or within a workflow) has any double-byte labels, the text is garbled after import! (I work with a lot of Japanese text.)
So I work directly on production for new workflows and sites and keep my fingers crossed.
If we have already done UAT in a test environment, then after importing the workflow to prod I make sure at least one of my teammates reviews the entire workflow, action by action! (Yes, painful... but better than losing data.)
I suspected I'm not alone in testing on production, for a number of reasons, such as whether the test environment really mimics production.
I've experienced import errors where the workflow loses its configuration settings after a list name change, but generally it helps to have the structure of the workflow in place, even if I have to reconfigure certain items.
I do imports with two windows: in one window I have the original workflow open so I can check the settings if they get scrambled in an individual item.
I created a subsite with duplicate lists for testing my workflows. I also have "test-only" actions that I can disable or enable as needed (most of these set variables for the people who get tasks later in the flow). This lets me set all the critical variables to my own email address, so when I execute the flow, I receive all the email tasks as if I were the actual recipient.
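The "test-only actions" pattern above can be sketched as follows, in Python rather than Nintex actions (the flag, addresses, and step names here are all hypothetical): a single test-mode switch redirects every task assignment to the workflow author so the flow can be run end to end without bothering the real recipients.

```python
# Sketch of the "test-only actions" pattern described above.
# A single flag overrides all task recipients with the author's
# address; everything here is illustrative, not Nintex syntax.

TEST_MODE = True
MY_EMAIL = "me@example.com"  # hypothetical author address

# The recipients a production run would actually use.
ASSIGNMENTS = {
    "approve": "manager@example.com",
    "review":  "qa@example.com",
}

def resolve_recipient(step: str) -> str:
    """Return the task recipient for a step, overridden in test mode."""
    if TEST_MODE:
        return MY_EMAIL  # every task lands in the author's inbox
    return ASSIGNMENTS[step]

print(resolve_recipient("approve"))  # me@example.com while TEST_MODE is on
```

Flipping `TEST_MODE` off (the equivalent of disabling the test-only actions) restores the real assignments without touching the rest of the flow.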