
Nintex Forms – load SQL Server table contents into repeating section

Blog Post created by tposzytek Champion on Oct 8, 2018

This post is a “how to” tutorial, showing how data from a SQL Server table can be loaded into a “Repeating section” control in Nintex Forms. Needless to say, having that data inside the control then allows you to manipulate it and treat each row as a single record, thanks to the possibility of saving the contents of the control as XML.

The solution I created was a bit more advanced than just importing data from SQL into a repeating section, as it also allowed bulk edit/delete operations on the loaded data. This tutorial, however, shows just the loading part.

General assumptions

The solution works in the following way:

  1. The form has a “SQL request” control that, on form load, gets all the data from a specific table.
  2. Data is returned in XML format.
  3. Inside the form there is a repeating section with several fields defined.
  4. The script, once triggered, parses the XML from the “SQL request” control and, for each row, populates the defined fields inside the repeating section.
  5. After a row is done, it simply adds a new row to the section and continues until all data is processed.

How to step by step

First, prepare your repeating section and fill it with fields of your choice. They should correspond to the types of the columns in your SQL Server table, e.g. datetime to a date control, varchar(50) to a text control and unlimited varchar to a multiline text control. You can also place labels if you’d like to show some data as read-only.

Repeating section design

 

Second, define and set a “Control CSS class” for each of those fields. These classes are essential: the script uses them to inject text into the proper fields.

Control CSS class

 

Note! You cannot use a JavaScript variable to store control IDs, as this is a repeating section, and a variable storing the ID would only keep the last generated one.
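To see why, here is a minimal, framework-free sketch (the IDs are made up): assigning each generated control’s ID to one variable simply overwrites the previous value, so only the last row’s ID survives. A per-field CSS class, by contrast, matches every row.

```javascript
// Why a single JavaScript variable will not work for controls in a
// repeating section: every generated row overwrites the variable,
// so only the last control ID survives. Illustrative IDs only.
var controlId;
['id-row-1', 'id-row-2', 'id-row-3'].forEach(function (id) {
  controlId = id; // what "store the control ID in a variable" really does
});
console.log(controlId); // 'id-row-3' – the IDs of earlier rows are lost
```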

Third, add and configure the “SQL request” control. I am using the same approach here as in https://poszytek.eu/nintex-en/nf2016-en/show-sql-server-table-inside-nintex-form/ – the “FOR XML PATH('')” statement, thanks to which data from SQL Server is returned as a single row and column, in XML format.

 

The query I have in my case is as follows:

SELECT TOP(1) (
    SELECT column1, column2, column3, column4, column5, ...
    FROM myTable
    ORDER BY column1
    FOR XML PATH('record')
) AS datatable
FROM myTable

The results are as follows:

<record><column1>AAAAA</column1><column2>2018-10-01T00:00:00</column2><column3>ZZZZZZZ</column3><column4>Some other information 1</column4><column5>AAAA</column5>...</record><record><column1>AAAAA</column1><column2>2018-10-01T00:00:00</column2><column3>ZZZZZZZ</column3><column4>Some other information 1</column4><column5>AAAA</column5>...</record>...
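Before wiring this into the form, the splitting step can be verified standalone: the snippet below uses the same regex as the script later in this post to break the returned string into one fragment per record (the sample data is made up):

```javascript
// Standalone check of the parsing step: the single XML string returned by
// the "SQL request" control is split into <record>…</record> fragments
// with the same regex the form script uses. Sample values are made up.
var xmlData = '<record><column1>AAAAA</column1><column2>2018-10-01T00:00:00</column2></record>' +
              '<record><column1>BBBBB</column1><column2>2018-10-02T00:00:00</column2></record>';

var records = xmlData.match(/<record>(.*?)<\/record>/g);
console.log(records.length); // 2 – one entry per table row
console.log(records[0]);     // the first <record>…</record> fragment
```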

And lastly – the script. It is written in jQuery and its main purpose is to parse the XML and load the data into the fields inside the repeating section.

The script

Once the script is triggered, it first parses the XML, provided the data is not yet loaded (i.e. the XML has not yet been parsed):

function buildRepeatingSection (xmlData, loadedRows) {
  loadedRows = loadedRows || loadedRowsNo; /* loadedRowsNo is a global counter */
  var date1 = new Date ();
  var date2 = new Date ();

  /* split the single XML string into one entry per <record> element */
  var xmlData = xmlData.match (/<record>(.*?)<\/record>/g);
  var parsedData = new Array ();

  NWF$.each (xmlData, function (i, val) {
    var xmlRowData = NWF$.parseXML (val);
    NWF$xmlRowData = NWF$ (xmlRowData);
    NWF$column1 = NWF$xmlRowData.find ('column1');
    NWF$column2 = NWF$xmlRowData.find ('column2');
    NWF$column3 = NWF$xmlRowData.find ('column3');
    NWF$column4 = NWF$xmlRowData.find ('column4');
    NWF$column5 = NWF$xmlRowData.find ('column5');

    /* store one array per row: the section CSS class plus the column values */
    parsedData[i] = new Array (
      sectionClass,
      NWF$column1.text (),
      NWF$column2.text (),
      NWF$column3.text (),
      NWF$column4.text (),
      NWF$column5.text ()
    );
  });

  noOfRows = parsedData.length;
  dataArray = parsedData;

  NWF$ ('.StatusLabel label').text ('Number of loaded rows: 0 / ' + noOfRows);

Next, once the data is parsed, this function calls another function that handles loading the data into the repeating section. The call to the “processRowsDataArray” function passes it the parsed data and the time allowed for a single run, together with two callback functions – one called after each row is added, the second once all rows are added.

/* Call of the function that gets created array from SQL and processes it to add rows to repeating section */
processRowsDataArray (dataArray,timeOfSingleRun,
    function () {
      date2 = new Date ();
      NWF$ ('.StatusLabel label').html (
        'Number of loaded rows: ' +
          currentRow +
          ' / ' +
          noOfRows +
          '<br/>Time elapsed (seconds): ' +
          (date2 - date1) / 1000
      );
    },
    function () {
      /* remove last, empty row, as it is always empty */
      NWF$ ('.repeatingSection .nf-repeater-row:last')
        .find ('.nf-repeater-deleterow-image')
        .click ();
      /* hide the "add new row" link */
      NWF$ ('.repeatingSection')
        .find ('.nf-repeater-addrow')
        .css ('visibility', 'hidden');

      dataLoaded = true;
    }
  );
}

The “processRowsDataArray” function is built using a pattern that simulates asynchronous execution for synchronous functions. It uses “setTimeout” to split the data to be added into smaller chunks, thanks to which the UI (especially in Internet Explorer) does not become totally unresponsive.

The function runs in a loop until the timeout for a single run or the total number of rows is reached. This allows the script to act asynchronously:

function processRowsDataArray(array, maxTimePerRow, callback, finished) {
    maxTimePerRow = maxTimePerRow || timeOfSingleRun;

    function now() {
        return new Date().getTime();
    }

    function addRow() {
        var startTime = now();

        while (currentRow < array.length && (now() - startTime) <= maxTimePerRow) {
            var i = currentRow;

            NWF$(".repeatingSection .nf-repeater-row:last").find('input.column1').val(array[i][1]);
            NWF$(".repeatingSection .nf-repeater-row:last").find('input.column2').val(array[i][2]);
            NWF$(".repeatingSection .nf-repeater-row:last").find('.column3 input').val(array[i][3]);
            NWF$(".repeatingSection .nf-repeater-row:last").find('.column4').val(array[i][4]);
            NWF$(".repeatingSection .nf-repeater-row:last").find('.column5 input').val(array[i][5]);

            /* hide the image for row deletion */
            if (hideNativeRepeatingSectionControlls)
                NWF$(".repeatingSection .nf-repeater-row:last").find('.nf-repeater-deleterow-image').css("visibility", "hidden");

            /* add the next row */
            NWF$(".repeatingSection").find('a').click();
            currentRow += 1;

            callback.call();
        }
        if (currentRow < array.length && currentRow < maxRows) {
            setTimeout(addRow, 1);
        }
        else {
            finished.call();
        }
    }
    addRow();
}
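Stripped of the Nintex-specific calls, the same pattern can be sketched as a minimal, framework-free chunked processor. The names processInChunks, handleItem and done are illustrative, not Nintex APIs:

```javascript
// Chunked-processing pattern: work in a tight loop until a time budget is
// spent, then yield to the browser with setTimeout before continuing, so
// the UI can repaint between chunks.
function processInChunks(items, maxMillisPerChunk, handleItem, done) {
  var current = 0;

  function runChunk() {
    var start = Date.now();
    // Keep processing while there is work and the time budget is not spent.
    while (current < items.length && (Date.now() - start) <= maxMillisPerChunk) {
      handleItem(items[current], current);
      current += 1;
    }
    if (current < items.length) {
      // Yield to the event loop, then continue with the next chunk.
      setTimeout(runChunk, 1);
    } else {
      done();
    }
  }
  runChunk(); // the first chunk runs synchronously, like addRow() above
}

// Example: with a generous budget everything completes in the first chunk.
var loaded = [];
var finishedFlag = false;
processInChunks(['a', 'b', 'c'], 1000,
  function (item) { loaded.push(item); },
  function () { finishedFlag = true; });
```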

Results

The following recording shows how my solution works. In Internet Explorer, parsing the XML and loading it into the repeating section is extremely slow. Therefore I have also added a variant that splits the work into smaller, 40-row chunks (for the demo I changed it to 5), so that the user can decide whether to load more or not:

Working example in IE

Final notes

What I find significant and important to point out here is that this solution performs very differently across the most popular browsers. The difference lies in the time of execution.

The fastest one is Chrome. Loading of 200 rows takes around 2 seconds. Edge needs around 17 seconds to complete. Meanwhile Internet Explorer… well… it needed almost 6 minutes in my tests!

If you would like to use this solution with large sets of data, I do not think you should – especially not in IE. Maybe consider PowerApps or another approach for loading the data into the form (without parsing it into the repeating section). Note that this is a very time-consuming job for the browser.
