
Hello, 

 

I am trying to create a SmartObject to bulk upload a CSV file to a database, so that users can upload and download multiple data points at once. Does anyone have experience doing this? Do you know what method could be used?

 

The SQL code would hopefully follow:

BULK INSERT
   [ database_name . [ schema_name ] . | schema_name . ] [ table_name | view_name ]
      FROM 'data_file'
     [ WITH
    (
   [ [ , ] BATCHSIZE = batch_size ]
   [ [ , ] CHECK_CONSTRAINTS ]
   [ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ]
   [ [ , ] DATAFILETYPE = { 'char' | 'native' | 'widechar' | 'widenative' } ]
   [ [ , ] DATASOURCE = 'data_source_name' ]
   [ [ , ] ERRORFILE = 'file_name' ]
   [ [ , ] ERRORFILE_DATASOURCE = 'data_source_name' ]
   [ [ , ] FIRSTROW = first_row ]
   [ [ , ] FIRE_TRIGGERS ]
   [ [ , ] FORMATFILE_DATASOURCE = 'data_source_name' ]
   [ [ , ] KEEPIDENTITY ]
   [ [ , ] KEEPNULLS ]
   [ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ]
   [ [ , ] LASTROW = last_row ]
   [ [ , ] MAXERRORS = max_errors ]
   [ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]
   [ [ , ] ROWS_PER_BATCH = rows_per_batch ]
   [ [ , ] ROWTERMINATOR = 'row_terminator' ]
   [ [ , ] TABLOCK ]

   -- input file format options
   [ [ , ] FORMAT = 'CSV' ]
   [ [ , ] FIELDQUOTE = 'quote_characters' ]
   [ [ , ] FORMATFILE = 'format_file_path' ]
   [ [ , ] FIELDTERMINATOR = 'field_terminator' ]
   [ [ , ] ROWTERMINATOR = 'row_terminator' ]
    ) ]

 

However, things like batch size would vary by upload.
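Something along these lines is roughly what I had in mind: a stored procedure that takes the file path and batch size as parameters and builds the BULK INSERT dynamically. The table, procedure, and parameter names below are just placeholders, and the file would need to sit somewhere the SQL Server service account can read.

CREATE PROCEDURE dbo.usp_BulkUploadCsv
    @DataFile  NVARCHAR(260),   -- full UNC path to the uploaded CSV
    @BatchSize INT = 10000      -- varies per upload, so it is passed in
AS
BEGIN
    SET NOCOUNT ON;

    -- BULK INSERT does not accept variables for the file name or its options,
    -- so the statement is built as a string and executed dynamically.
    -- FORMAT = 'CSV' needs SQL Server 2017 or later; on older versions only the
    -- field/row terminators would be used.
    DECLARE @sql NVARCHAR(MAX) =
        N'BULK INSERT dbo.DataPoints
          FROM ''' + REPLACE(@DataFile, '''', '''''') + N'''
          WITH (
              FORMAT          = ''CSV'',
              FIRSTROW        = 2,
              FIELDTERMINATOR = '','',
              ROWTERMINATOR   = ''0x0a'',
              BATCHSIZE       = ' + CAST(@BatchSize AS NVARCHAR(10)) + N',
              TABLOCK
          );';

    EXEC sys.sp_executesql @sql;
END;

The SmartObject would then only need to pass the per-upload values into the procedure rather than generating the whole BULK INSERT statement itself.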

 

Any suggestions on this, or on other ways that users could bulk upload, would be highly appreciated! Thanks,

 

 

Interesting scenario.


One option would be to wrap this in a stored procedure and then use the SQL Server Service to execute it (a rough example of that call is below the links). The only other option I can think of is the PowerShell broker from the K2 Market:


http://community.k2.com/t5/K2-blackpearl/PowerShell-Wizard/ba-p/981


http://community.k2.com/t5/K2-blackpearl/PowerShell-Service-Object/ba-p/1025
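For the stored procedure route, the SQL Server Service instance (or the PowerShell script) only has to execute the procedure once per upload. Reusing the placeholder procedure name from the post above, with the path and batch size coming from the user, the call would look something like:

EXEC dbo.usp_BulkUploadCsv
     @DataFile  = N'\\fileserver\uploads\datapoints.csv',
     @BatchSize = 5000;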


 


Hope this helps.


Vernon

