CSV Handler details

A comma-separated values (CSV) file is a delimited text file that uses a comma (other approved delimiters can be used instead) to separate values. Each line of the file is a data record and each record consists of one or more fields, separated by commas. A CSV file typically stores tabular data (numbers and text) in plain text, in which case each line will have the same number of fields.

The built-in CSV Handler channel is very similar to the JSON Handler channel in how it handles source files. Both channels are extensible, designed to allow the addition of new capabilities and functionality; however, only the CSV Handler will process files delimited by semicolons, tabs, and spaces in addition to commas.
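As an illustrative sketch of what "other delimiters" means in practice, the snippet below parses the same record delimited four different ways using Python's standard `csv` module. The sample data and the `parse_csv` helper are hypothetical, not part of the channel itself.

```python
import csv
import io

def parse_csv(text, delimiter=","):
    """Parse delimited text into a list of rows (lists of field strings)."""
    return list(csv.reader(io.StringIO(text), delimiter=delimiter))

# The CSV Handler accepts commas, semicolons, tabs, and spaces as delimiters.
comma_rows = parse_csv("name,qty\nwidget,3\n")                 # comma
semi_rows  = parse_csv("name;qty\nwidget;3\n", delimiter=";")  # semicolon
tab_rows   = parse_csv("name\tqty\nwidget\t3\n", delimiter="\t")
space_rows = parse_csv("name qty\nwidget 3\n", delimiter=" ")
```

Each call yields the same two rows, `["name", "qty"]` and `["widget", "3"]`, regardless of delimiter.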

Channel details

CSV Handler is a built-in channel and will not count towards the channel entitlement quota. However, there are some limitations when using the steps in this channel:

  • There is a 1 MB total capacity, made up of a maximum of 0.95 MB for the compressed CSV file and 0.05 MB for the Headers, Query Params, and other fields.
  • A fetch will time out after 10 minutes.

CSV Sources category

Incoming CSV (Trigger step)

This step is triggered when an HTTP request with a CSV payload is made to an auto-generated webhook endpoint. You can receive incoming CSV files using any of the HTTP methods (POST, GET, PUT, PATCH, or DELETE), and you can protect the endpoint with the Authentication schema field or leave it open. Currently the only supported authentication method for this step is Basic Authentication (set Username and Password).
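A caller delivering a CSV payload to such a webhook would typically send a `text/csv` body with a Basic Authentication header. The sketch below builds (but does not send) that request with Python's standard library; the endpoint URL is a placeholder, not a real Pipelines webhook.

```python
import base64
import urllib.request

def build_incoming_csv_request(url, csv_body, username, password, method="POST"):
    """Build an HTTP request that delivers a CSV payload to a webhook
    endpoint protected by Basic Authentication. Illustrative only."""
    credentials = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=csv_body.encode("utf-8"),
        method=method,
        headers={
            "Content-Type": "text/csv",
            "Authorization": f"Basic {credentials}",
        },
    )

# Placeholder endpoint; a real one is auto-generated by the step.
req = build_incoming_csv_request(
    "https://example.com/webhook/abc123",
    "id,name\n1,widget\n",
    "user", "secret",
)
```

Sending it would be a call to `urllib.request.urlopen(req)`; any of the supported methods (POST, GET, PUT, PATCH, DELETE) can be passed via the `method` argument.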

The Incoming CSV step can filter the incoming request based on the request method type on two different levels:

    • At the system level
      • This filtering happens even before the trigger is activated. You can enable system-level filtering using the configuration options.

    • At the design level
      • In the Pipeline designer, you can use the Add conditions button at the bottom of the step to branch the pipeline logic based on the request method type or to skip/drop the received request.
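The design-level branching can be pictured as a simple routing function: given the exported request method, run create logic, run update logic, or skip the request. The function below is a hypothetical sketch of that decision, not Pipelines code.

```python
def route_by_method(method):
    """Route a request by its HTTP method, mirroring design-level
    Add conditions branching (illustrative sketch only)."""
    method = method.upper()
    if method == "POST":
        return "create"   # branch to create logic
    if method in ("PUT", "PATCH"):
        return "update"   # branch to update logic
    return "drop"         # skip/drop the received request
```

In the designer the same decision is expressed as conditions on the step's exported Method field rather than as code.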

Available step fields

  • Authentication Schema - Basic Authentication only; restricts who can access the auto-generated endpoint.
  • Incoming request's method type - Ignore incoming requests that do not match what you are expecting. Select POST, PUT, PATCH, or ANY BELOW. Requests filtered out this way will not trigger the pipeline.
    Note: If you choose ANY BELOW, you can Add conditions based on the exported Method field. The following step could then perform create logic for the Method POST and update logic for the Method PUT.
  • Username - Enter the username needed to access the file.
  • Password - Enter the password associated with the username entered.

Fetch CSV (Action step)

Use this step to fetch a CSV file or resource from a URL.

  • File channels (channels that work with files, e.g. Dropbox, OneDrive, Box) are supported. Just use the pipelines:// protocol - the link is stored in the file_transfer_handle field.
  • If you are going to fetch files from a secure location you will need to enter credentials. The three supported authentication methods are:
    • Basic
    • Digest
    • Quickbase User Token
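The three schemas mostly differ in the headers the fetch request carries. The helper below sketches those headers: Basic Authentication and the Quickbase User Token scheme (the Quickbase RESTful API documents the `QB-USER-TOKEN` authorization format and the `QB-Realm-Hostname` header). Digest requires a server challenge first, so it is omitted here; the function name and shape are assumptions for illustration.

```python
import base64

def auth_headers(schema, username="", password="", token="", realm_hostname=""):
    """Build request headers for a given authentication schema.
    Illustrative sketch, not the channel's internal implementation."""
    headers = {}
    if schema == "basic":
        creds = base64.b64encode(f"{username}:{password}".encode()).decode()
        headers["Authorization"] = f"Basic {creds}"
    elif schema == "quickbase-user-token":
        headers["Authorization"] = f"QB-USER-TOKEN {token}"
        if realm_hostname:  # needed when calling the Quickbase RESTful API
            headers["QB-Realm-Hostname"] = realm_hostname
    return headers
```

For example, `auth_headers("quickbase-user-token", token="b2...", realm_hostname="myrealm.quickbase.com")` produces the two headers a Quickbase RESTful API call expects.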

An important configuration option in this step is the Force Content Encoding field. Use this field when you are dealing with unknown CSV content encoding.

Available step fields

  • *Authentication Schema - Supports Basic and Digest Authentication and Quickbase User Token. Based on selected Auth Schema, username and password or token fields will be marked as required.
    Note: If you are going to use the Quickbase RESTful API (https://api.quickbase.com/v1/) in combination with the Quickbase User Token Authentication schema, you will need to add the QB-Realm-Hostname header and set it to your realm hostname.
  • Outgoing request's method type - Select the type of the request from the list of supported methods.
  • Headers - Use this field to add extra headers to the fetch request.
  • Request Body - Some API endpoints expect a specific request body, for example https://developer.quickbase.com/operation/runQuery
  • Username - Enter the username needed to access the file.
  • Password - Enter the password associated with the username entered.
  • Token - Enter your Quickbase User Token
  • *Disable SSL Certificate Validation - Disables SSL/TLS certificate validation. This option is useful if you are making a request to a service authenticated with a self-signed certificate. It is best practice to leave it off in all other cases.
  • **CSV URL - API endpoint or URL which points to a CSV file. The link should be publicly available or behind an Authentication schema which the channel supports.
  • Force Content Encoding - Force the encoding used to decode the CSV document.
    • If left blank, the Content-Type header is taken into account.
    • If the header is not present, utf-8 encoding is assumed.
    • If an encoding is selected, the channel attempts to decode the CSV response using that encoding.
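The Force Content Encoding fallback described above can be summarized as a small resolution function: a forced encoding wins, otherwise the charset from the Content-Type header, otherwise utf-8. This is an illustrative sketch of the documented behavior, not the channel's actual code.

```python
def resolve_encoding(content_type_header=None, forced_encoding=None):
    """Resolve which encoding to use when decoding a fetched CSV response,
    mirroring the Force Content Encoding fallback order (sketch only)."""
    if forced_encoding:                       # Force Content Encoding set
        return forced_encoding
    if content_type_header and "charset=" in content_type_header:
        return content_type_header.split("charset=")[-1].strip()
    return "utf-8"                            # header absent: assume utf-8
```

For example, `resolve_encoding("text/csv; charset=iso-8859-1")` yields `iso-8859-1`, while `resolve_encoding(None, "windows-1251")` yields the forced value.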

CSV Rows category

Iterate over CSV records (Query step)

Use this step to loop through each row of the CSV file, after fetching or receiving.

Available step fields

CSV Source - The target can point to a previous step, but it is limited to an Incoming CSV or Fetch CSV step.

Header line - Mark this field Yes or No to indicate whether the CSV file has a header row containing names that correspond to the fields in the other rows.

Include the CSV header as a row - This field works in conjunction with the Header line field: when Header line is marked Yes, it determines whether the header line is skipped or also processed as a data row.

Row separator - Select the character that is used to separate rows.

Row quote char - By default, the character is a " (double quote) for CSV files. To use a different character, select one of the options in the drop-down.

CSV header columns - Enter the names of each header, separated by the same character selected in Row separator.

Limit - Limit the records that will be processed. For example, if you have 100 records in the CSV file and you set the limit to 10, only the first 10 records will be processed.
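Putting the fields together, iteration over a CSV source behaves roughly like the generator below: respect the header settings, then yield one record per row up to the limit. The function and its parameter names loosely mirror the step fields and are an assumption for illustration, not the step's implementation.

```python
import csv
import io
import itertools

def iterate_csv(text, has_header=True, include_header_row=False,
                delimiter=",", quotechar='"', limit=None):
    """Yield one dict per CSV record, sketching the Iterate over CSV
    records options: Header line, Include the CSV header as a row,
    Row quote char, and Limit. Illustrative only."""
    rows = list(csv.reader(io.StringIO(text),
                           delimiter=delimiter, quotechar=quotechar))
    if has_header:
        header = rows[0]
        data = rows if include_header_row else rows[1:]
    else:
        header = [f"col{i}" for i in range(len(rows[0]))]
        data = rows
    for row in itertools.islice(data, limit):  # limit=None processes all rows
        yield dict(zip(header, row))
```

With a 4-row file and `limit=2`, only the first two data records are yielded, matching the Limit example above.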

Use Cases

These steps can be used, for example, to send emails in a structured format for non-tabular objects from sources like QuickBooks:

  • Specifying data types
    • If you know that the CSV contains a number and you need the subsequent steps to treat the data as a number, specify it while iterating, as in this example.

  • Specifying a different delimiter
    • If the file is not comma-separated, as in this example, see the following step example to change the delimiter.

  • Accessing CSV that is behind authentication
    • We support various authentication mechanisms. We make it easy for you to access Quickbase data sources with UserTokens.
  • Filter the dataset
    • Once you receive the CSV dataset, you can add additional logic based on the dataset easily.