CSV Handler details


Use the built-in CSV Handler in pipelines to fetch and process CSV files. The handler is extensible and lets you process files delimited by semicolons, tabs, or spaces in addition to commas. A typical comma-separated file looks like this:

Year,Make,Model,Description,Price
1997,Ford,E350,"ac, abs, moon",3000.00
1999,Chevy,"Venture ""Extended Edition""","",4900.00
1999,Chevy,"Venture ""Extended Edition, Very Large""","",5000.00
1996,Jeep,Grand Cherokee,"MUST SELL!
air, moon roof, loaded",4799.00
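As an illustration (Pipelines parses this for you), Python's standard csv module handles the same quoting rules shown above, including quoted commas, doubled quotes, and the embedded newline in the last record:

```python
import csv
import io

raw = '''Year,Make,Model,Description,Price
1997,Ford,E350,"ac, abs, moon",3000.00
1999,Chevy,"Venture ""Extended Edition""","",4900.00
1996,Jeep,Grand Cherokee,"MUST SELL!
air, moon roof, loaded",4799.00
'''

rows = list(csv.reader(io.StringIO(raw)))
# Header row plus three records; the embedded newline stays inside one field.
print(rows[1])     # ['1997', 'Ford', 'E350', 'ac, abs, moon', '3000.00']
print(rows[3][3])  # 'MUST SELL!\nair, moon roof, loaded'
```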

This is a built-in channel; it will not count towards the channel entitlement quota.

Limits

  • 0.95 MB maximum for the compressed CSV file (the remaining 0.05 MB is reserved for the Headers field)

  • 10-minute maximum fetch time

Channel details

There are some limitations when using the steps in this channel:

  • CSV values can be read as a String, Number, or Integer.

  • There is a 1 MB total capacity: a maximum of 0.95 MB for the compressed CSV file, plus 0.05 MB for the Headers, Query Params, and other fields.

  • A fetch is terminated after 10 minutes.
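A minimal pre-flight check against the size limit can be sketched like this, assuming the CSV is gzip-compressed before sending (the 0.95 MB figure comes from the limits above; the helper name is illustrative):

```python
import gzip

# 0.95 MB limit for the compressed CSV file (see the limits above).
MAX_COMPRESSED_BYTES = int(0.95 * 1024 * 1024)

def fits_limit(csv_text: str) -> bool:
    """Return True if the gzip-compressed CSV fits within the size limit."""
    compressed = gzip.compress(csv_text.encode("utf-8"))
    return len(compressed) <= MAX_COMPRESSED_BYTES

small = "Year,Make,Model\n1997,Ford,E350\n"
print(fits_limit(small))  # True
```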

CSV Sources category

Incoming CSV (Trigger step)

Triggers when you make an HTTP request (with a CSV payload) to an auto-generated webhook endpoint.

Incoming CSV step configuration panel showing webhook endpoint URL and HTTP method settings

You can protect the endpoint with an authentication schema (or leave it open), and receive requests sent with any of the HTTP verbs.

Notes:

  • Currently, the supported authentication methods are JWT Token and Basic Auth (set Username and Password).

  • Incoming requests can be filtered by request method on two levels. The first is the system level, applied before the trigger is even activated, and is set through the step's configuration options. The second is the design level, using Add conditions, which lets you branch the logic based on the request method or skip/drop the received request.
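As a sketch of what a caller sends, the request below posts a CSV payload with Basic Auth credentials. The endpoint URL and credentials are placeholders; the real webhook URL is auto-generated for your pipeline:

```python
import base64
import urllib.request

# Placeholder for the auto-generated webhook endpoint.
url = "https://example.com/webhooks/incoming-csv"
csv_body = b"Year,Make,Model\n1997,Ford,E350\n"

# Basic Auth header: base64("username:password")
token = base64.b64encode(b"username:password").decode("ascii")
req = urllib.request.Request(
    url,
    data=csv_body,
    method="POST",
    headers={
        "Content-Type": "text/csv",
        "Authorization": f"Basic {token}",
    },
)
# urllib.request.urlopen(req) would deliver the payload; omitted here.
print(req.get_header("Authorization"))  # Basic dXNlcm5hbWU6cGFzc3dvcmQ=
```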

JWT Auth

The Incoming CSV step supports authentication through JWT token.

JWT Token authentication option selected in the Incoming CSV step configuration

When the JWT Token option is selected, you need to provide a public key and select a signing algorithm.

Currently, we support the following signing algorithms: RS256, RS384, RS512, ES256, ES384, ES512.

Signing algorithm selection showing RS256, RS384, RS512, ES256, ES384, and ES512 options

The expected public key format is:

-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAu1SU1LfVLPHCozMxH2Mo
...
mwIDAQAB
-----END PUBLIC KEY-----

The pipeline gets triggered upon successful verification of the token. The output of the step includes the JWT payload in the form of an object that can be used in subsequent steps.
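For illustration, the JWT payload is the middle, base64url-encoded segment of the token. The sketch below extracts it without signature verification (which the step performs for you using the public key); the token contents are hypothetical:

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT."""
    _header, payload, _signature = token.split(".")
    padded = payload + "=" * (-len(payload) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

# A hypothetical token whose payload is {"sub": "user-1", "admin": true}
segment = base64.urlsafe_b64encode(
    json.dumps({"sub": "user-1", "admin": True}).encode()
).rstrip(b"=").decode()
token = f"eyJhbGciOiJSUzI1NiJ9.{segment}.signature"
print(jwt_payload(token))  # {'sub': 'user-1', 'admin': True}
```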

JWT payload output object displayed in the pipeline step results

If verification fails, the pipeline is NOT triggered; instead, a response with status code 401 (Unauthorized) is returned.

Fetch CSV (Action step)

Use this step to fetch a CSV file or resource from a URL.

Fetch CSV step configuration panel showing URL, authentication, and encoding fields

Notes:

  • Three authentication methods are supported: Basic Auth, Digest Auth, and a QB User Token.

  • File-based channels are also supported via the pipelines:// protocol (the link is stored in the file_transfer_url field).

  • An important configuration option is Force Content Encoding; use it when the CSV content encoding is unknown.

CSV Rows category

Iterate over CSV records (Query step)

Iterate over CSV records. Use this step after fetching or receiving a CSV file.

Iterate over CSV records step configuration panel showing field mapping options
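Conceptually, the iteration maps the header row onto each record, similar to Python's csv.DictReader (a sketch, not the step's actual implementation):

```python
import csv
import io

raw = "Year,Make,Model\n1997,Ford,E350\n1999,Chevy,Venture\n"

for record in csv.DictReader(io.StringIO(raw)):
    # Each record is a dict keyed by the header names.
    print(record["Make"], record["Model"])
# Ford E350
# Chevy Venture
```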

Use cases

This can be used, for example, to send emails in a structured format built from non-tabular objects coming from sources like QuickBooks.

Specifying data types

If you know that the CSV contains a number and you need subsequent steps to treat the data as a number, specify the type while iterating.

Iterate over CSV records step showing data type selection dropdown for a field

Pipeline step output showing CSV field values treated as specified numeric data type
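The effect of declaring a field's type can be sketched as a per-field cast applied while iterating (the field names and type map below are illustrative):

```python
import csv
import io

raw = "Year,Make,Price\n1997,Ford,3000.00\n"

# Declared types for specific fields; everything else stays a string.
FIELD_TYPES = {"Year": int, "Price": float}

def typed_records(text: str):
    for record in csv.DictReader(io.StringIO(text)):
        yield {k: FIELD_TYPES.get(k, str)(v) for k, v in record.items()}

row = next(typed_records(raw))
print(row)  # {'Year': 1997, 'Make': 'Ford', 'Price': 3000.0}
```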

Specify a different delimiter

If the file is not comma-separated, you can change the delimiter here.

Delimiter configuration in the Iterate over CSV records step showing option to change from comma to another separator

Delimiter field set to an alternative separator in the Iterate over CSV records step configuration
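For example, a semicolon- or tab-delimited file parses the same way once the delimiter is declared (a sketch using Python's csv module):

```python
import csv
import io

semicolon_raw = "Year;Make;Model\n1997;Ford;E350\n"
semi_rows = list(csv.reader(io.StringIO(semicolon_raw), delimiter=";"))
print(semi_rows[1])  # ['1997', 'Ford', 'E350']

tab_raw = "Year\tMake\tModel\n1999\tChevy\tVenture\n"
tab_rows = list(csv.reader(io.StringIO(tab_raw), delimiter="\t"))
print(tab_rows[1])  # ['1999', 'Chevy', 'Venture']
```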

Access CSV behind authentication

We support various authentication mechanisms. We make it easy for you to access Quickbase data sources with UserTokens.
Authentication configuration panel showing options for Quickbase UserToken and other authentication methods

Filter the dataset

Once you receive the CSV dataset, you can add additional logic based on the dataset easily.
Pipeline conditions panel showing filter logic added to process CSV dataset selectively
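A filter condition on the dataset can be sketched as a predicate applied per record (the Price threshold here is illustrative):

```python
import csv
import io

raw = """Year,Make,Price
1997,Ford,3000.00
1999,Chevy,4900.00
1996,Jeep,4799.00
"""

# Keep only records whose Price exceeds a threshold.
expensive = [
    r for r in csv.DictReader(io.StringIO(raw))
    if float(r["Price"]) > 4000
]
print([r["Make"] for r in expensive])  # ['Chevy', 'Jeep']
```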