Choose a bulk data transfer method

This article helps you choose the right method for moving large sets of records in Pipelines. Each option supports different goals, data volumes, and design needs.

Import to Quickbase

Use the Import to Quickbase step when you want a streamlined way to load and transform large sets of data.

This method works well when you:

  • Need a simple pipeline design with fewer steps

  • Want clear visibility into when records are collected and committed

  • Need records to become available quickly during processing

  • Want structured error output for records that failed to import

  • Build pipelines that fit the Quick start pattern

Import to Quickbase commits records in batches during processing and preserves record order. It also returns failed records as an array, along with additional metadata, so you can design follow-up logic for retries or notifications.
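
For example, follow-up logic over that array might look like the Python sketch below. The item shape (a record payload plus an error message) and the retry_import and notify callbacks are assumptions made for illustration; they are not the step's actual output format or API.

    # Minimal sketch of follow-up logic for a failed-records array.
    # The keys "record" and "error" and both callbacks are hypothetical.
    def handle_failures(failed_records, retry_import, notify):
        retryable, permanent = [], []
        for item in failed_records:
            # Assumed shape: {"record": {...}, "error": "..."}
            if "timeout" in item["error"].lower():
                retryable.append(item["record"])
            else:
                permanent.append(item)
        if retryable:
            retry_import(retryable)   # re-submit transient failures
        if permanent:
            notify(permanent)         # alert someone about bad data

    # Example usage with stand-in callbacks:
    handle_failures(
        [{"record": {"Name": "Acme"}, "error": "Field type mismatch"}],
        retry_import=lambda records: print("retrying", len(records), "records"),
        notify=lambda items: print("notifying about", len(items), "failures"),
    )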

Learn more about Import to Quickbase

Bulk record sets

Use bulk record sets when you need more control over how each record is handled.

This method works well when you:

  • Process smaller data sets where performance differences are minimal

  • Need extra per-item handling or transformation using additional conditions or actions

Bulk record sets use a loop pattern, which lets you inspect and act on individual records. This flexibility can be useful when each record requires custom handling.
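
To make the pattern concrete, the Python sketch below visits each record in turn and applies a per-item condition and transformation. The sample records, the status check, and the surcharge rule are invented for illustration.

    # Conceptual sketch of the bulk record set loop pattern: every record
    # is inspected individually, so per-item rules are easy to express.
    records = [
        {"status": "open", "amount": 120.0},
        {"status": "closed", "amount": 45.0},
    ]

    for record in records:
        # Per-item condition: only act on open records
        if record["status"] != "open":
            continue
        # Per-item transformation: apply a surcharge before the next action
        record["amount"] = round(record["amount"] * 1.1, 2)
        print("processed", record)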

Learn more about bulk record sets

Copy records

Use the Copy records step when you want to move data quickly between apps in the same realm without transforming it.

This method works well when you:

  • Transfer large volumes of records within the same Quickbase realm

  • Need to map fields where the input data is compatible with the destination field types

  • Do not need field-level transformations

Copy records focuses on speed and efficiency for extract-and-load scenarios. It provides summary information about failures but does not return individual failed records.
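
This distinction shapes your error handling. As a rough Python sketch (the result shape here is illustrative, not the step's actual output), a summary-only result lets you alert on a failure count but leaves no individual rows to retry:

    # Illustrative summary-style result: counts only, no per-record detail.
    copy_summary = {"attempted": 10000, "copied": 9985, "failed": 15}

    # You can alert on the count, but you cannot retry specific rows,
    # because the failed records themselves are not returned.
    if copy_summary["failed"]:
        print(f"{copy_summary['failed']} of {copy_summary['attempted']} "
              "records failed to copy; investigate at the source.")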

Learn more about Copy records