[DEPRECATED] Create Pipeline

Deprecated
client.pipelines.create(params: PipelineCreateParams { account_id, destination, name, source }, options?: RequestOptions): PipelineCreateResponse { id, destination, endpoint, 3 more }
POST/accounts/{account_id}/pipelines

[DEPRECATED] Create a new pipeline. Use the new /pipelines/v1/pipelines endpoint instead.

Security
API Token

The preferred authorization scheme for interacting with the Cloudflare API. Create a token.

Example: Authorization: Bearer Sn3lZJTBX6kkg7OdcBUAxOO963GEIyGQqnFTOFYY
API Email + API Key

The previous authorization scheme for interacting with the Cloudflare API, used in conjunction with a Global API key.

Example: X-Auth-Email: user@example.com

The previous authorization scheme for interacting with the Cloudflare API. When possible, use API tokens instead of Global API keys.

Example: X-Auth-Key: 144c9defac04969c7bfad8efaa8ea194
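As a sketch, the two authorization schemes map to request headers as follows. The token, email, and key values are the placeholder examples from this page, not real credentials, and the helper functions are illustrative rather than part of the SDK:

```typescript
// Illustrative helpers: build the auth headers for each scheme.
function tokenHeaders(token: string): Record<string, string> {
  // Preferred scheme: a single bearer token.
  return { Authorization: `Bearer ${token}` };
}

function keyHeaders(email: string, key: string): Record<string, string> {
  // Legacy scheme: account email plus Global API key.
  return { 'X-Auth-Email': email, 'X-Auth-Key': key };
}

const viaToken = tokenHeaders('Sn3lZJTBX6kkg7OdcBUAxOO963GEIyGQqnFTOFYY');
const viaKey = keyHeaders('user@example.com', '144c9defac04969c7bfad8efaa8ea194');
```

When using the SDK, these headers are assembled for you from the client constructor options, so you normally never build them by hand.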
Accepted Permissions (at least one required)
Pipelines Write
Parameters
params: PipelineCreateParams { account_id, destination, name, source }
account_id: string

Path param: Specifies the public ID of the account.

destination: Destination

Body param

batch: Batch { max_bytes, max_duration_s, max_rows }
max_bytes?: number

Specifies the approximate maximum size of output files, in bytes.

maximum: 100000000
minimum: 1000
max_duration_s?: number

Specifies how long, in seconds, to wait while aggregating a batch before writing the file.

maximum: 300
minimum: 0.25
max_rows?: number

Specifies the approximate maximum number of rows per file.

maximum: 10000000
minimum: 100
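The batch limits above can be checked client-side before issuing a create request. A minimal sketch, assuming the documented ranges; the `validateBatch` helper is illustrative and not part of the SDK:

```typescript
interface Batch {
  max_bytes?: number;      // 1_000 – 100_000_000 bytes
  max_duration_s?: number; // 0.25 – 300 seconds
  max_rows?: number;       // 100 – 10_000_000 rows
}

// Illustrative helper: returns the list of constraint violations for a batch config.
function validateBatch(batch: Batch): string[] {
  const errors: string[] = [];
  const check = (name: string, value: number | undefined, min: number, max: number) => {
    // Unset fields are allowed; only out-of-range values are flagged.
    if (value !== undefined && (value < min || value > max)) {
      errors.push(`${name} must be between ${min} and ${max}`);
    }
  };
  check('max_bytes', batch.max_bytes, 1_000, 100_000_000);
  check('max_duration_s', batch.max_duration_s, 0.25, 300);
  check('max_rows', batch.max_rows, 100, 10_000_000);
  return errors;
}
```

For example, `validateBatch({ max_rows: 10 })` reports one violation, while an empty batch object (all fields optional) passes.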
compression: Compression { type }
type?: "none" | "gzip" | "deflate"

Specifies the desired compression algorithm and format.

One of the following:
"none"
"gzip"
"deflate"
credentials: Credentials { access_key_id, endpoint, secret_access_key }
access_key_id: string

Specifies the R2 Bucket Access Key Id.

endpoint: string

Specifies the R2 Endpoint.

secret_access_key: string

Specifies the R2 Bucket Secret Access Key.

format: "json"

Specifies the format of data to deliver.

path: Path { bucket, filename, filepath, prefix }
bucket: string

Specifies the R2 Bucket to store files.

filename?: string

Specifies the naming pattern for individual data files.

filepath?: string

Specifies the naming pattern for directories.

prefix?: string

Specifies the base directory within the bucket.

type: "r2"

Specifies the type of destination.

name: string

Body param: Defines the name of the pipeline.

maxLength: 128
minLength: 1
source: Array<CloudflarePipelinesWorkersPipelinesHTTPSource { format, type, authentication, cors } | CloudflarePipelinesWorkersPipelinesBindingSource { format, type } >

Body param

One of the following:
CloudflarePipelinesWorkersPipelinesHTTPSource { format, type, authentication, cors }

[DEPRECATED] HTTP source configuration. Use the new streams API instead.

format: "json"

Specifies the format of source data.

type: string
authentication?: boolean

Specifies whether authentication is required to send to this pipeline via HTTP.

cors?: CORS { origins }
origins?: Array<string>

Specifies the origins allowed to make cross-origin HTTP requests.

CloudflarePipelinesWorkersPipelinesBindingSource { format, type }

[DEPRECATED] Worker binding source configuration. Use the new streams API instead.

format: "json"

Specifies the format of source data.

type: string
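For reference, a `source` array combining both deprecated shapes might look like the following. The `type` values are assumptions for illustration, since the schema only constrains them to strings:

```typescript
// Illustrative HTTP source: requires authentication and restricts CORS origins.
const httpSource = {
  format: 'json' as const,
  type: 'http',                                // assumed value; schema says string
  authentication: true,                        // require auth for HTTP ingestion
  cors: { origins: ['https://example.com'] },  // allowed cross-origin callers
};

// Illustrative Worker binding source.
const bindingSource = {
  format: 'json' as const,
  type: 'binding',                             // assumed value; schema says string
};

const source = [httpSource, bindingSource];
```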
Returns
PipelineCreateResponse { id, destination, endpoint, 3 more }

[DEPRECATED] Describes the configuration of a pipeline. Use the new streams/sinks/pipelines API instead.

id: string

Specifies the pipeline identifier.

destination: Destination { batch, compression, format, 2 more }
batch: Batch { max_bytes, max_duration_s, max_rows }
max_bytes: number

Specifies the approximate maximum size of output files, in bytes.

maximum: 100000000
minimum: 1000
max_duration_s: number

Specifies how long, in seconds, to wait while aggregating a batch before writing the file.

maximum: 300
minimum: 0.25
max_rows: number

Specifies the approximate maximum number of rows per file.

maximum: 10000000
minimum: 100
compression: Compression { type }
type: "none" | "gzip" | "deflate"

Specifies the desired compression algorithm and format.

One of the following:
"none"
"gzip"
"deflate"
format: "json"

Specifies the format of data to deliver.

path: Path { bucket, filename, filepath, prefix }
bucket: string

Specifies the R2 Bucket to store files.

filename?: string

Specifies the naming pattern for individual data files.

filepath?: string

Specifies the naming pattern for directories.

prefix?: string

Specifies the base directory within the bucket.

type: "r2"

Specifies the type of destination.

endpoint: string

Indicates the endpoint URL to which traffic should be sent.

name: string

Defines the name of the pipeline.

maxLength: 128
minLength: 1
source: Array<CloudflarePipelinesWorkersPipelinesHTTPSource { format, type, authentication, cors } | CloudflarePipelinesWorkersPipelinesBindingSource { format, type } >
One of the following:
CloudflarePipelinesWorkersPipelinesHTTPSource { format, type, authentication, cors }

[DEPRECATED] HTTP source configuration. Use the new streams API instead.

format: "json"

Specifies the format of source data.

type: string
authentication?: boolean

Specifies whether authentication is required to send to this pipeline via HTTP.

cors?: CORS { origins }
origins?: Array<string>

Specifies the origins allowed to make cross-origin HTTP requests.

CloudflarePipelinesWorkersPipelinesBindingSource { format, type }

[DEPRECATED] Worker binding source configuration. Use the new streams API instead.

format: "json"

Specifies the format of source data.

type: string
version: number

Indicates the version number of the last saved configuration.

[DEPRECATED] Create Pipeline

import Cloudflare from 'cloudflare';

const client = new Cloudflare({
  apiToken: process.env['CLOUDFLARE_API_TOKEN'], // This is the default and can be omitted
});

const pipeline = await client.pipelines.create({
  account_id: '0123105f4ecef8ad9ca31a8372d0c353',
  destination: {
    batch: {},
    compression: {},
    credentials: {
      access_key_id: '<access key id>',
      endpoint: 'https://123f8a8258064ed892a347f173372359.r2.cloudflarestorage.com',
      secret_access_key: '<secret key>',
    },
    format: 'json',
    path: { bucket: 'bucket' },
    type: 'r2',
  },
  name: 'sample_pipeline',
  source: [{ format: 'json', type: 'type' }],
});

console.log(pipeline.id);
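Once created, records can be delivered to the returned `endpoint` over HTTP. A hedged sketch of assembling such a request; `buildIngestRequest` is illustrative (it builds the request but does not send it), and the endpoint URL is the one from the example response on this page:

```typescript
// Illustrative: assemble an HTTP ingestion request for a pipeline endpoint.
function buildIngestRequest(endpoint: string, records: object[]) {
  return {
    url: endpoint,
    method: 'POST' as const,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(records), // pipelines here accept JSON-formatted data
  };
}

const req = buildIngestRequest(
  'https://123f8a8258064ed892a347f173372359.pipelines.cloudflare.com',
  [{ event: 'page_view', ts: 1700000000 }],
);
```

If the source was created with `authentication: true`, the request would additionally need an Authorization header as described in the Security section above.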
Returns Examples
{
  "result": {
    "id": "123f8a8258064ed892a347f173372359",
    "destination": {
      "batch": {
        "max_bytes": 1000,
        "max_duration_s": 0.25,
        "max_rows": 100
      },
      "compression": {
        "type": "gzip"
      },
      "format": "json",
      "path": {
        "bucket": "bucket",
        "filename": "${slug}${extension}",
        "filepath": "${date}/${hour}",
        "prefix": "base"
      },
      "type": "r2"
    },
    "endpoint": "https://123f8a8258064ed892a347f173372359.pipelines.cloudflare.com",
    "name": "sample_pipeline",
    "source": [
      {
        "format": "json",
        "type": "type",
        "authentication": true,
        "cors": {
          "origins": [
            "*"
          ]
        }
      }
    ],
    "version": 2
  },
  "success": true
}