# Pipelines

## [DEPRECATED] List Pipelines

`pipelines.list(**kwargs: PipelineListParams) -> PipelineListResponse`

**get** `/accounts/{account_id}/pipelines`

[DEPRECATED] List, filter, and paginate pipelines in an account. Use the new /pipelines/v1/pipelines endpoint instead.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `page: Optional[str]` Specifies which page to retrieve.
- `per_page: Optional[str]` Specifies the number of pipelines per page.
- `search: Optional[str]` Specifies the pipeline name prefix to search for.

### Returns

- `class PipelineListResponse: …`
  - `result_info: ResultInfo`
    - `count: float` Indicates the number of items on the current page.
    - `page: float` Indicates the current page number.
    - `per_page: float` Indicates the number of items per page.
    - `total_count: float` Indicates the total number of items.
  - `results: List[Result]`
    - `id: str` Specifies the pipeline identifier.
    - `destination: ResultDestination`
      - `batch: ResultDestinationBatch`
        - `max_bytes: int` Specifies the rough maximum size of files.
        - `max_duration_s: float` Specifies the duration to wait to aggregate batches of files.
        - `max_rows: int` Specifies the rough maximum number of rows per file.
      - `compression: ResultDestinationCompression`
        - `type: Literal["none", "gzip", "deflate"]` Specifies the desired compression algorithm and format.
          - `"none"`
          - `"gzip"`
          - `"deflate"`
      - `format: Literal["json"]` Specifies the format of data to deliver.
        - `"json"`
      - `path: ResultDestinationPath`
        - `bucket: str` Specifies the R2 bucket to store files in.
        - `filename: Optional[str]` Specifies the name pattern for individual data files.
        - `filepath: Optional[str]` Specifies the name pattern for the directory.
        - `prefix: Optional[str]` Specifies the base directory within the bucket.
      - `type: Literal["r2"]` Specifies the type of destination.
        - `"r2"`
    - `endpoint: str` Indicates the endpoint URL to send traffic to.
    - `name: str` Defines the name of the pipeline.
    - `source: List[ResultSource]`
      - `class ResultSourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
        - `format: Literal["json"]` Specifies the format of source data.
          - `"json"`
        - `type: str`
        - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
        - `cors: Optional[ResultSourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
          - `origins: Optional[List[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
      - `class ResultSourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
        - `format: Literal["json"]` Specifies the format of source data.
          - `"json"`
        - `type: str`
    - `version: float` Indicates the version number of the last saved configuration.
  - `success: bool` Indicates whether the API call was successful.

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
pipelines = client.pipelines.list(
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
print(pipelines.result_info)
```

#### Response

```json
{
  "result_info": {
    "count": 1,
    "page": 0,
    "per_page": 10,
    "total_count": 1
  },
  "results": [
    {
      "id": "123f8a8258064ed892a347f173372359",
      "destination": {
        "batch": {
          "max_bytes": 1000,
          "max_duration_s": 0.25,
          "max_rows": 100
        },
        "compression": {
          "type": "gzip"
        },
        "format": "json",
        "path": {
          "bucket": "bucket",
          "filename": "${slug}${extension}",
          "filepath": "${date}/${hour}",
          "prefix": "base"
        },
        "type": "r2"
      },
      "endpoint": "https://123f8a8258064ed892a347f173372359.pipelines.cloudflare.com",
      "name": "sample_pipeline",
      "source": [
        {
          "format": "json",
          "type": "type",
          "authentication": true,
          "cors": {
            "origins": ["*"]
          }
        }
      ],
      "version": 2
    }
  ],
  "success": true
}
```

## [DEPRECATED] Get Pipeline

`pipelines.get(pipeline_name: str, **kwargs: PipelineGetParams) -> PipelineGetResponse`

**get** `/accounts/{account_id}/pipelines/{pipeline_name}`

[DEPRECATED] Get the configuration of a pipeline. Use the new /pipelines/v1/pipelines endpoint instead.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `pipeline_name: str` Defines the name of the pipeline.

### Returns

- `class PipelineGetResponse: …` [DEPRECATED] Describes the configuration of a pipeline. Use the new streams/sinks/pipelines API instead.
  - `id: str` Specifies the pipeline identifier.
  - `destination: Destination`
    - `batch: DestinationBatch`
      - `max_bytes: int` Specifies the rough maximum size of files.
      - `max_duration_s: float` Specifies the duration to wait to aggregate batches of files.
      - `max_rows: int` Specifies the rough maximum number of rows per file.
    - `compression: DestinationCompression`
      - `type: Literal["none", "gzip", "deflate"]` Specifies the desired compression algorithm and format.
        - `"none"`
        - `"gzip"`
        - `"deflate"`
    - `format: Literal["json"]` Specifies the format of data to deliver.
      - `"json"`
    - `path: DestinationPath`
      - `bucket: str` Specifies the R2 bucket to store files in.
      - `filename: Optional[str]` Specifies the name pattern for individual data files.
      - `filepath: Optional[str]` Specifies the name pattern for the directory.
      - `prefix: Optional[str]` Specifies the base directory within the bucket.
    - `type: Literal["r2"]` Specifies the type of destination.
      - `"r2"`
  - `endpoint: str` Indicates the endpoint URL to send traffic to.
  - `name: str` Defines the name of the pipeline.
  - `source: List[Source]`
    - `class SourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
      - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
      - `cors: Optional[SourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
        - `origins: Optional[List[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
    - `class SourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
  - `version: float` Indicates the version number of the last saved configuration.

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
pipeline = client.pipelines.get(
    pipeline_name="sample_pipeline",
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
print(pipeline.id)
```

#### Response

```json
{
  "result": {
    "id": "123f8a8258064ed892a347f173372359",
    "destination": {
      "batch": {
        "max_bytes": 1000,
        "max_duration_s": 0.25,
        "max_rows": 100
      },
      "compression": {
        "type": "gzip"
      },
      "format": "json",
      "path": {
        "bucket": "bucket",
        "filename": "${slug}${extension}",
        "filepath": "${date}/${hour}",
        "prefix": "base"
      },
      "type": "r2"
    },
    "endpoint": "https://123f8a8258064ed892a347f173372359.pipelines.cloudflare.com",
    "name": "sample_pipeline",
    "source": [
      {
        "format": "json",
        "type": "type",
        "authentication": true,
        "cors": {
          "origins": ["*"]
        }
      }
    ],
    "version": 2
  },
  "success": true
}
```

## [DEPRECATED] Create Pipeline

`pipelines.create(**kwargs: PipelineCreateParams) -> PipelineCreateResponse`

**post** `/accounts/{account_id}/pipelines`

[DEPRECATED] Create a new pipeline. Use the new /pipelines/v1/pipelines endpoint instead.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `destination: Destination`
  - `batch: DestinationBatch`
    - `max_bytes: Optional[int]` Specifies the rough maximum size of files.
    - `max_duration_s: Optional[float]` Specifies the duration to wait to aggregate batches of files.
    - `max_rows: Optional[int]` Specifies the rough maximum number of rows per file.
  - `compression: DestinationCompression`
    - `type: Optional[Literal["none", "gzip", "deflate"]]` Specifies the desired compression algorithm and format.
      - `"none"`
      - `"gzip"`
      - `"deflate"`
  - `credentials: DestinationCredentials`
    - `access_key_id: str` Specifies the R2 bucket access key ID.
    - `endpoint: str` Specifies the R2 endpoint.
    - `secret_access_key: str` Specifies the R2 bucket secret access key.
  - `format: Literal["json"]` Specifies the format of data to deliver.
    - `"json"`
  - `path: DestinationPath`
    - `bucket: str` Specifies the R2 bucket to store files in.
    - `filename: Optional[str]` Specifies the name pattern for individual data files.
    - `filepath: Optional[str]` Specifies the name pattern for the directory.
    - `prefix: Optional[str]` Specifies the base directory within the bucket.
  - `type: Literal["r2"]` Specifies the type of destination.
    - `"r2"`
- `name: str` Defines the name of the pipeline.
- `source: Iterable[Source]`
  - `class SourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
    - `format: Literal["json"]` Specifies the format of source data.
      - `"json"`
    - `type: str`
    - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
    - `cors: Optional[SourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
      - `origins: Optional[SequenceNotStr[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
  - `class SourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
    - `format: Literal["json"]` Specifies the format of source data.
      - `"json"`
    - `type: str`

### Returns

- `class PipelineCreateResponse: …` [DEPRECATED] Describes the configuration of a pipeline. Use the new streams/sinks/pipelines API instead.
  - `id: str` Specifies the pipeline identifier.
  - `destination: Destination`
    - `batch: DestinationBatch`
      - `max_bytes: int` Specifies the rough maximum size of files.
      - `max_duration_s: float` Specifies the duration to wait to aggregate batches of files.
      - `max_rows: int` Specifies the rough maximum number of rows per file.
    - `compression: DestinationCompression`
      - `type: Literal["none", "gzip", "deflate"]` Specifies the desired compression algorithm and format.
        - `"none"`
        - `"gzip"`
        - `"deflate"`
    - `format: Literal["json"]` Specifies the format of data to deliver.
      - `"json"`
    - `path: DestinationPath`
      - `bucket: str` Specifies the R2 bucket to store files in.
      - `filename: Optional[str]` Specifies the name pattern for individual data files.
      - `filepath: Optional[str]` Specifies the name pattern for the directory.
      - `prefix: Optional[str]` Specifies the base directory within the bucket.
    - `type: Literal["r2"]` Specifies the type of destination.
      - `"r2"`
  - `endpoint: str` Indicates the endpoint URL to send traffic to.
  - `name: str` Defines the name of the pipeline.
  - `source: List[Source]`
    - `class SourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
      - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
      - `cors: Optional[SourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
        - `origins: Optional[List[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
    - `class SourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
  - `version: float` Indicates the version number of the last saved configuration.
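The three batch limits (`max_bytes`, `max_duration_s`, `max_rows`) act together: a batch is flushed to the destination as soon as any one of them is reached. A minimal sketch of that whichever-comes-first behavior, where the `Batcher` class and its flush logic are illustrative assumptions, not part of the SDK:

```python
import time

class Batcher:
    """Illustrative whichever-limit-first batching, mirroring the
    max_bytes / max_duration_s / max_rows destination settings."""

    def __init__(self, max_bytes=1000, max_duration_s=0.25, max_rows=100):
        self.max_bytes = max_bytes
        self.max_duration_s = max_duration_s
        self.max_rows = max_rows
        self.rows, self.size, self.started = 0, 0, time.monotonic()

    def add(self, record: bytes) -> bool:
        """Record one event; return True once any limit says 'flush now'."""
        self.rows += 1
        self.size += len(record)
        return (
            self.size >= self.max_bytes
            or self.rows >= self.max_rows
            or time.monotonic() - self.started >= self.max_duration_s
        )

batcher = Batcher(max_bytes=1000, max_duration_s=0.25, max_rows=100)
# A single 1 KB record trips the size limit immediately.
print(batcher.add(b"x" * 1000))  # True
```

The limits are deliberately described as "rough" maximums above: the service flushes when a limit is crossed, so a file can slightly exceed `max_bytes` or `max_rows`.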
### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
pipeline = client.pipelines.create(
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
    destination={
        "batch": {},
        "compression": {},
        "credentials": {
            "access_key_id": "",
            "endpoint": "https://123f8a8258064ed892a347f173372359.r2.cloudflarestorage.com",
            "secret_access_key": "",
        },
        "format": "json",
        "path": {"bucket": "bucket"},
        "type": "r2",
    },
    name="sample_pipeline",
    source=[
        {
            "format": "json",
            "type": "type",
        }
    ],
)
print(pipeline.id)
```

#### Response

```json
{
  "result": {
    "id": "123f8a8258064ed892a347f173372359",
    "destination": {
      "batch": {
        "max_bytes": 1000,
        "max_duration_s": 0.25,
        "max_rows": 100
      },
      "compression": {
        "type": "gzip"
      },
      "format": "json",
      "path": {
        "bucket": "bucket",
        "filename": "${slug}${extension}",
        "filepath": "${date}/${hour}",
        "prefix": "base"
      },
      "type": "r2"
    },
    "endpoint": "https://123f8a8258064ed892a347f173372359.pipelines.cloudflare.com",
    "name": "sample_pipeline",
    "source": [
      {
        "format": "json",
        "type": "type",
        "authentication": true,
        "cors": {
          "origins": ["*"]
        }
      }
    ],
    "version": 2
  },
  "success": true
}
```

## [DEPRECATED] Update Pipeline

`pipelines.update(pipeline_name: str, **kwargs: PipelineUpdateParams) -> PipelineUpdateResponse`

**put** `/accounts/{account_id}/pipelines/{pipeline_name}`

[DEPRECATED] Update an existing pipeline. Use the new /pipelines/v1/pipelines endpoint instead.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `pipeline_name: str` Defines the name of the pipeline.
- `destination: Destination`
  - `batch: DestinationBatch`
    - `max_bytes: Optional[int]` Specifies the rough maximum size of files.
    - `max_duration_s: Optional[float]` Specifies the duration to wait to aggregate batches of files.
    - `max_rows: Optional[int]` Specifies the rough maximum number of rows per file.
  - `compression: DestinationCompression`
    - `type: Optional[Literal["none", "gzip", "deflate"]]` Specifies the desired compression algorithm and format.
      - `"none"`
      - `"gzip"`
      - `"deflate"`
  - `format: Literal["json"]` Specifies the format of data to deliver.
    - `"json"`
  - `path: DestinationPath`
    - `bucket: str` Specifies the R2 bucket to store files in.
    - `filename: Optional[str]` Specifies the name pattern for individual data files.
    - `filepath: Optional[str]` Specifies the name pattern for the directory.
    - `prefix: Optional[str]` Specifies the base directory within the bucket.
  - `type: Literal["r2"]` Specifies the type of destination.
    - `"r2"`
  - `credentials: Optional[DestinationCredentials]`
    - `access_key_id: str` Specifies the R2 bucket access key ID.
    - `endpoint: str` Specifies the R2 endpoint.
    - `secret_access_key: str` Specifies the R2 bucket secret access key.
- `name: str` Defines the name of the pipeline.
- `source: Iterable[Source]`
  - `class SourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
    - `format: Literal["json"]` Specifies the format of source data.
      - `"json"`
    - `type: str`
    - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
    - `cors: Optional[SourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
      - `origins: Optional[SequenceNotStr[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
  - `class SourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
    - `format: Literal["json"]` Specifies the format of source data.
      - `"json"`
    - `type: str`

### Returns

- `class PipelineUpdateResponse: …` [DEPRECATED] Describes the configuration of a pipeline. Use the new streams/sinks/pipelines API instead.
  - `id: str` Specifies the pipeline identifier.
  - `destination: Destination`
    - `batch: DestinationBatch`
      - `max_bytes: int` Specifies the rough maximum size of files.
      - `max_duration_s: float` Specifies the duration to wait to aggregate batches of files.
      - `max_rows: int` Specifies the rough maximum number of rows per file.
    - `compression: DestinationCompression`
      - `type: Literal["none", "gzip", "deflate"]` Specifies the desired compression algorithm and format.
        - `"none"`
        - `"gzip"`
        - `"deflate"`
    - `format: Literal["json"]` Specifies the format of data to deliver.
      - `"json"`
    - `path: DestinationPath`
      - `bucket: str` Specifies the R2 bucket to store files in.
      - `filename: Optional[str]` Specifies the name pattern for individual data files.
      - `filepath: Optional[str]` Specifies the name pattern for the directory.
      - `prefix: Optional[str]` Specifies the base directory within the bucket.
    - `type: Literal["r2"]` Specifies the type of destination.
      - `"r2"`
  - `endpoint: str` Indicates the endpoint URL to send traffic to.
  - `name: str` Defines the name of the pipeline.
  - `source: List[Source]`
    - `class SourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
      - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
      - `cors: Optional[SourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
        - `origins: Optional[List[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
    - `class SourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
  - `version: float` Indicates the version number of the last saved configuration.
### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
pipeline = client.pipelines.update(
    pipeline_name="sample_pipeline",
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
    destination={
        "batch": {},
        "compression": {},
        "format": "json",
        "path": {"bucket": "bucket"},
        "type": "r2",
    },
    name="sample_pipeline",
    source=[
        {
            "format": "json",
            "type": "type",
        }
    ],
)
print(pipeline.id)
```

#### Response

```json
{
  "result": {
    "id": "123f8a8258064ed892a347f173372359",
    "destination": {
      "batch": {
        "max_bytes": 1000,
        "max_duration_s": 0.25,
        "max_rows": 100
      },
      "compression": {
        "type": "gzip"
      },
      "format": "json",
      "path": {
        "bucket": "bucket",
        "filename": "${slug}${extension}",
        "filepath": "${date}/${hour}",
        "prefix": "base"
      },
      "type": "r2"
    },
    "endpoint": "https://123f8a8258064ed892a347f173372359.pipelines.cloudflare.com",
    "name": "sample_pipeline",
    "source": [
      {
        "format": "json",
        "type": "type",
        "authentication": true,
        "cors": {
          "origins": ["*"]
        }
      }
    ],
    "version": 2
  },
  "success": true
}
```

## [DEPRECATED] Delete Pipeline

`pipelines.delete(pipeline_name: str, **kwargs: PipelineDeleteParams)`

**delete** `/accounts/{account_id}/pipelines/{pipeline_name}`

[DEPRECATED] Delete a pipeline. Use the new /pipelines/v1/pipelines endpoint instead.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `pipeline_name: str` Defines the name of the pipeline.

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
client.pipelines.delete(
    pipeline_name="sample_pipeline",
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
```

## List Pipelines

`pipelines.list_v1(**kwargs: PipelineListV1Params) -> SyncV4PagePaginationArray[PipelineListV1Response]`

**get** `/accounts/{account_id}/pipelines/v1/pipelines`

List and filter pipelines in an account.
### Parameters

- `account_id: str` Specifies the public ID of the account.
- `page: Optional[float]`
- `per_page: Optional[float]`

### Returns

- `class PipelineListV1Response: …`
  - `id: str` Indicates a unique identifier for this pipeline.
  - `created_at: str`
  - `modified_at: str`
  - `name: str` Indicates the name of the pipeline.
  - `sql: str` Specifies the SQL for the pipeline's processing flow.
  - `status: str` Indicates the current status of the pipeline.

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
page = client.pipelines.list_v1(
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
page = page.result[0]
print(page.id)
```

#### Response

```json
{
  "result": [
    {
      "id": "01234567890123457689012345678901",
      "created_at": "created_at",
      "modified_at": "modified_at",
      "name": "my_pipeline",
      "sql": "insert into sink select * from source;",
      "status": "status"
    }
  ],
  "result_info": {
    "count": 1,
    "page": 0,
    "per_page": 10,
    "total_count": 1
  },
  "success": true
}
```

## Get Pipeline Details

`pipelines.get_v1(pipeline_id: str, **kwargs: PipelineGetV1Params) -> PipelineGetV1Response`

**get** `/accounts/{account_id}/pipelines/v1/pipelines/{pipeline_id}`

Get the details of a pipeline.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `pipeline_id: str` Specifies the public ID of the pipeline.

### Returns

- `class PipelineGetV1Response: …`
  - `id: str` Indicates a unique identifier for this pipeline.
  - `created_at: str`
  - `modified_at: str`
  - `name: str` Indicates the name of the pipeline.
  - `sql: str` Specifies the SQL for the pipeline's processing flow.
  - `status: str` Indicates the current status of the pipeline.
  - `tables: List[Table]` Lists the streams and sinks used by this pipeline.
    - `id: str` Unique identifier for the connection (stream or sink).
    - `latest: int` Latest available version of the connection.
    - `name: str` Name of the connection.
    - `type: Literal["stream", "sink"]` Type of the connection.
      - `"stream"`
      - `"sink"`
    - `version: int` Current version of the connection used by this pipeline.
  - `failure_reason: Optional[str]` Indicates the reason for the pipeline's failure.

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
response = client.pipelines.get_v1(
    pipeline_id="043e105f4ecef8ad9ca31a8372d0c353",
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
print(response.id)
```

#### Response

```json
{
  "result": {
    "id": "01234567890123457689012345678901",
    "created_at": "created_at",
    "modified_at": "modified_at",
    "name": "my_pipeline",
    "sql": "insert into sink select * from source;",
    "status": "status",
    "tables": [
      {
        "id": "1c9200d5872c018bb34e93e2cd8a438e",
        "latest": 5,
        "name": "my_table",
        "type": "stream",
        "version": 4
      }
    ],
    "failure_reason": "failure_reason"
  },
  "success": true
}
```

## Create Pipeline

`pipelines.create_v1(**kwargs: PipelineCreateV1Params) -> PipelineCreateV1Response`

**post** `/accounts/{account_id}/pipelines/v1/pipelines`

Create a new pipeline.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `name: str` Specifies the name of the pipeline.
- `sql: str` Specifies the SQL for the pipeline's processing flow.

### Returns

- `class PipelineCreateV1Response: …`
  - `id: str` Indicates a unique identifier for this pipeline.
  - `created_at: str`
  - `modified_at: str`
  - `name: str` Indicates the name of the pipeline.
  - `sql: str` Specifies the SQL for the pipeline's processing flow.
  - `status: str` Indicates the current status of the pipeline.
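Because `validate_sql` (documented below) checks the same SQL that `create_v1` accepts, one natural pattern is to validate before creating. A minimal sketch, where the `create_validated` helper is a hypothetical wrapper of this document's two SDK calls, not part of the SDK itself:

```python
def create_validated(client, account_id: str, name: str, sql: str):
    """Validate the pipeline SQL first; only create the pipeline if
    validation resolved the tables it references.
    `client` is assumed to be a cloudflare.Cloudflare instance."""
    check = client.pipelines.validate_sql(account_id=account_id, sql=sql)
    if not check.tables:
        # No streams/sinks were resolved from the SQL; creating would fail.
        raise ValueError("SQL did not validate against any known tables")
    return client.pipelines.create_v1(account_id=account_id, name=name, sql=sql)
```

Usage would mirror the examples in this document, e.g. `create_validated(client, "0123105f4ecef8ad9ca31a8372d0c353", "my_pipeline", "insert into sink select * from source;")`.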
### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
response = client.pipelines.create_v1(
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
    name="my_pipeline",
    sql="insert into sink select * from source;",
)
print(response.id)
```

#### Response

```json
{
  "result": {
    "id": "01234567890123457689012345678901",
    "created_at": "created_at",
    "modified_at": "modified_at",
    "name": "my_pipeline",
    "sql": "insert into sink select * from source;",
    "status": "status"
  },
  "success": true
}
```

## Delete Pipeline

`pipelines.delete_v1(pipeline_id: str, **kwargs: PipelineDeleteV1Params)`

**delete** `/accounts/{account_id}/pipelines/v1/pipelines/{pipeline_id}`

Delete a pipeline in an account.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `pipeline_id: str` Specifies the public ID of the pipeline.

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
client.pipelines.delete_v1(
    pipeline_id="043e105f4ecef8ad9ca31a8372d0c353",
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
```

## Validate SQL

`pipelines.validate_sql(**kwargs: PipelineValidateSqlParams) -> PipelineValidateSqlResponse`

**post** `/accounts/{account_id}/pipelines/v1/validate_sql`

Validate Arroyo SQL.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `sql: str` Specifies the SQL to validate.

### Returns

- `class PipelineValidateSqlResponse: …`
  - `tables: Dict[str, Tables]` Indicates the tables involved in the processing.
    - `id: str`
    - `name: str`
    - `type: str`
    - `version: float`
  - `graph: Optional[Graph]`
    - `edges: List[GraphEdge]`
      - `dest_id: int`
      - `edge_type: str`
      - `key_type: str`
      - `src_id: int`
      - `value_type: str`
    - `nodes: List[GraphNode]`
      - `description: str`
      - `node_id: int`
      - `operator: str`
      - `parallelism: int`

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
response = client.pipelines.validate_sql(
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
    sql="insert into sink select * from source;",
)
print(response.tables)
```

#### Response

```json
{
  "result": {
    "tables": {
      "foo": {
        "id": "id",
        "name": "name",
        "type": "type",
        "version": 0
      }
    },
    "graph": {
      "edges": [
        {
          "dest_id": 0,
          "edge_type": "edge_type",
          "key_type": "key_type",
          "src_id": 0,
          "value_type": "value_type"
        }
      ],
      "nodes": [
        {
          "description": "description",
          "node_id": 0,
          "operator": "operator",
          "parallelism": 0
        }
      ]
    }
  },
  "success": true
}
```

## Domain Types

### Pipeline List Response

- `class PipelineListResponse: …`
  - `result_info: ResultInfo`
    - `count: float` Indicates the number of items on the current page.
    - `page: float` Indicates the current page number.
    - `per_page: float` Indicates the number of items per page.
    - `total_count: float` Indicates the total number of items.
  - `results: List[Result]`
    - `id: str` Specifies the pipeline identifier.
    - `destination: ResultDestination`
      - `batch: ResultDestinationBatch`
        - `max_bytes: int` Specifies the rough maximum size of files.
        - `max_duration_s: float` Specifies the duration to wait to aggregate batches of files.
        - `max_rows: int` Specifies the rough maximum number of rows per file.
      - `compression: ResultDestinationCompression`
        - `type: Literal["none", "gzip", "deflate"]` Specifies the desired compression algorithm and format.
          - `"none"`
          - `"gzip"`
          - `"deflate"`
      - `format: Literal["json"]` Specifies the format of data to deliver.
        - `"json"`
      - `path: ResultDestinationPath`
        - `bucket: str` Specifies the R2 bucket to store files in.
        - `filename: Optional[str]` Specifies the name pattern for individual data files.
        - `filepath: Optional[str]` Specifies the name pattern for the directory.
        - `prefix: Optional[str]` Specifies the base directory within the bucket.
      - `type: Literal["r2"]` Specifies the type of destination.
        - `"r2"`
    - `endpoint: str` Indicates the endpoint URL to send traffic to.
    - `name: str` Defines the name of the pipeline.
    - `source: List[ResultSource]`
      - `class ResultSourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
        - `format: Literal["json"]` Specifies the format of source data.
          - `"json"`
        - `type: str`
        - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
        - `cors: Optional[ResultSourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
          - `origins: Optional[List[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
      - `class ResultSourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
        - `format: Literal["json"]` Specifies the format of source data.
          - `"json"`
        - `type: str`
    - `version: float` Indicates the version number of the last saved configuration.
  - `success: bool` Indicates whether the API call was successful.

### Pipeline Get Response

- `class PipelineGetResponse: …` [DEPRECATED] Describes the configuration of a pipeline. Use the new streams/sinks/pipelines API instead.
  - `id: str` Specifies the pipeline identifier.
  - `destination: Destination`
    - `batch: DestinationBatch`
      - `max_bytes: int` Specifies the rough maximum size of files.
      - `max_duration_s: float` Specifies the duration to wait to aggregate batches of files.
      - `max_rows: int` Specifies the rough maximum number of rows per file.
    - `compression: DestinationCompression`
      - `type: Literal["none", "gzip", "deflate"]` Specifies the desired compression algorithm and format.
        - `"none"`
        - `"gzip"`
        - `"deflate"`
    - `format: Literal["json"]` Specifies the format of data to deliver.
      - `"json"`
    - `path: DestinationPath`
      - `bucket: str` Specifies the R2 bucket to store files in.
      - `filename: Optional[str]` Specifies the name pattern for individual data files.
      - `filepath: Optional[str]` Specifies the name pattern for the directory.
      - `prefix: Optional[str]` Specifies the base directory within the bucket.
    - `type: Literal["r2"]` Specifies the type of destination.
      - `"r2"`
  - `endpoint: str` Indicates the endpoint URL to send traffic to.
  - `name: str` Defines the name of the pipeline.
  - `source: List[Source]`
    - `class SourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
      - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
      - `cors: Optional[SourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
        - `origins: Optional[List[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
    - `class SourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
  - `version: float` Indicates the version number of the last saved configuration.

### Pipeline Create Response

- `class PipelineCreateResponse: …` [DEPRECATED] Describes the configuration of a pipeline. Use the new streams/sinks/pipelines API instead.
  - `id: str` Specifies the pipeline identifier.
  - `destination: Destination`
    - `batch: DestinationBatch`
      - `max_bytes: int` Specifies the rough maximum size of files.
      - `max_duration_s: float` Specifies the duration to wait to aggregate batches of files.
      - `max_rows: int` Specifies the rough maximum number of rows per file.
    - `compression: DestinationCompression`
      - `type: Literal["none", "gzip", "deflate"]` Specifies the desired compression algorithm and format.
        - `"none"`
        - `"gzip"`
        - `"deflate"`
    - `format: Literal["json"]` Specifies the format of data to deliver.
      - `"json"`
    - `path: DestinationPath`
      - `bucket: str` Specifies the R2 bucket to store files in.
      - `filename: Optional[str]` Specifies the name pattern for individual data files.
      - `filepath: Optional[str]` Specifies the name pattern for the directory.
      - `prefix: Optional[str]` Specifies the base directory within the bucket.
    - `type: Literal["r2"]` Specifies the type of destination.
      - `"r2"`
  - `endpoint: str` Indicates the endpoint URL to send traffic to.
  - `name: str` Defines the name of the pipeline.
  - `source: List[Source]`
    - `class SourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
      - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP.
      - `cors: Optional[SourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]`
        - `origins: Optional[List[str]]` Specifies the origins allowed to make cross-origin HTTP requests.
    - `class SourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead.
      - `format: Literal["json"]` Specifies the format of source data.
        - `"json"`
      - `type: str`
  - `version: float` Indicates the version number of the last saved configuration.

### Pipeline Update Response

- `class PipelineUpdateResponse: …` [DEPRECATED] Describes the configuration of a pipeline. Use the new streams/sinks/pipelines API instead.
  - `id: str` Specifies the pipeline identifier.
- `destination: Destination` - `batch: DestinationBatch` - `max_bytes: int` Specifies rough maximum size of files. - `max_duration_s: float` Specifies the duration to wait to aggregate batched files. - `max_rows: int` Specifies rough maximum number of rows per file. - `compression: DestinationCompression` - `type: Literal["none", "gzip", "deflate"]` Specifies the desired compression algorithm and format. - `"none"` - `"gzip"` - `"deflate"` - `format: Literal["json"]` Specifies the format of data to deliver. - `"json"` - `path: DestinationPath` - `bucket: str` Specifies the R2 Bucket to store files. - `filename: Optional[str]` Specifies the name pattern for individual data files. - `filepath: Optional[str]` Specifies the name pattern for directory. - `prefix: Optional[str]` Specifies the base directory within the bucket. - `type: Literal["r2"]` Specifies the type of destination. - `"r2"` - `endpoint: str` Indicates the endpoint URL to send traffic. - `name: str` Defines the name of the pipeline. - `source: List[Source]` - `class SourceCloudflarePipelinesWorkersPipelinesHTTPSource: …` [DEPRECATED] HTTP source configuration. Use the new streams API instead. - `format: Literal["json"]` Specifies the format of source data. - `"json"` - `type: str` - `authentication: Optional[bool]` Specifies whether authentication is required to send to this pipeline via HTTP. - `cors: Optional[SourceCloudflarePipelinesWorkersPipelinesHTTPSourceCORS]` - `origins: Optional[List[str]]` Specifies allowed origins to allow Cross Origin HTTP Requests. - `class SourceCloudflarePipelinesWorkersPipelinesBindingSource: …` [DEPRECATED] Worker binding source configuration. Use the new streams API instead. - `format: Literal["json"]` Specifies the format of source data. - `"json"` - `type: str` - `version: float` Indicates the version number of the last saved configuration.

### Pipeline List V1 Response

- `class PipelineListV1Response: …` - `id: str` Indicates a unique identifier for this pipeline.
- `created_at: str` - `modified_at: str` - `name: str` Indicates the name of the Pipeline. - `sql: str` Specifies SQL for the Pipeline processing flow. - `status: str` Indicates the current status of the Pipeline. ### Pipeline Get V1 Response - `class PipelineGetV1Response: …` - `id: str` Indicates a unique identifier for this pipeline. - `created_at: str` - `modified_at: str` - `name: str` Indicates the name of the Pipeline. - `sql: str` Specifies SQL for the Pipeline processing flow. - `status: str` Indicates the current status of the Pipeline. - `tables: List[Table]` List of streams and sinks used by this pipeline. - `id: str` Unique identifier for the connection (stream or sink). - `latest: int` Latest available version of the connection. - `name: str` Name of the connection. - `type: Literal["stream", "sink"]` Type of the connection. - `"stream"` - `"sink"` - `version: int` Current version of the connection used by this pipeline. - `failure_reason: Optional[str]` Indicates the reason for the failure of the Pipeline. ### Pipeline Create V1 Response - `class PipelineCreateV1Response: …` - `id: str` Indicates a unique identifier for this pipeline. - `created_at: str` - `modified_at: str` - `name: str` Indicates the name of the Pipeline. - `sql: str` Specifies SQL for the Pipeline processing flow. - `status: str` Indicates the current status of the Pipeline. ### Pipeline Validate Sql Response - `class PipelineValidateSqlResponse: …` - `tables: Dict[str, Tables]` Indicates tables involved in the processing. 
- `id: str` - `name: str` - `type: str` - `version: float` - `graph: Optional[Graph]` - `edges: List[GraphEdge]` - `dest_id: int` - `edge_type: str` - `key_type: str` - `src_id: int` - `value_type: str` - `nodes: List[GraphNode]` - `description: str` - `node_id: int` - `operator: str` - `parallelism: int`

# Sinks

## List Sinks

`pipelines.sinks.list(**kwargs: SinkListParams) -> SyncV4PagePaginationArray[SinkListResponse]` **get** `/accounts/{account_id}/pipelines/v1/sinks` List/Filter Sinks in Account.

### Parameters

- `account_id: str` Specifies the public ID of the account. - `page: Optional[float]` - `per_page: Optional[float]` - `pipeline_id: Optional[str]`

### Returns

- `class SinkListResponse: …` - `id: str` Indicates a unique identifier for this sink. - `created_at: datetime` - `modified_at: datetime` - `name: str` Defines the name of the Sink. - `type: Literal["r2", "r2_data_catalog"]` Specifies the type of sink. - `"r2"` - `"r2_data_catalog"` - `config: Optional[Config]` Defines the configuration of the R2 Sink. - `class ConfigCloudflarePipelinesR2TablePublic: …` R2 Sink public configuration. - `account_id: str` Cloudflare Account ID for the bucket - `bucket: str` R2 Bucket to write to - `file_naming: Optional[ConfigCloudflarePipelinesR2TablePublicFileNaming]` Controls filename prefix/suffix and strategy. - `prefix: Optional[str]` The prefix to use in the file name, e.g. `prefix-.parquet` - `strategy: Optional[Literal["serial", "uuid", "uuid_v7", "ulid"]]` Filename generation strategy. - `"serial"` - `"uuid"` - `"uuid_v7"` - `"ulid"` - `suffix: Optional[str]` This will overwrite the default file suffix, e.g. `.parquet`; use with caution - `jurisdiction: Optional[str]` Jurisdiction this bucket is hosted in - `partitioning: Optional[ConfigCloudflarePipelinesR2TablePublicPartitioning]` Data-layout partitioning for sinks.
- `time_pattern: Optional[str]` The pattern of the date string - `path: Optional[str]` Subpath within the bucket to write to - `rolling_policy: Optional[ConfigCloudflarePipelinesR2TablePublicRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). - `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `class ConfigCloudflarePipelinesR2DataCatalogTablePublic: …` R2 Data Catalog Sink public configuration. - `account_id: str` Cloudflare Account ID - `bucket: str` The R2 Bucket that hosts this catalog - `table_name: str` Table name - `namespace: Optional[str]` Table namespace - `rolling_policy: Optional[ConfigCloudflarePipelinesR2DataCatalogTablePublicRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). 
- `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[List[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - 
`required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - `"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `inferred: Optional[bool]` ### Example ```python import os from cloudflare import Cloudflare client = Cloudflare( api_token=os.environ.get("CLOUDFLARE_API_TOKEN"), # This is the default and can be omitted ) page = client.pipelines.sinks.list( account_id="0123105f4ecef8ad9ca31a8372d0c353", ) page = page.result[0] print(page.id) ``` #### Response ```json { "result": [ { "id": "01234567890123457689012345678901", "created_at": "2019-12-27T18:11:19.117Z", "modified_at": "2019-12-27T18:11:19.117Z", "name": "my_sink", "type": "r2", "config": { "account_id": "account_id", "bucket": 
"bucket", "file_naming": { "prefix": "prefix", "strategy": "serial", "suffix": "suffix" }, "jurisdiction": "jurisdiction", "partitioning": { "time_pattern": "year=%Y/month=%m/day=%d/hour=%H" }, "path": "path", "rolling_policy": { "file_size_bytes": 0, "inactivity_seconds": 1, "interval_seconds": 1 } }, "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "schema": { "fields": [ { "type": "int32", "metadata_key": "metadata_key", "name": "name", "required": true, "sql_name": "sql_name" } ], "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "inferred": true } } ], "result_info": { "count": 1, "page": 0, "per_page": 10, "total_count": 1 }, "success": true } ```

## Get Sink Details

`pipelines.sinks.get(sink_id: str, **kwargs: SinkGetParams) -> SinkGetResponse` **get** `/accounts/{account_id}/pipelines/v1/sinks/{sink_id}` Get Sink Details.

### Parameters

- `account_id: str` Specifies the public ID of the account. - `sink_id: str` Specifies the public ID of the sink.

### Returns

- `class SinkGetResponse: …` - `id: str` Indicates a unique identifier for this sink. - `created_at: datetime` - `modified_at: datetime` - `name: str` Defines the name of the Sink. - `type: Literal["r2", "r2_data_catalog"]` Specifies the type of sink. - `"r2"` - `"r2_data_catalog"` - `config: Optional[Config]` Defines the configuration of the R2 Sink. - `class ConfigCloudflarePipelinesR2TablePublic: …` R2 Sink public configuration. - `account_id: str` Cloudflare Account ID for the bucket - `bucket: str` R2 Bucket to write to - `file_naming: Optional[ConfigCloudflarePipelinesR2TablePublicFileNaming]` Controls filename prefix/suffix and strategy. - `prefix: Optional[str]` The prefix to use in the file name, e.g. `prefix-.parquet` - `strategy: Optional[Literal["serial", "uuid", "uuid_v7", "ulid"]]` Filename generation strategy.
- `"serial"` - `"uuid"` - `"uuid_v7"` - `"ulid"` - `suffix: Optional[str]` This will overwrite the default file suffix, e.g. `.parquet`; use with caution - `jurisdiction: Optional[str]` Jurisdiction this bucket is hosted in - `partitioning: Optional[ConfigCloudflarePipelinesR2TablePublicPartitioning]` Data-layout partitioning for sinks. - `time_pattern: Optional[str]` The pattern of the date string - `path: Optional[str]` Subpath within the bucket to write to - `rolling_policy: Optional[ConfigCloudflarePipelinesR2TablePublicRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). - `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `class ConfigCloudflarePipelinesR2DataCatalogTablePublic: …` R2 Data Catalog Sink public configuration. - `account_id: str` Cloudflare Account ID - `bucket: str` The R2 Bucket that hosts this catalog - `table_name: str` Table name - `namespace: Optional[str]` Table namespace - `rolling_policy: Optional[ConfigCloudflarePipelinesR2DataCatalogTablePublicRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one).
- `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[List[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - 
`required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - `"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `inferred: Optional[bool]` ### Example ```python import os from cloudflare import Cloudflare client = Cloudflare( api_token=os.environ.get("CLOUDFLARE_API_TOKEN"), # This is the default and can be omitted ) sink = client.pipelines.sinks.get( sink_id="0223105f4ecef8ad9ca31a8372d0c353", account_id="0123105f4ecef8ad9ca31a8372d0c353", ) print(sink.id) ``` #### Response ```json { "result": { "id": "01234567890123457689012345678901", "created_at": "2019-12-27T18:11:19.117Z", "modified_at": "2019-12-27T18:11:19.117Z", "name": "my_sink", "type": "r2", "config": { "account_id": 
"account_id", "bucket": "bucket", "file_naming": { "prefix": "prefix", "strategy": "serial", "suffix": "suffix" }, "jurisdiction": "jurisdiction", "partitioning": { "time_pattern": "year=%Y/month=%m/day=%d/hour=%H" }, "path": "path", "rolling_policy": { "file_size_bytes": 0, "inactivity_seconds": 1, "interval_seconds": 1 } }, "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "schema": { "fields": [ { "type": "int32", "metadata_key": "metadata_key", "name": "name", "required": true, "sql_name": "sql_name" } ], "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "inferred": true } }, "success": true } ```

## Create Sink

`pipelines.sinks.create(**kwargs: SinkCreateParams) -> SinkCreateResponse` **post** `/accounts/{account_id}/pipelines/v1/sinks` Create a new Sink.

### Parameters

- `account_id: str` Specifies the public ID of the account. - `name: str` Defines the name of the Sink. - `type: Literal["r2", "r2_data_catalog"]` Specifies the type of sink. - `"r2"` - `"r2_data_catalog"` - `config: Optional[Config]` Defines the configuration of the R2 Sink. - `class ConfigCloudflarePipelinesR2Table: …` - `account_id: str` Cloudflare Account ID for the bucket - `bucket: str` R2 Bucket to write to - `credentials: ConfigCloudflarePipelinesR2TableCredentials` - `access_key_id: str` R2 access key ID for the bucket - `secret_access_key: str` R2 secret access key for the bucket - `file_naming: Optional[ConfigCloudflarePipelinesR2TableFileNaming]` Controls filename prefix/suffix and strategy. - `prefix: Optional[str]` The prefix to use in the file name, e.g. `prefix-.parquet` - `strategy: Optional[Literal["serial", "uuid", "uuid_v7", "ulid"]]` Filename generation strategy. - `"serial"` - `"uuid"` - `"uuid_v7"` - `"ulid"` - `suffix: Optional[str]` This will overwrite the default file suffix, e.g. `.parquet`; use with caution - `jurisdiction: Optional[str]` Jurisdiction this bucket is hosted in - `partitioning: Optional[ConfigCloudflarePipelinesR2TablePartitioning]` Data-layout partitioning for sinks. - `time_pattern: Optional[str]` The pattern of the date string - `path: Optional[str]` Subpath within the bucket to write to - `rolling_policy: Optional[ConfigCloudflarePipelinesR2TableRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). - `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `class ConfigCloudflarePipelinesR2DataCatalogTable: …` R2 Data Catalog Sink - `token: str` Authentication token - `account_id: str` Cloudflare Account ID - `bucket: str` The R2 Bucket that hosts this catalog - `table_name: str` Table name - `namespace: Optional[str]` Table namespace - `rolling_policy: Optional[ConfigCloudflarePipelinesR2DataCatalogTableRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one).
- `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[Iterable[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - 
`required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - `"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `inferred: Optional[bool]` ### Returns - `class SinkCreateResponse: …` - `id: str` Indicates a unique identifier for this sink. - `created_at: datetime` - `modified_at: datetime` - `name: str` Defines the name of the Sink. - `type: Literal["r2", "r2_data_catalog"]` Specifies the type of sink. 
- `"r2"` - `"r2_data_catalog"` - `config: Optional[Config]` Defines the configuration of the R2 Sink. - `class ConfigCloudflarePipelinesR2Table: …` - `account_id: str` Cloudflare Account ID for the bucket - `bucket: str` R2 Bucket to write to - `credentials: ConfigCloudflarePipelinesR2TableCredentials` - `access_key_id: str` R2 access key ID for the bucket - `secret_access_key: str` R2 secret access key for the bucket - `file_naming: Optional[ConfigCloudflarePipelinesR2TableFileNaming]` Controls filename prefix/suffix and strategy. - `prefix: Optional[str]` The prefix to use in the file name, e.g. `prefix-.parquet` - `strategy: Optional[Literal["serial", "uuid", "uuid_v7", "ulid"]]` Filename generation strategy. - `"serial"` - `"uuid"` - `"uuid_v7"` - `"ulid"` - `suffix: Optional[str]` This will overwrite the default file suffix, e.g. `.parquet`; use with caution - `jurisdiction: Optional[str]` Jurisdiction this bucket is hosted in - `partitioning: Optional[ConfigCloudflarePipelinesR2TablePartitioning]` Data-layout partitioning for sinks. - `time_pattern: Optional[str]` The pattern of the date string - `path: Optional[str]` Subpath within the bucket to write to - `rolling_policy: Optional[ConfigCloudflarePipelinesR2TableRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one).
- `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `class ConfigCloudflarePipelinesR2DataCatalogTable: …` R2 Data Catalog Sink - `token: str` Authentication token - `account_id: str` Cloudflare Account ID - `bucket: str` The R2 Bucket that hosts this catalog - `table_name: str` Table name - `namespace: Optional[str]` Table namespace - `rolling_policy: Optional[ConfigCloudflarePipelinesR2DataCatalogTableRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). - `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[List[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: 
Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - `"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - 
`compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `inferred: Optional[bool]`

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
sink = client.pipelines.sinks.create(
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
    name="my_sink",
    type="r2",
)
print(sink.id)
```

#### Response

```json
{ "result": { "id": "01234567890123457689012345678901", "created_at": "2019-12-27T18:11:19.117Z", "modified_at": "2019-12-27T18:11:19.117Z", "name": "my_sink", "type": "r2", "config": { "account_id": "account_id", "bucket": "bucket", "credentials": { "access_key_id": "access_key_id", "secret_access_key": "secret_access_key" }, "file_naming": { "prefix": "prefix", "strategy": "serial", "suffix": "suffix" }, "jurisdiction": "jurisdiction", "partitioning": { "time_pattern": "year=%Y/month=%m/day=%d/hour=%H" }, "path": "path", "rolling_policy": { "file_size_bytes": 0, "inactivity_seconds": 1, "interval_seconds": 1 } }, "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "schema": { "fields": [ { "type": "int32", "metadata_key": "metadata_key", "name": "name", "required": true, "sql_name": "sql_name" } ], "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "inferred": true } }, "success": true }
```

## Delete Sink

`pipelines.sinks.delete(sink_id: str, **kwargs: SinkDeleteParams)` **delete** `/accounts/{account_id}/pipelines/v1/sinks/{sink_id}` Delete Sink in Account.

### Parameters

- `account_id: str` Specifies the public ID of the account. - `sink_id: str` Specifies the public ID of the sink. - `force: Optional[str]` Delete sink forcefully, including deleting any dependent pipelines.
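Note that `force` is typed as `Optional[str]`, so it is sent as a literal query-string value rather than a boolean. A minimal sketch of the resulting request URL, built with only the standard library (it assumes the standard Cloudflare API base URL and reuses the sample IDs from this section; no request is actually sent):

```python
from urllib.parse import urlencode

# Sample IDs, purely illustrative
account_id = "0123105f4ecef8ad9ca31a8372d0c353"
sink_id = "0223105f4ecef8ad9ca31a8372d0c353"

# force is serialized as the string "true", not a boolean, per SinkDeleteParams
query = urlencode({"force": "true"})
url = (
    "https://api.cloudflare.com/client/v4"
    f"/accounts/{account_id}/pipelines/v1/sinks/{sink_id}?{query}"
)
print(url)
```

The SDK assembles this URL internally; the sketch only illustrates how `force` ends up on the wire.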
### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
client.pipelines.sinks.delete(
    sink_id="0223105f4ecef8ad9ca31a8372d0c353",
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
```

## Domain Types

### Sink List Response

- `class SinkListResponse: …` - `id: str` Indicates a unique identifier for this sink. - `created_at: datetime` - `modified_at: datetime` - `name: str` Defines the name of the Sink. - `type: Literal["r2", "r2_data_catalog"]` Specifies the type of sink. - `"r2"` - `"r2_data_catalog"` - `config: Optional[Config]` Defines the configuration of the R2 Sink. - `class ConfigCloudflarePipelinesR2TablePublic: …` R2 Sink public configuration. - `account_id: str` Cloudflare Account ID for the bucket - `bucket: str` R2 Bucket to write to - `file_naming: Optional[ConfigCloudflarePipelinesR2TablePublicFileNaming]` Controls filename prefix/suffix and strategy. - `prefix: Optional[str]` The prefix to use in the file name, e.g. `prefix-.parquet` - `strategy: Optional[Literal["serial", "uuid", "uuid_v7", "ulid"]]` Filename generation strategy. - `"serial"` - `"uuid"` - `"uuid_v7"` - `"ulid"` - `suffix: Optional[str]` This will overwrite the default file suffix, e.g. `.parquet`; use with caution - `jurisdiction: Optional[str]` Jurisdiction this bucket is hosted in - `partitioning: Optional[ConfigCloudflarePipelinesR2TablePublicPartitioning]` Data-layout partitioning for sinks. - `time_pattern: Optional[str]` The pattern of the date string - `path: Optional[str]` Subpath within the bucket to write to - `rolling_policy: Optional[ConfigCloudflarePipelinesR2TablePublicRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one).
- `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `class ConfigCloudflarePipelinesR2DataCatalogTablePublic: …` R2 Data Catalog Sink public configuration. - `account_id: str` Cloudflare Account ID - `bucket: str` The R2 Bucket that hosts this catalog - `table_name: str` Table name - `namespace: Optional[str]` Table namespace - `rolling_policy: Optional[ConfigCloudflarePipelinesR2DataCatalogTablePublicRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). - `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[List[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: Optional[str]` 
- `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - `"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: 
Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `inferred: Optional[bool]` ### Sink Get Response - `class SinkGetResponse: …` - `id: str` Indicates a unique identifier for this sink. - `created_at: datetime` - `modified_at: datetime` - `name: str` Defines the name of the Sink. - `type: Literal["r2", "r2_data_catalog"]` Specifies the type of sink. - `"r2"` - `"r2_data_catalog"` - `config: Optional[Config]` Defines the configuration of the R2 Sink. - `class ConfigCloudflarePipelinesR2TablePublic: …` R2 Sink public configuration. - `account_id: str` Cloudflare Account ID for the bucket - `bucket: str` R2 Bucket to write to - `file_naming: Optional[ConfigCloudflarePipelinesR2TablePublicFileNaming]` Controls filename prefix/suffix and strategy. - `prefix: Optional[str]` The prefix to use in file name. i.e prefix-.parquet - `strategy: Optional[Literal["serial", "uuid", "uuid_v7", "ulid"]]` Filename generation strategy. - `"serial"` - `"uuid"` - `"uuid_v7"` - `"ulid"` - `suffix: Optional[str]` This will overwrite the default file suffix. i.e .parquet, use with caution - `jurisdiction: Optional[str]` Jurisdiction this bucket is hosted in - `partitioning: Optional[ConfigCloudflarePipelinesR2TablePublicPartitioning]` Data-layout partitioning for sinks. - `time_pattern: Optional[str]` The pattern of the date string - `path: Optional[str]` Subpath within the bucket to write to - `rolling_policy: Optional[ConfigCloudflarePipelinesR2TablePublicRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). 
- `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `class ConfigCloudflarePipelinesR2DataCatalogTablePublic: …` R2 Data Catalog Sink public configuration. - `account_id: str` Cloudflare Account ID - `bucket: str` The R2 Bucket that hosts this catalog - `table_name: str` Table name - `namespace: Optional[str]` Table namespace - `rolling_policy: Optional[ConfigCloudflarePipelinesR2DataCatalogTablePublicRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). - `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[List[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: Optional[str]` 
- `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - `"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: 
Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]`
  - `"uncompressed"`
  - `"snappy"`
  - `"gzip"`
  - `"zstd"`
  - `"lz4"`
- `row_group_bytes: Optional[int]`
- `inferred: Optional[bool]`

### Sink Create Response

- `class SinkCreateResponse: …`
  - `id: str` Indicates a unique identifier for this sink.
  - `created_at: datetime`
  - `modified_at: datetime`
  - `name: str` Defines the name of the Sink.
  - `type: Literal["r2", "r2_data_catalog"]` Specifies the type of sink.
    - `"r2"`
    - `"r2_data_catalog"`
  - `config: Optional[Config]` Defines the configuration of the Sink.
    - `class ConfigCloudflarePipelinesR2Table: …` R2 Sink configuration.
      - `account_id: str` Cloudflare Account ID for the bucket
      - `bucket: str` R2 Bucket to write to
      - `credentials: ConfigCloudflarePipelinesR2TableCredentials`
        - `access_key_id: str` R2 access key ID for the bucket
        - `secret_access_key: str` R2 secret access key for the bucket
      - `file_naming: Optional[ConfigCloudflarePipelinesR2TableFileNaming]` Controls filename prefix/suffix and strategy.
        - `prefix: Optional[str]` The prefix to use in the file name, e.g. prefix-.parquet
        - `strategy: Optional[Literal["serial", "uuid", "uuid_v7", "ulid"]]` Filename generation strategy.
          - `"serial"`
          - `"uuid"`
          - `"uuid_v7"`
          - `"ulid"`
        - `suffix: Optional[str]` This will overwrite the default file suffix, e.g. .parquet; use with caution
      - `jurisdiction: Optional[str]` Jurisdiction this bucket is hosted in
      - `partitioning: Optional[ConfigCloudflarePipelinesR2TablePartitioning]` Data-layout partitioning for sinks.
        - `time_pattern: Optional[str]` The pattern of the date string
      - `path: Optional[str]` Subpath within the bucket to write to
      - `rolling_policy: Optional[ConfigCloudflarePipelinesR2TableRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one).
- `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `class ConfigCloudflarePipelinesR2DataCatalogTable: …` R2 Data Catalog Sink - `token: str` Authentication token - `account_id: str` Cloudflare Account ID - `bucket: str` The R2 Bucket that hosts this catalog - `table_name: str` Table name - `namespace: Optional[str]` Table namespace - `rolling_policy: Optional[ConfigCloudflarePipelinesR2DataCatalogTableRollingPolicy]` Rolling policy for file sinks (when & why to close a file and open a new one). - `file_size_bytes: Optional[int]` Files will be rolled after reaching this number of bytes - `inactivity_seconds: Optional[int]` Number of seconds of inactivity to wait before rolling over to a new file - `interval_seconds: Optional[int]` Number of seconds to wait before rolling over to a new file - `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[List[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: 
Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - `"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - 
`compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]`
  - `"uncompressed"`
  - `"snappy"`
  - `"gzip"`
  - `"zstd"`
  - `"lz4"`
- `row_group_bytes: Optional[int]`
- `inferred: Optional[bool]`

# Streams

## List Streams

`pipelines.streams.list(**kwargs: StreamListParams) -> SyncV4PagePaginationArray[StreamListResponse]`

**get** `/accounts/{account_id}/pipelines/v1/streams`

List and filter Streams in an Account.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `page: Optional[float]`
- `per_page: Optional[float]`
- `pipeline_id: Optional[str]` Specifies the public ID of the pipeline.

### Returns

- `class StreamListResponse: …`
  - `id: str` Indicates a unique identifier for this stream.
  - `created_at: datetime`
  - `http: HTTP`
    - `authentication: bool` Indicates that authentication is required for the HTTP endpoint.
    - `enabled: bool` Indicates that the HTTP endpoint is enabled.
    - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
      - `origins: Optional[List[str]]`
  - `modified_at: datetime`
  - `name: str` Indicates the name of the Stream.
  - `version: int` Indicates the current version of this stream.
  - `worker_binding: WorkerBinding`
    - `enabled: bool` Indicates that the worker binding is enabled.
  - `endpoint: Optional[str]` Indicates the endpoint URL of this stream.
- `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[List[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - 
`"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `inferred: Optional[bool]` ### Example ```python import os from cloudflare import Cloudflare client = Cloudflare( api_token=os.environ.get("CLOUDFLARE_API_TOKEN"), # This is the default and can be omitted ) page = client.pipelines.streams.list( account_id="0123105f4ecef8ad9ca31a8372d0c353", ) page = page.result[0] print(page.id) ``` #### Response ```json { "result": [ { "id": "01234567890123457689012345678901", "created_at": "2019-12-27T18:11:19.117Z", "http": { "authentication": false, "enabled": true, "cors": { "origins": [ "string" ] } }, "modified_at": "2019-12-27T18:11:19.117Z", "name": "my_stream", "version": 3, "worker_binding": { "enabled": true }, "endpoint": "https://01234567890123457689012345678901.ingest.cloudflare.com/v1", "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "schema": { "fields": [ { 
"type": "int32", "metadata_key": "metadata_key", "name": "name", "required": true, "sql_name": "sql_name" } ], "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "inferred": true } } ], "result_info": { "count": 1, "page": 0, "per_page": 10, "total_count": 1 }, "success": true }
```

## Get Stream Details

`pipelines.streams.get(stream_id: str, **kwargs: StreamGetParams) -> StreamGetResponse`

**get** `/accounts/{account_id}/pipelines/v1/streams/{stream_id}`

Get Stream Details.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `stream_id: str` Specifies the public ID of the stream.

### Returns

- `class StreamGetResponse: …`
  - `id: str` Indicates a unique identifier for this stream.
  - `created_at: datetime`
  - `http: HTTP`
    - `authentication: bool` Indicates that authentication is required for the HTTP endpoint.
    - `enabled: bool` Indicates that the HTTP endpoint is enabled.
    - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
      - `origins: Optional[List[str]]`
  - `modified_at: datetime`
  - `name: str` Indicates the name of the Stream.
  - `version: int` Indicates the current version of this stream.
  - `worker_binding: WorkerBinding`
    - `enabled: bool` Indicates that the worker binding is enabled.
  - `endpoint: Optional[str]` Indicates the endpoint URL of this stream.
- `format: Optional[Format]` - `class FormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class FormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `schema: Optional[Schema]` - `fields: Optional[List[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - 
`"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `inferred: Optional[bool]` ### Example ```python import os from cloudflare import Cloudflare client = Cloudflare( api_token=os.environ.get("CLOUDFLARE_API_TOKEN"), # This is the default and can be omitted ) stream = client.pipelines.streams.get( stream_id="033e105f4ecef8ad9ca31a8372d0c353", account_id="0123105f4ecef8ad9ca31a8372d0c353", ) print(stream.id) ``` #### Response ```json { "result": { "id": "01234567890123457689012345678901", "created_at": "2019-12-27T18:11:19.117Z", "http": { "authentication": false, "enabled": true, "cors": { "origins": [ "string" ] } }, "modified_at": "2019-12-27T18:11:19.117Z", "name": "my_stream", "version": 3, "worker_binding": { "enabled": true }, "endpoint": "https://01234567890123457689012345678901.ingest.cloudflare.com/v1", "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, 
"schema": { "fields": [ { "type": "int32", "metadata_key": "metadata_key", "name": "name", "required": true, "sql_name": "sql_name" } ], "format": { "type": "json", "decimal_encoding": "number", "timestamp_format": "rfc3339", "unstructured": true }, "inferred": true } }, "success": true }
```

## Create Stream

`pipelines.streams.create(**kwargs: StreamCreateParams) -> StreamCreateResponse`

**post** `/accounts/{account_id}/pipelines/v1/streams`

Create a new Stream.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `name: str` Specifies the name of the Stream.
- `format: Optional[Format]`
  - `class FormatJson: …`
    - `type: Literal["json"]`
      - `"json"`
    - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
      - `"number"`
      - `"string"`
      - `"bytes"`
    - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
      - `"rfc3339"`
      - `"unix_millis"`
    - `unstructured: Optional[bool]`
  - `class FormatParquet: …`
    - `type: Literal["parquet"]`
      - `"parquet"`
    - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]`
      - `"uncompressed"`
      - `"snappy"`
      - `"gzip"`
      - `"zstd"`
      - `"lz4"`
    - `row_group_bytes: Optional[int]`
- `http: Optional[HTTP]`
  - `authentication: bool` Indicates that authentication is required for the HTTP endpoint.
  - `enabled: bool` Indicates that the HTTP endpoint is enabled.
  - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
- `origins: Optional[SequenceNotStr[str]]` - `schema: Optional[Schema]` - `fields: Optional[Iterable[SchemaField]]` - `class SchemaFieldInt32: …` - `type: Literal["int32"]` - `"int32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldInt64: …` - `type: Literal["int64"]` - `"int64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat32: …` - `type: Literal["float32"]` - `"float32"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldFloat64: …` - `type: Literal["float64"]` - `"float64"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBool: …` - `type: Literal["bool"]` - `"bool"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldString: …` - `type: Literal["string"]` - `"string"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldBinary: …` - `type: Literal["binary"]` - `"binary"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldTimestamp: …` - `type: Literal["timestamp"]` - `"timestamp"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]` - `"second"` - `"millisecond"` - `"microsecond"` - `"nanosecond"` - `class SchemaFieldJson: …` - `type: Literal["json"]` - `"json"` - `metadata_key: Optional[str]` - `name: Optional[str]` - `required: Optional[bool]` - `sql_name: Optional[str]` - `class SchemaFieldStruct: …` - `class SchemaFieldList: …` - `format: 
Optional[SchemaFormat]` - `class SchemaFormatJson: …` - `type: Literal["json"]` - `"json"` - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]` - `"number"` - `"string"` - `"bytes"` - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]` - `"rfc3339"` - `"unix_millis"` - `unstructured: Optional[bool]` - `class SchemaFormatParquet: …` - `type: Literal["parquet"]` - `"parquet"` - `compression: Optional[Literal["uncompressed", "snappy", "gzip", 2 more]]` - `"uncompressed"` - `"snappy"` - `"gzip"` - `"zstd"` - `"lz4"` - `row_group_bytes: Optional[int]` - `inferred: Optional[bool]` - `worker_binding: Optional[WorkerBinding]` - `enabled: bool` Indicates that the worker binding is enabled. ### Returns - `class StreamCreateResponse: …` - `id: str` Indicates a unique identifier for this stream. - `created_at: datetime` - `http: HTTP` - `authentication: bool` Indicates that authentication is required for the HTTP endpoint. - `enabled: bool` Indicates that the HTTP endpoint is enabled. - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint. - `origins: Optional[List[str]]` - `modified_at: datetime` - `name: str` Indicates the name of the Stream. - `version: int` Indicates the current version of this stream. - `worker_binding: WorkerBinding` - `enabled: bool` Indicates that the worker binding is enabled. - `endpoint: Optional[str]` Indicates the endpoint URL of this stream. 
- `format: Optional[Format]`
  - `class FormatJson: …`
    - `type: Literal["json"]`
      - `"json"`
    - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
      - `"number"`
      - `"string"`
      - `"bytes"`
    - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
      - `"rfc3339"`
      - `"unix_millis"`
    - `unstructured: Optional[bool]`
  - `class FormatParquet: …`
    - `type: Literal["parquet"]`
      - `"parquet"`
    - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
      - `"uncompressed"`
      - `"snappy"`
      - `"gzip"`
      - `"zstd"`
      - `"lz4"`
    - `row_group_bytes: Optional[int]`
- `schema: Optional[Schema]`
  - `fields: Optional[List[SchemaField]]`
    - `class SchemaFieldInt32: …`
      - `type: Literal["int32"]`
        - `"int32"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
    - `class SchemaFieldInt64: …`
      - `type: Literal["int64"]`
        - `"int64"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
    - `class SchemaFieldFloat32: …`
      - `type: Literal["float32"]`
        - `"float32"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
    - `class SchemaFieldFloat64: …`
      - `type: Literal["float64"]`
        - `"float64"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
    - `class SchemaFieldBool: …`
      - `type: Literal["bool"]`
        - `"bool"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
    - `class SchemaFieldString: …`
      - `type: Literal["string"]`
        - `"string"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
    - `class SchemaFieldBinary: …`
      - `type: Literal["binary"]`
        - `"binary"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
    - `class SchemaFieldTimestamp: …`
      - `type: Literal["timestamp"]`
        - `"timestamp"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
      - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]`
        - `"second"`
        - `"millisecond"`
        - `"microsecond"`
        - `"nanosecond"`
    - `class SchemaFieldJson: …`
      - `type: Literal["json"]`
        - `"json"`
      - `metadata_key: Optional[str]`
      - `name: Optional[str]`
      - `required: Optional[bool]`
      - `sql_name: Optional[str]`
    - `class SchemaFieldStruct: …`
    - `class SchemaFieldList: …`
  - `format: Optional[SchemaFormat]`
    - `class SchemaFormatJson: …`
      - `type: Literal["json"]`
        - `"json"`
      - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
        - `"number"`
        - `"string"`
        - `"bytes"`
      - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
        - `"rfc3339"`
        - `"unix_millis"`
      - `unstructured: Optional[bool]`
    - `class SchemaFormatParquet: …`
      - `type: Literal["parquet"]`
        - `"parquet"`
      - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
        - `"uncompressed"`
        - `"snappy"`
        - `"gzip"`
        - `"zstd"`
        - `"lz4"`
      - `row_group_bytes: Optional[int]`
  - `inferred: Optional[bool]`

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
stream = client.pipelines.streams.create(
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
    name="my_stream",
)
print(stream.id)
```

#### Response

```json
{
  "result": {
    "id": "01234567890123457689012345678901",
    "created_at": "2019-12-27T18:11:19.117Z",
    "http": {
      "authentication": false,
      "enabled": true,
      "cors": {
        "origins": ["string"]
      }
    },
    "modified_at": "2019-12-27T18:11:19.117Z",
    "name": "my_stream",
    "version": 3,
    "worker_binding": {
      "enabled": true
    },
    "endpoint": "https://01234567890123457689012345678901.ingest.cloudflare.com/v1",
    "format": {
      "type": "json",
      "decimal_encoding": "number",
      "timestamp_format": "rfc3339",
      "unstructured": true
    },
    "schema": {
      "fields": [
        {
          "type": "int32",
          "metadata_key": "metadata_key",
          "name": "name",
          "required": true,
          "sql_name": "sql_name"
        }
      ],
      "format": {
        "type": "json",
        "decimal_encoding": "number",
        "timestamp_format": "rfc3339",
        "unstructured": true
      },
      "inferred": true
    }
  },
  "success": true
}
```

## Update Stream

`pipelines.streams.update(stream_id: str, **kwargs: StreamUpdateParams) -> StreamUpdateResponse`

**patch** `/accounts/{account_id}/pipelines/v1/streams/{stream_id}`

Update a Stream.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `stream_id: str` Specifies the public ID of the stream.
- `http: Optional[HTTP]`
  - `authentication: bool` Indicates whether authentication is required for the HTTP endpoint.
  - `enabled: bool` Indicates whether the HTTP endpoint is enabled.
  - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
    - `origins: Optional[SequenceNotStr[str]]`
- `worker_binding: Optional[WorkerBinding]`
  - `enabled: bool` Indicates whether the worker binding is enabled.

### Returns

- `class StreamUpdateResponse: …`
  - `id: str` Indicates a unique identifier for this stream.
  - `created_at: datetime`
  - `http: HTTP`
    - `authentication: bool` Indicates whether authentication is required for the HTTP endpoint.
    - `enabled: bool` Indicates whether the HTTP endpoint is enabled.
    - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
      - `origins: Optional[List[str]]`
  - `modified_at: datetime`
  - `name: str` Indicates the name of the Stream.
  - `version: int` Indicates the current version of this stream.
  - `worker_binding: WorkerBinding`
    - `enabled: bool` Indicates whether the worker binding is enabled.
  - `endpoint: Optional[str]` Indicates the endpoint URL of this stream.
  - `format: Optional[Format]`
    - `class FormatJson: …`
      - `type: Literal["json"]`
        - `"json"`
      - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
        - `"number"`
        - `"string"`
        - `"bytes"`
      - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
        - `"rfc3339"`
        - `"unix_millis"`
      - `unstructured: Optional[bool]`
    - `class FormatParquet: …`
      - `type: Literal["parquet"]`
        - `"parquet"`
      - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
        - `"uncompressed"`
        - `"snappy"`
        - `"gzip"`
        - `"zstd"`
        - `"lz4"`
      - `row_group_bytes: Optional[int]`

### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
stream = client.pipelines.streams.update(
    stream_id="033e105f4ecef8ad9ca31a8372d0c353",
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
print(stream.id)
```

#### Response

```json
{
  "result": {
    "id": "01234567890123457689012345678901",
    "created_at": "2019-12-27T18:11:19.117Z",
    "http": {
      "authentication": false,
      "enabled": true,
      "cors": {
        "origins": ["string"]
      }
    },
    "modified_at": "2019-12-27T18:11:19.117Z",
    "name": "my_stream",
    "version": 3,
    "worker_binding": {
      "enabled": true
    },
    "endpoint": "https://01234567890123457689012345678901.ingest.cloudflare.com/v1",
    "format": {
      "type": "json",
      "decimal_encoding": "number",
      "timestamp_format": "rfc3339",
      "unstructured": true
    }
  },
  "success": true
}
```

## Delete Stream

`pipelines.streams.delete(stream_id: str, **kwargs: StreamDeleteParams)`

**delete** `/accounts/{account_id}/pipelines/v1/streams/{stream_id}`

Delete a Stream in an account.

### Parameters

- `account_id: str` Specifies the public ID of the account.
- `stream_id: str` Specifies the public ID of the stream.
- `force: Optional[str]` Deletes the stream forcefully, including any dependent pipelines.
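Because `force` is typed as `Optional[str]` rather than a boolean, a caller has to serialize the flag explicitly. A minimal sketch of building the delete kwargs, assuming the endpoint accepts the literal string `"true"` (the docs only give the type, not the accepted values); the helper name `delete_stream_kwargs` is illustrative and not part of the SDK:

```python
def delete_stream_kwargs(stream_id: str, account_id: str, force: bool = False) -> dict:
    """Build keyword arguments for pipelines.streams.delete.

    The `force` query parameter is documented as a string, so the boolean
    intent is serialized to "true" here -- an assumption, since the docs
    do not state the accepted values.
    """
    kwargs = {"stream_id": stream_id, "account_id": account_id}
    if force:
        kwargs["force"] = "true"
    return kwargs


# Force-delete a stream along with any dependent pipelines:
# client.pipelines.streams.delete(**delete_stream_kwargs(stream_id, account_id, force=True))
```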
### Example

```python
import os
from cloudflare import Cloudflare

client = Cloudflare(
    api_token=os.environ.get("CLOUDFLARE_API_TOKEN"),  # This is the default and can be omitted
)
client.pipelines.streams.delete(
    stream_id="033e105f4ecef8ad9ca31a8372d0c353",
    account_id="0123105f4ecef8ad9ca31a8372d0c353",
)
```

## Domain Types

### Stream List Response

- `class StreamListResponse: …`
  - `id: str` Indicates a unique identifier for this stream.
  - `created_at: datetime`
  - `http: HTTP`
    - `authentication: bool` Indicates whether authentication is required for the HTTP endpoint.
    - `enabled: bool` Indicates whether the HTTP endpoint is enabled.
    - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
      - `origins: Optional[List[str]]`
  - `modified_at: datetime`
  - `name: str` Indicates the name of the Stream.
  - `version: int` Indicates the current version of this stream.
  - `worker_binding: WorkerBinding`
    - `enabled: bool` Indicates whether the worker binding is enabled.
  - `endpoint: Optional[str]` Indicates the endpoint URL of this stream.
  - `format: Optional[Format]`
    - `class FormatJson: …`
      - `type: Literal["json"]`
        - `"json"`
      - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
        - `"number"`
        - `"string"`
        - `"bytes"`
      - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
        - `"rfc3339"`
        - `"unix_millis"`
      - `unstructured: Optional[bool]`
    - `class FormatParquet: …`
      - `type: Literal["parquet"]`
        - `"parquet"`
      - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
        - `"uncompressed"`
        - `"snappy"`
        - `"gzip"`
        - `"zstd"`
        - `"lz4"`
      - `row_group_bytes: Optional[int]`
  - `schema: Optional[Schema]`
    - `fields: Optional[List[SchemaField]]`
      - `class SchemaFieldInt32: …`
        - `type: Literal["int32"]`
          - `"int32"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldInt64: …`
        - `type: Literal["int64"]`
          - `"int64"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldFloat32: …`
        - `type: Literal["float32"]`
          - `"float32"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldFloat64: …`
        - `type: Literal["float64"]`
          - `"float64"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldBool: …`
        - `type: Literal["bool"]`
          - `"bool"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldString: …`
        - `type: Literal["string"]`
          - `"string"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldBinary: …`
        - `type: Literal["binary"]`
          - `"binary"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldTimestamp: …`
        - `type: Literal["timestamp"]`
          - `"timestamp"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
        - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]`
          - `"second"`
          - `"millisecond"`
          - `"microsecond"`
          - `"nanosecond"`
      - `class SchemaFieldJson: …`
        - `type: Literal["json"]`
          - `"json"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldStruct: …`
      - `class SchemaFieldList: …`
    - `format: Optional[SchemaFormat]`
      - `class SchemaFormatJson: …`
        - `type: Literal["json"]`
          - `"json"`
        - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
          - `"number"`
          - `"string"`
          - `"bytes"`
        - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
          - `"rfc3339"`
          - `"unix_millis"`
        - `unstructured: Optional[bool]`
      - `class SchemaFormatParquet: …`
        - `type: Literal["parquet"]`
          - `"parquet"`
        - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
          - `"uncompressed"`
          - `"snappy"`
          - `"gzip"`
          - `"zstd"`
          - `"lz4"`
        - `row_group_bytes: Optional[int]`
    - `inferred: Optional[bool]`

### Stream Get Response

- `class StreamGetResponse: …`
  - `id: str` Indicates a unique identifier for this stream.
  - `created_at: datetime`
  - `http: HTTP`
    - `authentication: bool` Indicates whether authentication is required for the HTTP endpoint.
    - `enabled: bool` Indicates whether the HTTP endpoint is enabled.
    - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
      - `origins: Optional[List[str]]`
  - `modified_at: datetime`
  - `name: str` Indicates the name of the Stream.
  - `version: int` Indicates the current version of this stream.
  - `worker_binding: WorkerBinding`
    - `enabled: bool` Indicates whether the worker binding is enabled.
  - `endpoint: Optional[str]` Indicates the endpoint URL of this stream.
  - `format: Optional[Format]`
    - `class FormatJson: …`
      - `type: Literal["json"]`
        - `"json"`
      - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
        - `"number"`
        - `"string"`
        - `"bytes"`
      - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
        - `"rfc3339"`
        - `"unix_millis"`
      - `unstructured: Optional[bool]`
    - `class FormatParquet: …`
      - `type: Literal["parquet"]`
        - `"parquet"`
      - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
        - `"uncompressed"`
        - `"snappy"`
        - `"gzip"`
        - `"zstd"`
        - `"lz4"`
      - `row_group_bytes: Optional[int]`
  - `schema: Optional[Schema]`
    - `fields: Optional[List[SchemaField]]`
      - `class SchemaFieldInt32: …`
        - `type: Literal["int32"]`
          - `"int32"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldInt64: …`
        - `type: Literal["int64"]`
          - `"int64"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldFloat32: …`
        - `type: Literal["float32"]`
          - `"float32"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldFloat64: …`
        - `type: Literal["float64"]`
          - `"float64"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldBool: …`
        - `type: Literal["bool"]`
          - `"bool"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldString: …`
        - `type: Literal["string"]`
          - `"string"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldBinary: …`
        - `type: Literal["binary"]`
          - `"binary"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldTimestamp: …`
        - `type: Literal["timestamp"]`
          - `"timestamp"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
        - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]`
          - `"second"`
          - `"millisecond"`
          - `"microsecond"`
          - `"nanosecond"`
      - `class SchemaFieldJson: …`
        - `type: Literal["json"]`
          - `"json"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldStruct: …`
      - `class SchemaFieldList: …`
    - `format: Optional[SchemaFormat]`
      - `class SchemaFormatJson: …`
        - `type: Literal["json"]`
          - `"json"`
        - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
          - `"number"`
          - `"string"`
          - `"bytes"`
        - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
          - `"rfc3339"`
          - `"unix_millis"`
        - `unstructured: Optional[bool]`
      - `class SchemaFormatParquet: …`
        - `type: Literal["parquet"]`
          - `"parquet"`
        - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
          - `"uncompressed"`
          - `"snappy"`
          - `"gzip"`
          - `"zstd"`
          - `"lz4"`
        - `row_group_bytes: Optional[int]`
    - `inferred: Optional[bool]`

### Stream Create Response

- `class StreamCreateResponse: …`
  - `id: str` Indicates a unique identifier for this stream.
  - `created_at: datetime`
  - `http: HTTP`
    - `authentication: bool` Indicates whether authentication is required for the HTTP endpoint.
    - `enabled: bool` Indicates whether the HTTP endpoint is enabled.
    - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
      - `origins: Optional[List[str]]`
  - `modified_at: datetime`
  - `name: str` Indicates the name of the Stream.
  - `version: int` Indicates the current version of this stream.
  - `worker_binding: WorkerBinding`
    - `enabled: bool` Indicates whether the worker binding is enabled.
  - `endpoint: Optional[str]` Indicates the endpoint URL of this stream.
  - `format: Optional[Format]`
    - `class FormatJson: …`
      - `type: Literal["json"]`
        - `"json"`
      - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
        - `"number"`
        - `"string"`
        - `"bytes"`
      - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
        - `"rfc3339"`
        - `"unix_millis"`
      - `unstructured: Optional[bool]`
    - `class FormatParquet: …`
      - `type: Literal["parquet"]`
        - `"parquet"`
      - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
        - `"uncompressed"`
        - `"snappy"`
        - `"gzip"`
        - `"zstd"`
        - `"lz4"`
      - `row_group_bytes: Optional[int]`
  - `schema: Optional[Schema]`
    - `fields: Optional[List[SchemaField]]`
      - `class SchemaFieldInt32: …`
        - `type: Literal["int32"]`
          - `"int32"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldInt64: …`
        - `type: Literal["int64"]`
          - `"int64"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldFloat32: …`
        - `type: Literal["float32"]`
          - `"float32"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldFloat64: …`
        - `type: Literal["float64"]`
          - `"float64"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldBool: …`
        - `type: Literal["bool"]`
          - `"bool"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldString: …`
        - `type: Literal["string"]`
          - `"string"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldBinary: …`
        - `type: Literal["binary"]`
          - `"binary"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldTimestamp: …`
        - `type: Literal["timestamp"]`
          - `"timestamp"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
        - `unit: Optional[Literal["second", "millisecond", "microsecond", "nanosecond"]]`
          - `"second"`
          - `"millisecond"`
          - `"microsecond"`
          - `"nanosecond"`
      - `class SchemaFieldJson: …`
        - `type: Literal["json"]`
          - `"json"`
        - `metadata_key: Optional[str]`
        - `name: Optional[str]`
        - `required: Optional[bool]`
        - `sql_name: Optional[str]`
      - `class SchemaFieldStruct: …`
      - `class SchemaFieldList: …`
    - `format: Optional[SchemaFormat]`
      - `class SchemaFormatJson: …`
        - `type: Literal["json"]`
          - `"json"`
        - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
          - `"number"`
          - `"string"`
          - `"bytes"`
        - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
          - `"rfc3339"`
          - `"unix_millis"`
        - `unstructured: Optional[bool]`
      - `class SchemaFormatParquet: …`
        - `type: Literal["parquet"]`
          - `"parquet"`
        - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
          - `"uncompressed"`
          - `"snappy"`
          - `"gzip"`
          - `"zstd"`
          - `"lz4"`
        - `row_group_bytes: Optional[int]`
    - `inferred: Optional[bool]`

### Stream Update Response

- `class StreamUpdateResponse: …`
  - `id: str` Indicates a unique identifier for this stream.
  - `created_at: datetime`
  - `http: HTTP`
    - `authentication: bool` Indicates whether authentication is required for the HTTP endpoint.
    - `enabled: bool` Indicates whether the HTTP endpoint is enabled.
    - `cors: Optional[HTTPCORS]` Specifies the CORS options for the HTTP endpoint.
      - `origins: Optional[List[str]]`
  - `modified_at: datetime`
  - `name: str` Indicates the name of the Stream.
  - `version: int` Indicates the current version of this stream.
  - `worker_binding: WorkerBinding`
    - `enabled: bool` Indicates whether the worker binding is enabled.
  - `endpoint: Optional[str]` Indicates the endpoint URL of this stream.
  - `format: Optional[Format]`
    - `class FormatJson: …`
      - `type: Literal["json"]`
        - `"json"`
      - `decimal_encoding: Optional[Literal["number", "string", "bytes"]]`
        - `"number"`
        - `"string"`
        - `"bytes"`
      - `timestamp_format: Optional[Literal["rfc3339", "unix_millis"]]`
        - `"rfc3339"`
        - `"unix_millis"`
      - `unstructured: Optional[bool]`
    - `class FormatParquet: …`
      - `type: Literal["parquet"]`
        - `"parquet"`
      - `compression: Optional[Literal["uncompressed", "snappy", "gzip", "zstd", "lz4"]]`
        - `"uncompressed"`
        - `"snappy"`
        - `"gzip"`
        - `"zstd"`
        - `"lz4"`
      - `row_group_bytes: Optional[int]`
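Each entry in `schema.fields` above carries a wire `type` plus optional `name`, `sql_name`, and `required` flags, which map naturally onto SQL-style column signatures. A minimal sketch that renders field dicts (in the shape shown in the JSON responses) into readable declarations; `describe_fields` is an illustrative helper, not part of the SDK:

```python
def describe_fields(fields: list[dict]) -> list[str]:
    """Render SchemaField dicts (type/name/required/sql_name) as column signatures."""
    out = []
    for f in fields:
        # sql_name overrides name when both are present; either may be absent.
        name = f.get("sql_name") or f.get("name") or "<unnamed>"
        suffix = " NOT NULL" if f.get("required") else ""
        out.append(f"{name} {f['type'].upper()}{suffix}")
    return out


print(describe_fields([
    {"type": "int32", "name": "user_id", "required": True},
    {"type": "timestamp", "name": "ts", "unit": "millisecond"},
]))
# ['user_id INT32 NOT NULL', 'ts TIMESTAMP']
```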