---
title: Cloudflare Pipelines as a Logpush destination
description: Send Cloudflare logs to Pipelines for SQL transformation and storage in R2 as Parquet or Apache Iceberg tables.
image: https://developers.cloudflare.com/changelog-preview.png
---

# Changelog

New updates and improvements at Cloudflare.


## Cloudflare Pipelines as a Logpush destination

Apr 20, 2026 

[ Logs ](https://developers.cloudflare.com/logs/)[ Pipelines ](https://developers.cloudflare.com/pipelines/) 

Logpush has traditionally been great at delivering Cloudflare logs to a variety of destinations in JSON format. While JSON is flexible and easily readable, it can be inefficient to store and query at scale.

With this release, you can now send your logs directly to [Pipelines](https://developers.cloudflare.com/pipelines/) to ingest, transform, and store them in [R2](https://developers.cloudflare.com/r2/) as Parquet files or Apache Iceberg tables managed by [R2 Data Catalog](https://developers.cloudflare.com/r2/data-catalog/). This produces a more compact data footprint and lets you query your logs efficiently with [R2 SQL](https://developers.cloudflare.com/r2-sql/) or any other query engine that supports Apache Iceberg or Parquet.
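Once your logs are stored as an Iceberg table, they can be queried with ordinary SQL. A minimal sketch, assuming logs from the HTTP requests dataset were written to a hypothetical sink table named `http_logs_sink` (column names follow the standard Logpush HTTP request fields; check the R2 SQL documentation for currently supported query features):

```sql
-- Hypothetical query against an Iceberg table written by Pipelines.
-- Table name and columns are assumptions for illustration.
SELECT ClientIP, EdgeResponseStatus, EdgeStartTimestamp
FROM http_logs_sink
WHERE EdgeResponseStatus >= 500
LIMIT 100;
```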

#### Transform logs before storage

Pipelines SQL runs on each log record in-flight, so you can reshape your data before it is written. For example, you can drop noisy fields, redact sensitive values, or derive new columns:

```sql
INSERT INTO http_logs_sink
SELECT
  ClientIP,
  EdgeResponseStatus,
  to_timestamp_micros(EdgeStartTimestamp) AS event_time,
  upper(ClientRequestMethod) AS method,
  sha256(ClientIP) AS hashed_ip
FROM http_logs_stream
WHERE EdgeResponseStatus >= 400;
```

Pipelines SQL supports string functions, regex, hashing, JSON extraction, timestamp conversion, conditional expressions, and more. For the full list, refer to the [Pipelines SQL reference](https://developers.cloudflare.com/pipelines/sql-reference/).
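As a sketch of what conditional expressions and JSON extraction might look like in a transformation, the statement below buckets status codes with a `CASE` expression and pulls a value out of a JSON-encoded column. The `json_get_str` function name and the `RequestHeaders` column are assumptions for illustration; refer to the SQL reference for the actual JSON functions and available fields:

```sql
INSERT INTO http_logs_sink
SELECT
  ClientIP,
  -- Conditional expression: collapse status codes into a coarse class.
  CASE
    WHEN EdgeResponseStatus >= 500 THEN 'server_error'
    WHEN EdgeResponseStatus >= 400 THEN 'client_error'
    ELSE 'ok'
  END AS status_class,
  -- JSON extraction: function and column names here are hypothetical.
  json_get_str(RequestHeaders, 'user-agent') AS user_agent
FROM http_logs_stream;
```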

#### Get started

To configure Pipelines as a Logpush destination, refer to [Enable Cloudflare Pipelines](https://developers.cloudflare.com/logs/logpush/logpush-job/enable-destinations/pipelines/).