---
title: Dropped event metrics, typed Pipelines bindings, and improved setup
description: Monitor dropped events in the dashboard and via GraphQL, generate schema-aware TypeScript types for pipeline bindings, and set up pipelines faster with Simple mode.
image: https://developers.cloudflare.com/changelog-preview.png
---


# Changelog

New updates and improvements at Cloudflare.


## Dropped event metrics, typed Pipelines bindings, and improved setup

Feb 24, 2026 

[ Pipelines ](https://developers.cloudflare.com/pipelines/)[ Workers ](https://developers.cloudflare.com/workers/) 

[Cloudflare Pipelines](https://developers.cloudflare.com/pipelines/) ingests streaming data via [Workers](https://developers.cloudflare.com/workers/) or HTTP endpoints, transforms it with SQL, and writes it to [R2](https://developers.cloudflare.com/r2/) as Apache Iceberg tables. Today we're shipping three improvements to help you understand why streaming events get dropped, catch data quality issues early, and set up Pipelines faster.

#### Dropped event metrics

When [stream](https://developers.cloudflare.com/pipelines/streams/) events don't match the expected schema, Pipelines accepts them during ingestion but drops them when attempting to deliver them to the [sink](https://developers.cloudflare.com/pipelines/sinks/). To help you identify the root cause of these issues, we are introducing a new dashboard and metrics that surface dropped events with detailed error messages.

![The Errors tab in the Cloudflare dashboard showing deserialization errors grouped by type with individual error details](https://developers.cloudflare.com/_astro/pipelines-error-log-dash.6JIa7r5d_Z1ILPxd.webp) 

Dropped events can also be queried programmatically via the new `pipelinesUserErrorsAdaptiveGroups` GraphQL dataset. The dataset breaks down failures by specific error type (`missing_field`, `type_mismatch`, `parse_failure`, or `null_value`) so you can trace issues back to the source.

```graphql
query GetPipelineUserErrors(
  $accountTag: String!
  $pipelineId: String!
  $datetimeStart: Time!
  $datetimeEnd: Time!
) {
  viewer {
    accounts(filter: { accountTag: $accountTag }) {
      pipelinesUserErrorsAdaptiveGroups(
        limit: 100
        filter: {
          pipelineId: $pipelineId
          datetime_geq: $datetimeStart
          datetime_leq: $datetimeEnd
        }
        orderBy: [count_DESC]
      ) {
        count
        dimensions {
          errorFamily
          errorType
        }
      }
    }
  }
}
```


For the full list of dimensions, error types, and additional query examples, refer to [User error metrics](https://developers.cloudflare.com/pipelines/observability/metrics/#user-error-metrics).
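To run this query from a script or scheduled job, you can POST it to Cloudflare's GraphQL endpoint at `https://api.cloudflare.com/client/v4/graphql`. The sketch below only builds the request payload; `buildUserErrorsRequest` is a hypothetical helper name, not part of any Cloudflare SDK:

```typescript
// Minimal sketch: construct a POST request for the
// pipelinesUserErrorsAdaptiveGroups query. The helper and types here are
// illustrative; only the endpoint URL and query shape come from the docs.
const USER_ERRORS_QUERY = `
query GetPipelineUserErrors($accountTag: String!, $pipelineId: String!, $datetimeStart: Time!, $datetimeEnd: Time!) {
  viewer {
    accounts(filter: { accountTag: $accountTag }) {
      pipelinesUserErrorsAdaptiveGroups(
        limit: 100
        filter: { pipelineId: $pipelineId, datetime_geq: $datetimeStart, datetime_leq: $datetimeEnd }
        orderBy: [count_DESC]
      ) {
        count
        dimensions { errorFamily errorType }
      }
    }
  }
}`;

interface GraphqlRequest {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string; // JSON-encoded { query, variables }
}

function buildUserErrorsRequest(
  apiToken: string,
  variables: {
    accountTag: string;
    pipelineId: string;
    datetimeStart: string; // ISO 8601 timestamps
    datetimeEnd: string;
  },
): GraphqlRequest {
  return {
    url: "https://api.cloudflare.com/client/v4/graphql",
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ query: USER_ERRORS_QUERY, variables }),
  };
}
```

From there you would pass the result to `fetch(req.url, req)` and read the grouped error counts out of `data.viewer.accounts[0].pipelinesUserErrorsAdaptiveGroups` in the JSON response.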

#### Typed Pipelines bindings

Sending data to a Pipeline from a Worker previously used a generic `Pipeline<PipelineRecord>` type, which meant schema mismatches (wrong field names, incorrect types) were only caught at runtime as dropped events.

Running `wrangler types` now generates schema-specific TypeScript types for your [Pipeline bindings](https://developers.cloudflare.com/pipelines/streams/writing-to-streams/#send-via-workers). TypeScript catches missing required fields and incorrect field types at compile time, before your code is deployed.

```ts
declare namespace Cloudflare {
  type EcommerceStreamRecord = {
    user_id: string;
    event_type: string;
    product_id?: string;
    amount?: number;
  };

  interface Env {
    STREAM: import("cloudflare:pipelines").Pipeline<Cloudflare.EcommerceStreamRecord>;
  }
}
```


For more information, refer to [Typed Pipeline bindings](https://developers.cloudflare.com/pipelines/streams/writing-to-streams/#typed-pipeline-bindings).
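As a sketch of what the generated types buy you, the Worker below sends a record through a typed `STREAM` binding. The `Env` interface is inlined here (rather than generated by `wrangler types` from `cloudflare:pipelines`) so the example is self-contained, and the handler body is illustrative:

```typescript
// Record shape matching the example stream schema above.
type EcommerceStreamRecord = {
  user_id: string;
  event_type: string;
  product_id?: string;
  amount?: number;
};

// Inlined stand-in for the generated binding type; in a real Worker this
// comes from `wrangler types` and `cloudflare:pipelines`.
interface Env {
  STREAM: { send(records: EcommerceStreamRecord[]): Promise<void> };
}

// Build a record from request data. Because the return type is annotated,
// a wrong field name (e.g. `userId` instead of `user_id`) or a wrong field
// type is a compile-time error instead of a dropped event at delivery time.
function toRecord(
  userId: string,
  eventType: string,
  amount?: number,
): EcommerceStreamRecord {
  return { user_id: userId, event_type: eventType, amount };
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    await env.STREAM.send([toRecord("u_123", "purchase", 19.99)]);
    return new Response("ok");
  },
};
```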

#### Improved Pipelines setup

Setting up a new Pipeline previously required multiple manual steps: creating an R2 bucket, enabling R2 Data Catalog, generating an API token, and configuring format, compression, and rolling policies individually.

The `wrangler pipelines setup` command now offers a **Simple** setup mode that applies recommended defaults and automatically creates the [R2 bucket](https://developers.cloudflare.com/r2/buckets/) and enables [R2 Data Catalog](https://developers.cloudflare.com/r2/data-catalog/) if they do not already exist. Validation errors during setup prompt you to retry inline rather than restarting the entire process.

For a full walkthrough, refer to the [Getting started guide](https://developers.cloudflare.com/pipelines/getting-started/).