Streams
The Streams API is a web standard API that allows JavaScript to programmatically access and process streams of data.
- ReadableStream
- ReadableStream BYOBReader
- ReadableStream DefaultReader
- TransformStream
- WritableStream
- WritableStream DefaultWriter
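These pieces compose: a minimal sketch (not from the docs) of constructing a `ReadableStream` by hand and draining it with a default reader, to show how chunks flow from a source to a consumer:

```javascript
// Construct a ReadableStream whose underlying source enqueues
// two chunks and then closes.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello, ");
    controller.enqueue("streams");
    controller.close();
  },
});

// Drain a ReadableStream with a default reader, chunk by chunk.
async function readAll(readable) {
  const reader = readable.getReader();
  let result = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    result += value;
  }
  return result;
}

readAll(stream).then((text) => console.log(text)); // logs "hello, streams"
```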
Use the Streams API to avoid buffering large requests or responses in memory. This lets you parse extremely large request or response bodies within a Worker's 128 MB memory limit: your Worker can start processing data incrementally as it arrives, which is faster than buffering the entire payload and allows your Worker to handle multi-gigabyte payloads or files.
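As an illustration of incremental processing, here is a sketch (the helper name `countBodyBytes` is ours, not from the docs) that consumes a body stream chunk by chunk, so memory use stays bounded by chunk size rather than total payload size; in a Worker you would pass it `request.body`:

```javascript
// Count the bytes in a body stream without buffering the whole body.
async function countBodyBytes(readable) {
  const reader = readable.getReader();
  let total = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    total += value.byteLength; // process each chunk as it arrives
  }
  return total;
}

// Simulate a body made of two binary chunks.
const body = new ReadableStream({
  start(controller) {
    controller.enqueue(new Uint8Array(1024));
    controller.enqueue(new Uint8Array(512));
    controller.close();
  },
});

countBodyBytes(body).then((n) => console.log(n)); // logs 1536
```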
Workers do not need to prepare an entire response body before returning a Response. You can use a ReadableStream to stream a response body after sending the response status line and headers.
The Worker can create a Response object using a ReadableStream as the body. Any data provided through the ReadableStream is streamed to the client as it becomes available.
Module Worker syntax:

```js
export default {
  async fetch(request, env, ctx) {
    // Fetch from origin server.
    const response = await fetch(request);
    // ... and deliver our Response while that's running.
    return new Response(response.body, response);
  },
};
```

Service Worker syntax:

```js
addEventListener("fetch", (event) => {
  event.respondWith(fetchAndStream(event.request));
});

async function fetchAndStream(request) {
  // Fetch from origin server.
  const response = await fetch(request);
  // ... and deliver our Response while that's running.
  return new Response(response.body, response);
}
```

A TransformStream and the ReadableStream.pipeTo() method can be used to modify the response body as it is being streamed:
Module Worker syntax:

```js
export default {
  async fetch(request, env, ctx) {
    // Fetch from origin server.
    const response = await fetch(request);

    const { readable, writable } = new TransformStream({
      transform(chunk, controller) {
        controller.enqueue(modifyChunkSomehow(chunk));
      },
    });

    // Start pumping the body. NOTE: No await!
    response.body.pipeTo(writable);

    // ... and deliver our Response while that's running.
    return new Response(readable, response);
  },
};
```

Service Worker syntax:

```js
addEventListener("fetch", (event) => {
  event.respondWith(fetchAndStream(event.request));
});

async function fetchAndStream(request) {
  // Fetch from origin server.
  const response = await fetch(request);

  const { readable, writable } = new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(modifyChunkSomehow(chunk));
    },
  });

  // Start pumping the body. NOTE: No await!
  response.body.pipeTo(writable);

  // ... and deliver our Response while that's running.
  return new Response(readable, response);
}
```

This example calls response.body.pipeTo(writable) but does not await it, so that it does not block the forward progress of the remainder of the fetchAndStream() function. The pipe continues to run asynchronously until the response is complete or the client disconnects.
The runtime can continue running a function (response.body.pipeTo(writable)) after a response is returned to the client. This example simply pumps the subrequest's response body into the final response body, but you can use more complicated logic, such as adding a prefix or suffix to the body, or processing it in some other way.
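As one way to do the prefix/suffix case, here is a sketch (the helper name `wrapBody` is ours, not from the docs) that writes a prefix, pipes the source body through, and appends a suffix, using pipeTo's `preventClose` option so the destination stays open for the final write:

```javascript
// Stream a body while adding a prefix and suffix, without buffering
// the middle of the body.
async function wrapBody(source, writable, prefix, suffix) {
  const encoder = new TextEncoder();
  let writer = writable.getWriter();
  await writer.write(encoder.encode(prefix));
  writer.releaseLock(); // pipeTo needs an unlocked writable

  // preventClose keeps the writable open after the source is exhausted.
  await source.pipeTo(writable, { preventClose: true });

  writer = writable.getWriter();
  await writer.write(encoder.encode(suffix));
  await writer.close();
}

// Demo: wrap a streamed body in brackets.
const { readable, writable } = new TransformStream();
const source = new ReadableStream({
  start(c) {
    c.enqueue(new TextEncoder().encode("body"));
    c.close();
  },
});
wrapBody(source, writable, "[", "]"); // NOTE: no await — pumping runs concurrently
new Response(readable).text().then((t) => console.log(t)); // logs "[body]"
```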
- Stream large JSON - Parse and transform large JSON request and response bodies
- MDN's Streams API documentation
- Streams API spec
- Write your Worker code in ES modules syntax for an optimized experience.