Mount buckets
Mount S3-compatible object storage buckets as local filesystem paths, then access objects with standard file operations.
Mount S3-compatible buckets when you need:
- **Persistent data**: Data survives sandbox destruction
- **Large datasets**: Process data without downloading
- **Shared storage**: Multiple sandboxes access the same data
- **Cost-effective persistence**: Cheaper than keeping sandboxes alive
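As a sketch of the shared-storage pattern, two sandboxes can mount the same bucket at the same path (the sandbox names, bucket name, and endpoint here are placeholders, and the calls assume the `mountBucket` API shown in this guide):

```ts
import { getSandbox } from "@cloudflare/sandbox";

// Two independent sandboxes mount the same bucket
const producer = getSandbox(env.Sandbox, "producer");
const consumer = getSandbox(env.Sandbox, "consumer");

const endpoint = "https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com";
await producer.mountBucket("shared-bucket", "/data", { endpoint });
await consumer.mountBucket("shared-bucket", "/data", { endpoint, readOnly: true });

// The producer writes; the consumer reads the same object
await producer.writeFile("/data/job.json", JSON.stringify({ status: "ready" }));
await consumer.exec("cat", { args: ["/data/job.json"] });
```

Mounting the consumer read-only keeps it from accidentally modifying data the producer owns.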
```ts
import { getSandbox } from "@cloudflare/sandbox";

const sandbox = getSandbox(env.Sandbox, "data-processor");

// Mount R2 bucket
await sandbox.mountBucket("my-r2-bucket", "/data", {
  endpoint: "https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com",
});

// Access bucket with standard filesystem operations
await sandbox.exec("ls", { args: ["/data"] });
await sandbox.writeFile("/data/results.json", JSON.stringify(results));

// Use from Python
await sandbox.exec("python", {
  args: [
    "-c",
    `import pandas as pd
df = pd.read_csv('/data/input.csv')
df.describe().to_csv('/data/summary.csv')`,
  ],
});
```

Set credentials as Worker secrets and the SDK automatically detects them:
```sh
npx wrangler secret put AWS_ACCESS_KEY_ID
npx wrangler secret put AWS_SECRET_ACCESS_KEY
```

```ts
// Credentials automatically detected from environment
await sandbox.mountBucket("my-r2-bucket", "/data", {
  endpoint: "https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com",
});
```

Pass credentials directly when needed:
```ts
await sandbox.mountBucket("my-r2-bucket", "/data", {
  endpoint: "https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com",
  credentials: {
    accessKeyId: env.R2_ACCESS_KEY_ID,
    secretAccessKey: env.R2_SECRET_ACCESS_KEY,
  },
});
```

Protect data by mounting buckets in read-only mode:
```ts
await sandbox.mountBucket("dataset-bucket", "/data", {
  endpoint: "https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com",
  readOnly: true,
});

// Reads work
await sandbox.exec("cat", { args: ["/data/dataset.csv"] });

// Writes fail
await sandbox.writeFile("/data/new-file.txt", "data"); // Error: Read-only filesystem
```

Unmount buckets when you finish processing:

```ts
// Mount for processing
await sandbox.mountBucket("my-bucket", "/data", { endpoint: "..." });

// Do work
await sandbox.exec("python process_data.py");

// Clean up
await sandbox.unmountBucket("/data");
```

The SDK supports any S3-compatible object storage. Here are examples for common providers:
```ts
// Amazon S3
await sandbox.mountBucket("my-s3-bucket", "/data", {
  endpoint: "https://s3.us-west-2.amazonaws.com", // Regional endpoint
  credentials: {
    accessKeyId: env.AWS_ACCESS_KEY_ID,
    secretAccessKey: env.AWS_SECRET_ACCESS_KEY,
  },
});
```

```ts
// Google Cloud Storage
await sandbox.mountBucket("my-gcs-bucket", "/data", {
  endpoint: "https://storage.googleapis.com",
  credentials: {
    accessKeyId: env.GCS_ACCESS_KEY_ID, // HMAC key
    secretAccessKey: env.GCS_SECRET_ACCESS_KEY,
  },
});
```

For providers like Backblaze B2, MinIO, Wasabi, or others, use the standard mount pattern:
```ts
await sandbox.mountBucket("my-bucket", "/data", {
  endpoint: "https://s3.us-west-000.backblazeb2.com", // Provider-specific endpoint
  credentials: {
    accessKeyId: env.ACCESS_KEY_ID,
    secretAccessKey: env.SECRET_ACCESS_KEY,
  },
});
```

For provider-specific configuration, see the s3fs-fuse wiki, which documents supported providers and their recommended flags.
Error: `MissingCredentialsError: No credentials found`
Solution: Set credentials as Worker secrets:
```sh
npx wrangler secret put AWS_ACCESS_KEY_ID
npx wrangler secret put AWS_SECRET_ACCESS_KEY
```

Error: `S3FSMountError: mount failed`
Common causes:
- Incorrect endpoint URL
- Invalid credentials
- Bucket doesn't exist
- Network connectivity issues
Verify your endpoint format and credentials:
```ts
try {
  await sandbox.mountBucket("my-bucket", "/data", {
    endpoint: "https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com",
  });
} catch (error) {
  console.error("Mount failed:", error.message);
  // Check endpoint format, credentials, bucket existence
}
```

Error: `InvalidMountConfigError: Mount path already in use`
Solution: Unmount first or use a different path:
```ts
// Unmount existing
await sandbox.unmountBucket("/data");

// Or use a different path
await sandbox.mountBucket("bucket2", "/storage", { endpoint: "..." });
```

File operations on mounted buckets are slower than the local filesystem due to network latency.
Solution: Copy frequently accessed files locally:
```ts
// Copy to local filesystem
await sandbox.exec("cp", {
  args: ["/data/large-dataset.csv", "/workspace/dataset.csv"],
});

// Work with local copy (faster)
await sandbox.exec("python", {
  args: ["process.py", "/workspace/dataset.csv"],
});

// Save results back to bucket
await sandbox.exec("cp", {
  args: ["/workspace/results.json", "/data/results/output.json"],
});
```

- **Mount early**: Mount buckets at sandbox initialization
- **Use R2 for Cloudflare**: Zero egress fees and optimized configuration
- **Secure credentials**: Always use Worker secrets, never hardcode
- **Read-only when possible**: Protect data with read-only mounts
- **Mount paths**: Use `/data`, `/storage`, or `/mnt/*` (avoid `/workspace`, `/tmp`)
- **Handle errors**: Wrap mount operations in try/catch blocks
- **Optimize access**: Copy frequently accessed files locally
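Several of these practices combine naturally into one mount call. This is a hedged sketch, not a prescribed pattern; the bucket name, mount path, and endpoint are placeholders:

```ts
// At sandbox initialization: read-only mount under /mnt,
// credentials from Worker secrets, wrapped in try/catch
try {
  await sandbox.mountBucket("dataset-bucket", "/mnt/dataset", {
    endpoint: "https://YOUR_ACCOUNT_ID.r2.cloudflarestorage.com",
    readOnly: true,
    credentials: {
      accessKeyId: env.R2_ACCESS_KEY_ID, // set via `wrangler secret put`
      secretAccessKey: env.R2_SECRET_ACCESS_KEY,
    },
  });
} catch (error) {
  console.error("Mount failed:", error.message);
}
```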
- Persistent storage tutorial - Complete R2 example
- Storage API reference - Full method documentation
- Environment variables - Credential configuration
- R2 documentation - Learn about Cloudflare R2