
Interact with a Workflow

The Python Workers platform leverages FFI to access bindings to Cloudflare resources. Refer to the bindings documentation for more information.

From a configuration perspective, enabling Python Workflows requires adding the python_workflows compatibility flag to your Wrangler configuration file.

{
  "name": "workflows-starter",
  "main": "src/index.py",
  "compatibility_date": "2024-10-22",
  "compatibility_flags": [
    "python_workflows",
    "python_workers"
  ],
  "workflows": [
    {
      "name": "workflows-starter",
      "binding": "MY_WORKFLOW",
      "class_name": "MyWorkflow"
    }
  ]
}

This is how you access the payload within your Workflow:

from workers import WorkflowEntrypoint

class DemoWorkflowClass(WorkflowEntrypoint):
    async def run(self, event, step):
        @step.do('step-name')
        async def first_step():
            # The params passed at instance creation are available on the event payload
            payload = event["payload"]
            return payload

        await first_step()

Workflow

The Workflow binding gives you access to the Workflow class. All its methods are available on the binding.

Under the hood, the Workflow binding is a JavaScript object that is exposed to the Python script via JsProxy. This means that the values returned by its methods are also JsProxy objects, and need to be converted back into Python objects using python_from_rpc.
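
For example, here is the round trip in both directions. This is a minimal sketch: it assumes Pyodide's JsProxy.to_py() for the JavaScript-to-Python direction, with python_from_rpc serving the same role where it is available.

from pyodide.ffi import to_js
from js import Object

# Python -> JavaScript: to_js builds the plain object shape the binding expects
js_options = to_js({"params": {"foo": "bar"}}, dict_converter=Object.fromEntries)

# JavaScript -> Python: values coming back across the boundary are JsProxy objects.
# JsProxy.to_py() (assumed here) turns them back into plain Python values.
py_value = js_options.to_py()  # {'params': {'foo': 'bar'}}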

create

Create (trigger) a new instance of a given Workflow.

  • create(options=None)
    • options - an optional dictionary of options to pass to the workflow instance. Should contain the same keys as the WorkflowInstanceCreateOptions type.

from pyodide.ffi import to_js
from js import Object
from workers import Response

async def on_fetch(request, env, ctx):
    event = {"foo": "bar"}
    options = to_js({"params": event}, dict_converter=Object.fromEntries)
    await env.MY_WORKFLOW.create(options)
    return Response.json({"status": "success"})

The create method returns a WorkflowInstance object, which can be used to query the status of the workflow instance. Note that this is a JavaScript object, not a Python object.
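
For example, you can hold on to the return value and query it directly through the proxy. A minimal sketch, assuming the id property and status() method of the JavaScript WorkflowInstance are accessed as attributes on the proxy:

instance = await env.MY_WORKFLOW.create(options)
# instance is a JsProxy wrapping the JavaScript WorkflowInstance
print(instance.id)                # the ID assigned to the new instance
status = await instance.status()  # also a JsProxy; convert with to_py() if needed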

create_batch

Create (trigger) a batch of new workflow instances, up to 100 instances at a time. This is useful if you need to create multiple instances at once within the instance creation limit.

  • create_batch(batch)
    • batch - list of WorkflowInstanceCreateOptions to pass when creating an instance, including a user-provided ID and payload parameters.

Each element of the batch list is expected to include both id and params properties:

from pyodide.ffi import to_js
from js import Object

# Create a batch of 3 Workflow instances, each with its own ID and params
list_of_instances = [
    to_js({"id": "id-abc123", "params": {"hello": "world-0"}}, dict_converter=Object.fromEntries),
    to_js({"id": "id-def456", "params": {"hello": "world-1"}}, dict_converter=Object.fromEntries),
    to_js({"id": "id-ghi789", "params": {"hello": "world-2"}}, dict_converter=Object.fromEntries),
]
await env.MY_WORKFLOW.create_batch(list_of_instances)
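
In the JavaScript API, createBatch resolves to the list of created WorkflowInstance objects. Assuming the Python binding surfaces the same return value through the proxy, you can iterate over it directly:

instances = await env.MY_WORKFLOW.create_batch(list_of_instances)
# The result is a proxied JavaScript array of WorkflowInstance objects
for instance in instances:
    print(instance.id)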

get

Get a workflow instance by ID.

  • get(id)
    • id - the ID of the workflow instance to get.

Returns a WorkflowInstance object, which can be used to query the status of the workflow instance.

instance = await env.MY_WORKFLOW.get("abc-123")
# FFI methods available for WorkflowInstance
await instance.status()
await instance.pause()
await instance.resume()
await instance.restart()
await instance.terminate()

send_event

Send an event to a workflow instance.

  • send_event(options)
    • options - a dictionary containing the following keys:
      • type - the type of event to send to the workflow instance.
      • payload - the payload to send to the workflow instance.

from pyodide.ffi import to_js
from js import Object

await env.MY_WORKFLOW.send_event(to_js({"type": "my-event-type", "payload": {"foo": "bar"}}, dict_converter=Object.fromEntries))

REST API (HTTP)

Refer to the Workflows REST API documentation.

Command line (CLI)

Refer to the CLI quick start to learn more about how to manage and trigger Workflows via the command-line.