Outcome File

Report structured evidence from a worker handler via $CUEAPI_OUTCOME_FILE.

What it is

On every execution, the worker daemon creates a per-run temp file and exposes its path as $CUEAPI_OUTCOME_FILE in the handler's environment. A handler that wants to report structured evidence — external_id, result_url, summary, etc. — writes a single JSON object to that path before exiting. The daemon reads the file after the subprocess returns, validates it against a fixed schema, resolves any conflict with the exit code, and forwards the evidence to POST /v1/executions/{id}/outcome as part of its automatic report.

If the handler doesn't want this — or doesn't know about the file — it just exits, and the daemon reports based on the exit code alone. No opt-in flag required.

Requires cueapi-worker 0.3.0 or later.
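The daemon-side lifecycle described above can be sketched in shell (a simplified illustration, not the daemon's actual implementation; a trivial inline handler stands in for a real one):

```shell
#!/bin/sh
# Simplified sketch of what the worker daemon does per execution.
outcome_file=$(mktemp)                 # 1. per-run temp file

# 2. run the handler with $CUEAPI_OUTCOME_FILE set in its environment
CUEAPI_OUTCOME_FILE="$outcome_file" sh -c '
  echo "{\"success\": true}" > "$CUEAPI_OUTCOME_FILE"
'
exit_code=$?

# 3. after the subprocess returns, read the file (if any) plus the exit code
if [ -s "$outcome_file" ]; then
  echo "file: $(cat "$outcome_file") exit: $exit_code"
else
  echo "no file, exit: $exit_code"
fi
rm -f "$outcome_file"
```

A handler that never touches `$CUEAPI_OUTCOME_FILE` simply falls into the `no file` branch, which is why no opt-in flag is needed.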

Why it exists

Webhook transport handlers can attach evidence directly in the body of POST /v1/executions/{id}/outcome. Worker transport handlers historically couldn't — the worker daemon was the one writing the outcome, and the handler had no channel to pass evidence through. That meant verification_mode = require_external_id (and friends) only worked on webhook transport.

$CUEAPI_OUTCOME_FILE closes that gap. Every verification mode is now supported on both transports.

Usage — minimal

```bash
#!/bin/bash
# Handler that did some work and wants to attach a result URL.
cat > "$CUEAPI_OUTCOME_FILE" <<EOF
{
  "success": true,
  "result_url": "https://github.com/org/repo/pull/42",
  "summary": "Opened PR #42 with the proposed refactor"
}
EOF
```

```python
# Same idea in Python.
import json
import os

with open(os.environ["CUEAPI_OUTCOME_FILE"], "w") as f:
    json.dump({
        "success": True,
        "external_id": "tw_1234567890",
        "result_url": "https://twitter.com/user/status/1234567890",
        "summary": "Posted the scheduled tweet",
    }, f)
```

No SDK and no network call from the handler: just write JSON to a path the worker already set up for you.

Schema

All fields are optional. Unknown keys are dropped with a warning log; type-mismatched fields are dropped; the file as a whole is not rejected for a single bad field.

| Field | Type | Max length | Notes |
|---|---|---|---|
| `success` | bool | — | Overrides exit-code success on exit 0 (see conflict rules). Ignored on crash. |
| `error` | string | 2000 | Failure message. |
| `result` | string | 2000 | Short success string. |
| `external_id` | string | 500 | ID from the downstream system (PR number, Stripe ID, message ID). |
| `result_url` | string | 2000 | Must start with `http://` or `https://`. |
| `result_ref` | string | 500 | Internal reference. |
| `result_type` | string | 100 | Free-form category, e.g. `"tweet"`, `"pr"`, `"stripe_payment"`. |
| `summary` | string | 500 | Short human summary. |
| `artifacts` | list | — | Structured artifact entries. |
| `metadata` | dict | — | User metadata merged into the report. |

File size cap: 10 KB. Larger files are rejected and the run falls back to exit-code-only.
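The size cap and per-field drop rules can be sketched as a small validator (a hypothetical illustration, not the daemon's actual code; field names and limits are taken from the table above, the function name and return shape are assumptions):

```python
import json

# Schema from the table above: field -> (expected type, max string length or None).
SCHEMA = {
    "success": (bool, None),
    "error": (str, 2000),
    "result": (str, 2000),
    "external_id": (str, 500),
    "result_url": (str, 2000),
    "result_ref": (str, 500),
    "result_type": (str, 100),
    "summary": (str, 500),
    "artifacts": (list, None),
    "metadata": (dict, None),
}

MAX_FILE_BYTES = 10 * 1024  # 10 KB cap; larger files fall back to exit-code-only


def parse_outcome(raw: bytes):
    """Return (evidence, dropped_fields); (None, [...]) means the file was rejected."""
    if len(raw) > MAX_FILE_BYTES:
        return None, ["<oversize>"]
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return None, ["<parse error>"]
    if not isinstance(data, dict):
        return None, ["<not an object>"]
    evidence, dropped = {}, []
    for key, value in data.items():
        spec = SCHEMA.get(key)
        if spec is None:
            dropped.append(key)  # unknown key: drop it, keep the rest
            continue
        expected, max_len = spec
        if not isinstance(value, expected):
            dropped.append(key)  # type mismatch: drop the field only
            continue
        if max_len is not None and len(value) > max_len:
            dropped.append(key)  # over the per-field length cap
            continue
        if key == "result_url" and not value.startswith(("http://", "https://")):
            dropped.append(key)  # result_url must be http(s)
            continue
        evidence[key] = value
    return evidence, dropped
```

Note that a single bad field only lands in `dropped_fields`; the rest of the file is still used as evidence.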

Conflict rules

The daemon resolves any mismatch between the subprocess exit code and the file contents using these rules:

| Exit code | File `success` | Reported success | Reason |
|---|---|---|---|
| 0 | absent | true | Exit 0 = success by default |
| 0 | true | true | Agreement |
| 0 | false | false | File wins: handler reported semantic failure |
| ≠ 0 | any | false | Exit code wins: the process crashed |
| timeout | any | false | Timeout wins: file contents not trusted |

Evidence fields (external_id, result_url, result_ref, result_type, summary, artifacts) are always merged into the report when present, regardless of which side wins success.
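The whole conflict table collapses into one small function (an illustrative sketch; the function name, the `timed_out` flag, and `file_success=None` for "file absent" are assumptions, not the daemon's API):

```python
def reported_success(exit_code: int, file_success, timed_out: bool = False) -> bool:
    """Resolve subprocess exit code vs. outcome-file `success` per the conflict table."""
    if timed_out:
        return False           # timeout wins; file contents are not trusted
    if exit_code != 0:
        return False           # non-zero exit wins; the process crashed
    if file_success is None:
        return True            # exit 0 with no file verdict = success by default
    return bool(file_success)  # on exit 0, the file's verdict wins
```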

Diagnostics

When the file is present-but-rejected (oversize, parse error, dropped fields), the daemon forwards a breadcrumb under a reserved metadata key so you can debug from the execution record:

```json
{
  "_cueapi_worker": {
    "outcome_file_parse_error": "json.decoder.JSONDecodeError: ...",
    "outcome_file_dropped_fields": ["result_url", "some_unknown_key"]
  }
}
```

Full example: verified GitHub PR cue

Cue:

```bash
cueapi create \
  --name "nightly-refactor" \
  --cron "0 2 * * *" \
  --transport worker \
  --verification-mode require_external_id \
  --payload '{"task": "refactor-pr"}'
```

Worker handler config:

```yaml
handlers:
  refactor-pr:
    cmd: "python3 refactor.py"
    timeout: 1800
    env:
      ANTHROPIC_API_KEY: "${ANTHROPIC_API_KEY}"
      GITHUB_TOKEN: "${GITHUB_TOKEN}"
```

Handler refactor.py:

```python
import json
import os
import subprocess

# ... agent does its thing, opens a PR ...
pr_number = "42"
pr_url = f"https://github.com/org/repo/pull/{pr_number}"

with open(os.environ["CUEAPI_OUTCOME_FILE"], "w") as f:
    json.dump({
        "success": True,
        "external_id": pr_number,
        "result_url": pr_url,
        "result_type": "github_pr",
        "summary": "Refactored auth module, 247 lines changed",
    }, f)
```

The cue's require_external_id policy is satisfied because external_id is present. Outcome state lands at verified_success.
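How the `require_external_id` policy gates the final state can be sketched as follows (hypothetical: the function, the `"failure"` result for a success that lacks `external_id`, and the state names other than `verified_success` are assumptions about the server's behavior, modeled only for the mode this doc names):

```python
def outcome_state(success: bool, evidence: dict, verification_mode: str = "") -> str:
    """Sketch: map a run's success flag + merged evidence to a final outcome state."""
    if not success:
        return "failure"
    if verification_mode == "require_external_id":
        if evidence.get("external_id"):
            return "verified_success"   # policy satisfied by the evidence
        return "failure"                # assumption: unverified success is not accepted
    return "success"
```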
