Tail Consumer Limitations in `wrangler dev`

I'm building a tail consumer worker for observability/tracing and hit some limitations in `wrangler dev` that make local development challenging.

**Issues with the `tail()` API in local dev:**
- `item.scriptName` is always null → service name shows "unknown-service"
- `item.wallTime` and `item.cpuTime` are always 0 → duration shows 0ms (Spectre attack protection)
- No way to identify which worker generated the trace

**`tailStream()` API:** I can see `tailStream()` is implemented in workerd (src/workerd/api/tail-worker-test.wd-test) and works with the `streaming_tail_worker` compatibility flag, but `wrangler dev` explicitly ignores it (found in `IGNORED_KEYS` in workers-sdk).

**Question:** Is there a timeline for `tailStream()` support in `wrangler dev`? The streaming API would solve the timing issues since `event.timestamp` would be populated even in local dev.

For now I'm accepting these limitations in local dev (hoping production will work), but it significantly impacts the local development experience for observability tooling.

**Environment:** wrangler 4.37.1, workerd compatibility date 2025-09-13
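For reference, this is roughly the shape of tail consumer I mean (a minimal sketch, not my actual tracing worker; the fields come from the `TraceItem` type), with comments marking what comes back empty under `wrangler dev`:

```ts
// Minimal tail consumer sketch (illustrative only).
export default {
  async tail(events: TraceItem[], env: unknown, ctx: ExecutionContext) {
    for (const item of events) {
      console.log({
        service: item.scriptName ?? "unknown-service", // null in local dev
        cpuTimeMs: item.cpuTime,                       // 0 in local dev
        wallTimeMs: item.wallTime,                     // 0 in local dev
        outcome: item.outcome,
      });
    }
  },
};
```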
1 Reply
wondenge (OP) · 2mo ago
**Found a Workaround!**

We solved the local dev limitations without waiting for `tailStream()` support!

**The Solution**

Emit trace metadata via structured logging before Cloudflare sanitizes the `TraceItem` fields. Key insight: `console.log()` happens before field sanitization, and the logs are preserved in `TraceItem.logs[]`.

**How It Works**

Source worker - wrap your fetch handler with middleware:
```ts
export default {
  fetch: async (request, env, ctx) => {
    const startTime = Date.now();
    const response = await yourHandler(request, env, ctx);
    const durationMs = Date.now() - startTime;

    // Emit metadata as structured log
    console.log({
      msg: "trace.metadata",
      serviceName: "my-worker",
      durationMs,
      method: request.method,
      path: new URL(request.url).pathname,
      status: response.status
    });

    return response;
  }
}
```
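If you don't want to repeat this in every worker, the same idea can be factored into a small wrapper. This is just a sketch: `withTraceMetadata` is a name I made up, and `yourHandler` is the same placeholder as above:

```ts
type FetchHandler<Env> = (request: Request, env: Env, ctx: ExecutionContext) => Promise<Response>;

// `yourHandler` is your existing fetch handler (same placeholder as the snippet above).
declare const yourHandler: FetchHandler<unknown>;

// Hypothetical middleware wrapper: times the wrapped handler and emits the
// same trace.metadata structured log before returning the response.
function withTraceMetadata<Env>(serviceName: string, handler: FetchHandler<Env>): FetchHandler<Env> {
  return async (request, env, ctx) => {
    const startTime = Date.now();
    const response = await handler(request, env, ctx);

    console.log({
      msg: "trace.metadata",
      serviceName,
      durationMs: Date.now() - startTime,
      method: request.method,
      path: new URL(request.url).pathname,
      status: response.status,
    });

    return response;
  };
}

export default {
  fetch: withTraceMetadata("my-worker", yourHandler),
};
```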
Tail consumer - extract from logs:
```ts
export async function tail(events: TraceItem[], env: Env) {
  for (const item of events) {
    // Find trace metadata in logs array
    for (const log of item.logs) {
      const parsed = Array.isArray(log.message) ? log.message[0] : log.message;

      if (parsed?.msg === "trace.metadata") {
        // parsed.serviceName → "my-worker" ✅
        // parsed.durationMs → actual timing ✅
        break;
      }
    }
  }
}
```
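One wiring detail the snippet glosses over: with module syntax the runtime looks for the `tail()` handler on the default export, so the named function above still needs to be attached there. A sketch of how that might look (the `./tail` path is just how a project could be laid out, not part of the post):

```ts
// Entry point of the tail consumer worker.
import { tail } from "./tail"; // the handler shown above (path is project-specific)

export default {
  // Registered as the Worker's tail() event handler.
  tail,
};
```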
**Results**

Before:
```
Service: "unknown-service"
Duration: 0ms
```

After:
```
Service: "my-worker"
Duration: 2ms
```
Works in both local dev AND production!

**Note:** Make sure to log structured objects (not pre-serialized JSON strings) so the fields survive `console.log()` intact. Also, `wrangler dev` wraps `console.log` objects in arrays, so check `Array.isArray(log.message)` (see the helper sketch below).

Sharing in case anyone else hits the same issue!
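To keep that array-vs-object handling in one place, here's a minimal helper sketch (`extractTraceMetadata` and the `TraceMetadata` interface are my own names; the payload shape just mirrors what the source worker logs above):

```ts
// Hypothetical helper (not part of any Cloudflare API): pulls the trace.metadata
// payload emitted by the source worker out of a TraceItem log entry, or returns null.
interface TraceMetadata {
  msg: "trace.metadata";
  serviceName: string;
  durationMs: number;
  method?: string;
  path?: string;
  status?: number;
}

function extractTraceMetadata(log: TraceLog): TraceMetadata | null {
  // wrangler dev wraps console.log arguments in an array; handle both shapes.
  const payload = Array.isArray(log.message) ? log.message[0] : log.message;
  return payload && typeof payload === "object" && payload.msg === "trace.metadata"
    ? (payload as TraceMetadata)
    : null;
}
```

The inner loop in the tail handler then reduces to `const parsed = extractTraceMetadata(log); if (parsed) { ... }`.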
