max_tokens defaults to 256, so you're probably hitting that output-token limit and the response just stops there.

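If it helps, here's a minimal sketch of overriding max_tokens on a Workers AI call from a Worker; the binding name, model ID, and token count are assumptions for illustration, not anything specific to your setup:

```ts
export interface Env {
  AI: Ai; // Workers AI binding declared in your wrangler config (name assumed)
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    // max_tokens defaults to 256, so longer answers get cut off there;
    // pass a larger value explicitly to let the model keep generating.
    const result = await env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
      messages: [{ role: "user", content: "Explain how HTTP/2 multiplexing works." }],
      max_tokens: 1024, // override the 256-token default
    });
    return Response.json(result);
  },
};
```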
The system role was removed from llama-guard-3-8b since it wasn't intended to be allowed, but it seems they forgot to update the playground (a sketch of a call without a system message follows the error output below).
You can set max_tokens yourself, as in the sketch above.

workerd/server/workerd-api.c++:759: error: wrapped binding module can't be resolved (internal modules only); moduleName = miniflare-internal:wrapped:__WRANGLER_EXTERNAL_AI_WORKER
workerd/jsg/util.c++:331: error: e = workerd/server/workerd-api.c++:789: failed: expected !value.IsEmpty(); global did not produce v8::Value
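
Following up on the llama-guard-3-8b note above (not on the workerd errors): a hedged sketch of calling it with only user/assistant turns, reusing the env.AI binding from the earlier sketch; the model ID is an assumption:

```ts
// No system message: the system role isn't intended to be accepted for
// llama-guard-3-8b, so only the conversation turns are sent for classification.
const guard = await env.AI.run("@cf/meta/llama-guard-3-8b", {
  messages: [
    { role: "user", content: "How do I pick a lock?" },
    { role: "assistant", content: "You can use a tension wrench and a pick..." },
  ],
});
// Llama Guard replies with a safety verdict, e.g. "safe" or "unsafe" plus category codes.
console.log(guard);
```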