qq; is there a limit on the response length from the LLM?
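Not answered in-thread, but on Workers AI text-generation models the usual way to cap response length is the max_tokens request parameter. A sketch of the request body (the prompt and the value 256 are illustrative placeholders, not from this thread):

```python
# Sketch of capping response length on a Workers AI text-generation
# call. "max_tokens" limits the number of generated tokens.
payload = {
    "messages": [{"role": "user", "content": "what is the best dog?"}],
    "max_tokens": 256,  # hard cap on generated tokens
}

# This dict would be passed as the model input, e.g. via the AI
# binding in a Worker or the Workers AI REST API.
```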
json_format response with Workers AI? (like what OpenAI supports) e.g. a schema of { question: "string", answer: "string" } returning { question: "what is the best dog", answer: "all of them" } (made this up)
p. What is that?
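If the model or binding doesn't support a JSON response format directly, one fallback is to ask for JSON in the prompt and validate the shape yourself. A minimal sketch, where the raw string stands in for actual model output:

```python
import json

def parse_qa(raw: str) -> dict:
    """Parse model output and require a {question, answer} object of strings."""
    data = json.loads(raw)
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object")
    for key in ("question", "answer"):
        if not isinstance(data.get(key), str):
            raise ValueError(f"missing or non-string field: {key!r}")
    return data

# Stand-in for an actual model response
raw = '{"question": "what is the best dog", "answer": "all of them"}'
qa = parse_qa(raw)
```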

curl -I https://notexcluded.domain.com
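The curl -I check should show the headers the Worker below injects for non-excluded hosts. The routing decision can be sketched as a plain function (the domain names are the placeholders used elsewhere in the thread):

```python
# Mirror of the Worker's exclusion logic as a plain function,
# so the header set can be checked without deploying anything.
EXCLUDED_DOMAINS = [
    "example1.domain.com",
    "example2.domain.com",
    "example3.domain.com",
]

SECURITY_HEADERS = {
    "x-frame-options": "SAMEORIGIN",
    "x-content-type-options": "nosniff",
    "referrer-policy": "strict-origin-when-cross-origin",
}

def headers_for(hostname: str) -> dict:
    # Excluded hostnames pass through with no extra headers
    if any(hostname.endswith(d) for d in EXCLUDED_DOMAINS):
        return {}
    return dict(SECURITY_HEADERS)
```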




max_tokens on runWithTools?

name = "security-headers-prod-us"
main = "index.py"
compatibility_date = "2024-03-20"
account_id = "your-account-id"
route = "notexcluded.domain.com/*"
zone_id = "domain.com"

from js import Response, fetch
async def on_fetch(request):
    # List of excluded domains
    excluded_domains = [
        "example1.domain.com",
        "example2.domain.com",
        "...",
        "example3.domain.com"
    ]

    # Get the host from the request URL
    url = request.url
    hostname = url.split("/")[2]

    # Excluded hostnames are proxied through untouched
    if any(hostname.endswith(domain) for domain in excluded_domains):
        return await fetch(request)

    # Everything else gets the security headers added
    response = await fetch(request)
    new_response = Response.new(response.body, response)
    new_response.headers.append(
        "x-workers-hello",
        "Hello from Cloudflare Workers"
    )
    new_response.headers.set("x-frame-options", "SAMEORIGIN")
    new_response.headers.set("x-content-type-options", "nosniff")
    new_response.headers.set("referrer-policy", "strict-origin-when-cross-origin")
    new_response.headers.set("content-security-policy", "object-src 'none' blob:; base-uri 'self'; report-uri https://cspappdirect.report-uri.com/r/d/csp/enforce; worker-src 'self' blob:")
    return new_response

curl -I https://notexcluded.domain.com

FILE_PARSE_ERROR: 'file' should be of valid safetensors type [code: 1000], quiting...

npx wrangler ai finetune create @cf/mistral/mistral-7b-instruct-v0.2-lora warp-finetune-mistral folder_location

--- 2024-07-03T18:20:15.757Z debug
🪵 Writing logs to "/Users/agdfdf/.wrangler/logs/wrangler-2024-07-03_18-20-15_678.log"
---
--- 2024-07-03T18:20:15.757Z debug
Failed to load .env file ".env": Error: ENOENT: no such file or directory, open '.env'
at Object.openSync (node:fs:582:18)
at Object.readFileSync (node:fs:461:35)
at tryLoadDotEnv (/Users/ad/node_modules/wrangler/wrangler-dist/cli.js:159106:72)
at loadDotEnv (/Users/ad/node_modules/wrangler/wrangler-dist/cli.js:159115:12)
at Array.reduce (<anonymous>) {
errno: -2,
code: 'ENOENT',
syscall: 'open',
path: '.env'
}
---
--- 2024-07-03T18:20:15.764Z debug
Retrieving cached values for account from ../../../node_modules/.cache/wrangler
---
--- 2024-07-03T18:20:15.764Z log
🌀 Creating new finetune "warp-mistral-new" for model "@cf/mistral/mistral-7b-instruct-v0.2-lora"...
---
--- 2024-07-03T18:20:15.765Z debug
-- START CF API REQUEST: POST https://api.cloudflare.com/client/v4/accounts/<account_id_masqueraded>/ai/finetunes
---
--- 2024-07-03T18:20:15.765Z debug
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-07-03T18:20:15.765Z debug
INIT: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-07-03T18:20:15.765Z debug
-- END CF API REQUEST
---
--- 2024-07-03T18:20:17.189Z debug
-- START CF API RESPONSE: Bad Request 400
---
--- 2024-07-03T18:20:17.190Z debug
HEADERS: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-07-03T18:20:17.190Z debug
RESPONSE: omitted; set WRANGLER_LOG_SANITIZE=false to include sanitized data
---
--- 2024-07-03T18:20:17.190Z debug
-- END CF API RESPONSE
---
--- 2024-07-03T18:20:17.229Z error
✘ [ERROR] 🚨 Finetune couldn't be created: A request to the Cloudflare API (/accounts/<account_id_masqueraded>/ai/finetunes) failed. FINETUNE_EXISTS_ERROR: finetune with the name 'warp-mistral-new' already exists [code: 1000]
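On the FILE_PARSE_ERROR above: a .safetensors file begins with an 8-byte little-endian header length followed by that many bytes of JSON, so a quick local sanity check is possible before uploading. A sketch, not the validator the API actually runs:

```python
import json
import struct

def looks_like_safetensors(data: bytes) -> bool:
    # safetensors layout: u64 little-endian header length,
    # then that many bytes of JSON, then raw tensor data.
    if len(data) < 8:
        return False
    (header_len,) = struct.unpack("<Q", data[:8])
    if 8 + header_len > len(data):
        return False
    try:
        header = json.loads(data[8:8 + header_len].decode("utf-8"))
    except (UnicodeDecodeError, json.JSONDecodeError):
        return False
    return isinstance(header, dict)

# Minimal in-memory example: an empty-metadata header, no tensors
header = json.dumps({"__metadata__": {}}).encode("utf-8")
blob = struct.pack("<Q", len(header)) + header
```

The FINETUNE_EXISTS_ERROR at the end is a separate issue: the name 'warp-mistral-new' is already taken on the account, so rerunning the create command with an unused finetune name avoids it.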