Is there any place to watch out for when beta models go out of beta
Models marked Beta on this page are free until they come out of beta: https://developers.cloudflare.com/workers-ai/models/#text-to-image

@hf/thebloke/llamaguard-7b-awq there is a bit outdated: it still uses Llama 2 and it's not very smart. What about upgrading it to https://huggingface.co/meta-llama/Llama-Guard-3-8B, which uses Llama 3.1? And while at it, perhaps also add https://huggingface.co/meta-llama/Prompt-Guard-86M.
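For context on what an upgrade would have to preserve: the Llama Guard family replies with a short verdict string, `safe`, or `unsafe` followed by the violated category codes (e.g. `S1` in Llama Guard 3). A hedged sketch of parsing that format — `parseGuardVerdict` is a hypothetical helper, and the exact output should be double-checked against the model card:

```javascript
// Interpret a Llama Guard style verdict string.
// Assumes the model replies "safe", or "unsafe" followed by one or more
// category codes (e.g. "unsafe\nS1,S3") -- the format used by the
// Llama Guard family. The helper name is hypothetical.
function parseGuardVerdict(text) {
  const lines = text.trim().split("\n").map((l) => l.trim());
  if (lines[0] === "safe") {
    return { safe: true, categories: [] };
  }
  if (lines[0] === "unsafe") {
    // Remaining lines list the violated categories, comma-separated.
    const categories = lines
      .slice(1)
      .join(",")
      .split(",")
      .map((c) => c.trim())
      .filter(Boolean);
    return { safe: false, categories };
  }
  // Unexpected output: fail closed and keep the raw text for inspection.
  return { safe: false, categories: [], unparsed: text };
}
```

Swapping the backing model (Llama 2 → Llama 3.1) should be transparent to callers as long as this verdict format stays the same.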
Two problems:

- `TypeError: The ReadableStream has been locked to a reader.` prevents `updateRecentImages` from working.
- Setting `top_p` or `repetition_penalty` value to 0 causes a Cloudflare API error: after the `data: [DONE]` event I get a 500 "Cloudflare API error". `top_p` & `repetition_penalty` are 0, so ideally we shouldn't get any error.

A basic `ai.run()` call looks like:

```javascript
const response = await env.AI.run('@cf/meta/llama-2-7b-chat-int8', {
  prompt: "tell me a joke about cloudflare",
});
```

The code that hits the `ReadableStream` error:

```javascript
const response = await ai.run(model || "@cf/stabilityai/stable-diffusion-xl-base-1.0", requestInput);

// Store the image name in R2, in background
ctx.waitUntil(updateRecentImages(imageName, (await generateThumbs(imageName, '400x', true)), input, env, response));

return new Response(response, {
  headers: {
    "content-type": "image/png",
  },
});
```

The API responds with:

```json
{
  "errors": [
    {
      "message": "Server Error",
      "code": 6001
    }
  ],
  "success": false,
  "result": {},
  "messages": []
}
```
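The locked-reader `TypeError` is what you get from consuming the same stream twice: once inside `ctx.waitUntil(updateRecentImages(...))` and once in `new Response(response, ...)`. One standard way out is `ReadableStream.tee()`, which splits a stream into two branches that can be read independently. A minimal, self-contained demo of the mechanism (the worker-side wiring is only described in the comments; `updateRecentImages` and friends are names from the snippet above):

```javascript
// Demo of ReadableStream.tee(): each branch delivers the same chunks and
// can be consumed independently, so neither read raises the
// "locked to a reader" error. In the worker, one branch would go to
// updateRecentImages via ctx.waitUntil() and the other to new Response().
async function readAll(stream) {
  const reader = stream.getReader();
  const chunks = [];
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return chunks;
}

const source = new ReadableStream({
  start(controller) {
    controller.enqueue("png-bytes");
    controller.close();
  },
});

// tee() gives two independent branches of the same stream.
const [forBackground, forClient] = source.tee();
```

In the handler that would look like `const [a, b] = response.tee();`, handing `a` to the background task and `b` to `new Response(b, ...)` — assuming `response` really is a `ReadableStream`, which the `TypeError` suggests.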
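If the 500 / code 6001 "Server Error" really is triggered by zero-valued sampling parameters, a client-side workaround is to drop those fields before calling `ai.run()`. A sketch — `sanitizeSamplingParams` is a hypothetical helper, and the ranges used (`top_p` in (0, 1], `repetition_penalty` > 0) are assumptions, not documented limits:

```javascript
// Hypothetical helper: strip sampling parameters whose values would be
// rejected, assuming top_p must lie in (0, 1] and repetition_penalty must
// be positive. Adjust the ranges to whatever the API actually accepts.
function sanitizeSamplingParams(input) {
  const out = { ...input };
  if (!(out.top_p > 0 && out.top_p <= 1)) delete out.top_p;
  if (!(out.repetition_penalty > 0)) delete out.repetition_penalty;
  return out;
}
```

Usage would be `await env.AI.run(model, sanitizeSamplingParams(requestInput))`, so a 0 coming from the client silently falls back to the model's defaults instead of a 500.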