uh oh that means we get less than 10k a day for months with more than 30 days? xD

Uploaded adapter_model.safetensors to a created finetune and got an error, see thread


I'm using llama-3-8b-instruct with the REST API. It works somewhat, but it sometimes feels like it's trying to chat, remembering context from previous HTTP calls. Has anyone run into similar issues? I've been refining the prompts using the system, user, and assistant roles.

I can't get a gemma-7b-it finetune to work at all ('file' should be of valid safetensors type), and while I can successfully upload/run inference for mistral/mistral-7b-instruct-v0.2-lora, the results are worse than the vanilla model. I trained q_proj and v_proj on mistralai/mistral-7b-instruct-v0.2 using the AutoTrain_LLM notebook, then cURL to https://api.cloudflare.com/client/v4/accounts/${account_id}/ai/run/@cf/mistral/mistral-7b-instruct-v0.2-lora

🚧 Creating index: 'ers-v1'
✘ [ERROR] A request to the Cloudflare API (/accounts/53e0c2270158721ff328e572f56950ea/vectorize/indexes) failed.
vectorize.not_entitled [code: 1005]
If you think this is a bug, please open an issue at:
https://github.com/cloudflare/workers-sdk/issues/new/choose

const systemContent = `You are a knowledgeable employee familiar with the company ${companyName}, responding to customer inquiries. Follow these guidelines:
- Answer in the same language as the question.
- Do not reveal your identity.
- If you don't know the answer, admit it without making anything up.
- Maintain a neutral tone.
- Do not provide opinions or personal views.
- Avoid asking for feedback.
- Keep the conversation strictly to the point; do not engage in small talk or recommendations.
- Do not apologize.
- Do not initiate or continue small talk.
- Do not use phrases like "I'm sorry" or "I apologize."`;
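// Below, `model` (e.g. '@cf/meta/llama-3-8b-instruct'), `question`, and `context` are assumed
// to be defined elsewhere in the worker; `context` would typically be text retrieved from a
// vector index. The /ai/run endpoint is stateless, so anything the model should "know" about
// the conversation has to be resent in the `messages` array on every call.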
await got.post(`https://api.cloudflare.com/client/v4/accounts/${Env.CLOUDFLARE_ACCOUNT_ID}/ai/run/${model}`, {
  headers: { Authorization: `Bearer ${Env.CLOUDFLARE_WORKERS_AI_KEY}` },
  json: {
    max_tokens: 350,
    messages: [
      { role: 'system', content: systemContent },
      { role: 'user', content: `Question:${question}` },
      { role: 'assistant', content: context }
    ],
    temperature: 0.5
  }
});
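
Since the REST API keeps no state between HTTP calls, the model only "remembers" whatever is resent in `messages`. A minimal sketch of one way to carry a conversation forward, reusing `Env`, `model`, and `systemContent` from the snippet above; the `ask` helper and `history` array are hypothetical, and the response shape is assumed to be the standard Cloudflare envelope with the generated text at `result.response`:

import got from 'got';

// Hypothetical helper: each call replays the prior turns, because /ai/run keeps no
// context between requests.
async function ask(history, question) {
  const body = await got
    .post(`https://api.cloudflare.com/client/v4/accounts/${Env.CLOUDFLARE_ACCOUNT_ID}/ai/run/${model}`, {
      headers: { Authorization: `Bearer ${Env.CLOUDFLARE_WORKERS_AI_KEY}` },
      json: {
        max_tokens: 350,
        temperature: 0.5,
        // System prompt + full conversation so far + the new question.
        messages: [
          { role: 'system', content: systemContent },
          ...history,
          { role: 'user', content: question }
        ]
      }
    })
    .json(); // assumed envelope: { result: { response }, success, errors, messages }

  const answer = body.result.response;
  // Store both turns locally so the next request can replay them.
  history.push({ role: 'user', content: question }, { role: 'assistant', content: answer });
  return answer;
}

Keeping the history in the caller (or somewhere like KV or a Durable Object keyed by session) is what gives the model the appearance of memory; without it, each HTTP call starts from a blank slate.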