Thanks! Are you even a Cloudflare developer?
(Sorry for accusing you or anything.)

source_lang acts more as a suggestion than a rule. Not sure if this is a Cloudflare or a Whisper thing, but figured I'd ask. Relevant snippet:

```js
const inputs = {
  audio: [...new Uint8Array(arrayBuffer)],
  source_lang: 'tr',
  target_lang: 'tr'
};
const transcriptionResponse = await env.AI.run('@cf/openai/whisper', inputs);
```

```js
const response = await config.ai.run('@cf/black-forest-labs/flux-1-schnell',
  {
    prompt,
    steps: DEFAULT_STEPS
  },
  {
    gateway: { id: 'misu-ai' },
    extraHeaders: {
      Authorization: `Bearer ${config.aiGateway.apiToken}`,
      'cf-aig-authorization': `Bearer ${config.aiGateway.apiToken}`,
      'cf-aig-token': `Bearer ${config.aiGateway.apiToken}`
    }
  }
);
```

```js
const embedding = await env.AI.run('@cf/baai/bge-base-en-v1.5', {
  text: query
});
const values = Array.from(embedding.data[0]);
const results = await vectorIndex.query({
  topK: 5,
  values: values
});
```
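For context on how the `audio` array passed to Whisper above can be built when the input arrives as a base64 string, here is a minimal sketch of the `atob()` / `Uint8Array.from()` approach mentioned in the thread. The helper name and sample input are mine, not from the thread:

```js
// Illustrative helper (name is mine): convert a base64 string into the
// plain number array that the Whisper snippet spreads into `inputs.audio`.
function toAudioArray(base64Audio) {
  // atob() decodes base64 into a binary string (one char per byte)
  const binaryString = atob(base64Audio);
  // map each char code to a byte
  const bytes = Uint8Array.from(binaryString, (m) => m.codePointAt(0));
  // spread into a JSON-friendly plain array
  return [...bytes];
}

// 'aGVsbG8=' is base64 for "hello"
const sampleAudio = toAudioArray('aGVsbG8='); // → [104, 101, 108, 108, 111]
```

In a Node environment, `Buffer.from(base64Audio, 'base64')` achieves the same decoding in one step.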
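As an aside on the last snippet: the `topK: 5` query ranks stored vectors by similarity to the query embedding and returns the closest matches. A toy sketch of that ranking using cosine similarity, purely to illustrate the concept (the index does this server-side, and the actual metric depends on how the index was configured):

```js
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank stored vectors against a query embedding, keep the top k.
function topK(query, vectors, k) {
  return vectors
    .map((v, id) => ({ id, score: cosine(query, v.values) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}

// [2, 0] points the same direction as the query, so it wins.
const ranked = topK([1, 0], [{ values: [0, 1] }, { values: [2, 0] }], 1);
```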