Hey @Himanshu! Function calling should work with the gpt-oss models. Here's an example of that with our sandbox SDK: https://github.com/cloudflare/sandbox-sdk/blob/main/examples/code-interpreter/src/index.ts#L73-L91
We've also tested against the compatibility suite here: https://github.com/openai/gpt-oss/tree/main/compatibility-test and are seeing a ~93% success rate. Do you have an example of what you were trying that didn't work?
We're also actively working on integrating support for the built-in `code_interpreter` tool to make some of this even easier.
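For reference, a rough sketch of what the initial request payload can look like, using a Responses-style `input` plus a `tools` array. The model ID, tool name, and exact field names here are assumptions, not taken from the linked example:

```ts
// Rough sketch of an initial function-calling request payload for a
// gpt-oss model. The model ID and tool definition are illustrative.
const request = {
  model: "@cf/openai/gpt-oss-120b", // assumed model ID
  input: [{ role: "user", content: "What's the weather in Tokyo?" }],
  tools: [
    {
      type: "function",
      name: "get_weather", // hypothetical tool
      description: "Get the current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  ],
};
```

The model should answer with a `function_call` item that your code executes before sending a follow-up request.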
This is interesting, because I'm getting the following when running that example:
`AiError: AiError: No call message found for call_X`
Could you show me what your request input looks like for the final call (with the `function_call` and the `function_call_input`)?
This is my request:
The response:
You need to add the assistant's tool call message, and it should have an ID.
If it does, take those IDs and put them in your call output messages too.
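Something like this (a rough sketch of the shape; the tool name, arguments, and IDs are made up):

```ts
// The assistant's tool call message, passed back as the model returned it.
const functionCall = {
  type: "function_call",
  id: "fc_abc123",        // illustrative ID
  call_id: "call_abc123", // illustrative ID
  name: "get_weather",
  arguments: '{"city":"Tokyo"}',
};

// The matching tool result, echoing the same call_id.
const functionCallOutput = {
  type: "function_call_output",
  call_id: "call_abc123", // must match the call above
  output: '{"temperature_c": 21}',
};
```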
@Proman4713 🇵🇸 thanks for your reply. Sadly, it's still not working for me. Do you have an example of one of your requests you could share?
Here's an example set of messages that should work:
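Roughly this shape (a reconstruction; the tool name, arguments, and IDs are illustrative):

```ts
// Rough reconstruction of a working `input` array for the follow-up request.
const input = [
  { role: "user", content: "What's the weather in Tokyo?" },
  {
    // The assistant's tool call, passed back exactly as the model returned it.
    type: "function_call",
    id: "fc_abc123",
    call_id: "call_abc123",
    name: "get_weather",
    arguments: '{"city":"Tokyo"}',
    status: null, // keep this field: vllm uses it to infer the message type
  },
  {
    // Your tool's result, echoing the same call_id.
    type: "function_call_output",
    call_id: "call_abc123",
    output: '{"temperature_c": 21}',
  },
];
```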
Note that you should basically pass in the tool call message exactly as you got it from the LLM, e.g. in the example above the `"status": null` parameter is actually important, since the underlying server (vllm) needs it to figure out the message type.
So I was missing the status lol
Thank you so much @samjs it worked 🙂
glad to hear it!
Is there a way we could make it more obvious to others? Not sure if this was an only-me problem
Yeah I was just going to look at the vllm repository to see if they've patched this. I remember we spent a while tracking down a similar issue in our internal tests too
Definitely not just you