Hey @Himanshu! Function calling should work with the gpt-oss models. Here's an example of that with our sandbox SDK: https://github.com/cloudflare/sandbox-sdk/blob/main/examples/code-interpreter/src/index.ts#L73-L91 We've also tested against the compatibility suite here: https://github.com/openai/gpt-oss/tree/main/compatibility-test and are seeing ~93% success. Do you have an example of what you were trying that didn't work? We're also actively working on integrating support for the built-in code_interpreter tool to make some of this even easier.
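For reference, here's a minimal sketch of what a function-calling request can look like from a Worker. The model name @cf/openai/gpt-oss-120b and the AI binding shape are assumptions here; adapt them to your setup.

export default {
  async fetch(request: Request, env: { AI: Ai }): Promise<Response> {
    // Ask the model a question and describe the tool it is allowed to call.
    const result = await env.AI.run("@cf/openai/gpt-oss-120b", {
      input: [{ role: "user", content: "What is the weather in London?" }],
      tools: [
        {
          type: "function",
          name: "get_weather",
          description: "Provides the current weather for a location.",
          parameters: {
            type: "object",
            properties: {
              location: {
                type: "string",
                description: "The location to get the weather for.",
              },
            },
            required: ["location"],
          },
        },
      ],
    });
    // If the model decides to use the tool, the response will contain a
    // function_call item with the arguments it wants get_weather run with.
    return Response.json(result);
  },
};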
Ana · 4w ago
This is interesting, because I'm getting the following when running this example: AiError: AiError: No call message found for call_X. Could you show me what your request input looks like for the final call (with the function_call and the function_call_output)? This is my request:
{
  "input": [
    {
      "role": "system",
      "content": "You are a traveling assistant"
    },
    {
      "role": "user",
      "content": "What are my upcoming flights?"
    },
    {
      "type": "function_call",
      "call_id": "call_123",
      "name": "list_events",
      "arguments": "{\"type\":\"flight\"}"
    },
    {
      "type": "function_call_output",
      "call_id": "call_123",
      "output": "[]"
    }
  ]
}
The response:
{
  "errors": [
    {
      "message": "AiError: AiError: No call message found for call_123 None (<uuid>)",
      "code": 3030
    }
  ],
  "success": false,
  "result": {},
  "messages": []
}
Proman4713 🇵🇸
You need to include the assistant's tool call message, and it should have an ID. Then use those same IDs in your call output messages.
Ana · 3w ago
@Proman4713 🇵🇸 thanks for your reply. Sadly, it's still not working for me. Do you have an example of one of your requests you could share?
samjs (OP) · 3w ago
Here's an example set of messages that should work:
{
  "input": [
    {
      "role": "user",
      "content": "What is the weather in London?"
    },
    {
      "arguments": "{\"location\":\"London\"}",
      "call_id": "call_290f406519144a6bac4e44f3cbaaa2e8",
      "name": "get_weather",
      "type": "function_call",
      "id": "ft_290f406519144a6bac4e44f3cbaaa2e8",
      "status": null
    },
    {
      "type": "function_call_output",
      "call_id": "call_290f406519144a6bac4e44f3cbaaa2e8",
      "output": "{\"weather\":\"sunny\"}"
    }
  ],
  "tools": [
    {
      "type": "function",
      "name": "get_weather",
      "description": "Provides information about the weather in a specific location.",
      "parameters": {
        "type": "object",
        "properties": {
          "location": {
            "type": "string",
            "description": "The location to get the weather for."
          }
        },
        "required": [
          "location"
        ]
      }
    }
  ]
}
Note that you should pass in the tool call message exactly as you got it from the LLM. E.g. in the above example, the "status": null parameter is actually important, since the underlying server (vLLM) needs it to figure out the message type.
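In code, the safest pattern is to push the function_call item from the model's output back into the next request untouched. A rough TypeScript sketch, assuming the response exposes a Responses-API-style output array (the model name and the tool result are placeholders):

// Start with the user's question; `tools` is the same array as in the request above.
const input: any[] = [{ role: "user", content: "What is the weather in London?" }];

const first: any = await env.AI.run("@cf/openai/gpt-oss-120b", { input, tools });

// Find the tool call the model emitted and echo it back verbatim,
// including fields like "id" and "status": null, then append the tool's result.
const call = first.output?.find((item: any) => item.type === "function_call");
if (call) {
  input.push(call); // do not strip or rename any fields
  input.push({
    type: "function_call_output",
    call_id: call.call_id,
    output: JSON.stringify({ weather: "sunny" }), // your real tool result here
  });
  const second = await env.AI.run("@cf/openai/gpt-oss-120b", { input, tools });
}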
Ana · 3w ago
So I was missing the status, lol. Thank you so much @samjs, it worked 🙂
samjs (OP) · 3w ago
Glad to hear it!
Ana · 3w ago
Is there a way we could make this more obvious to others? Not sure if this was an only-me problem.
samjs (OP) · 3w ago
Yeah, I was just going to look at the vLLM repository to see if they've patched this. I remember we spent a while tracking down a similar issue in our internal tests too. Definitely not just you.
