I am trying to better understand Tools within the context of LLMs.
I know what they do; that's not hard. But I'm having trouble finding proper documentation for them in Ollama.
There doesn't seem to be a formal workflow documented. As far as I can tell, after you get a response from your model, you execute all the tools the model thinks it needs, and then you resubmit the results to the model for processing.
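For context, here is a minimal sketch of the first leg of that loop, assuming a local Ollama server on the default port and Ollama's JSON-schema style tool declarations. The model name and the `get_current_datetime` tool are illustrative placeholders, not from any docs:

```typescript
// Message shape used by Ollama's /api/chat endpoint (sketch).
interface ChatMessage {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
  tool_calls?: { function: { name: string; arguments: Record<string, unknown> } }[];
}

// Tools are declared as JSON-schema function definitions.
const tools = [
  {
    type: "function",
    function: {
      name: "get_current_datetime", // hypothetical tool for this example
      description: "Returns the current date and time as an ISO 8601 string",
      parameters: { type: "object", properties: {}, required: [] },
    },
  },
];

// Build the /api/chat request body. Older Ollama versions required
// stream: false for the tool-calling round trip.
function buildChatRequest(messages: ChatMessage[]) {
  return { model: "llama3.1", messages, tools, stream: false };
}

// const res = await fetch("http://localhost:11434/api/chat", {
//   method: "POST",
//   body: JSON.stringify(buildChatRequest(messages)),
// });
// const { message } = await res.json();
// if (message.tool_calls?.length) { /* run each tool, then resubmit */ }
```

If `message.tool_calls` comes back non-empty, that is the signal to execute the tools and make the follow-on request.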
The problem I run into is: is there a format for the tool content?
When I push the data back up to the server, it doesn't seem to understand it. For example, I add the ISO-formatted string to `content` and re-run the chat, but get no proper response. Similarly, no matter what I query now, the LLM wants my date tool. So I don't understand whether there's a property I'm missing.
The API docs I have been going off of are: https://www.postman.com/postman-student-programs/ollama-api/overview
These have given me better insight than the GitHub docs, it seems.
Can someone help me understand this? I have been trying to find repos that use tools, but I haven't been very successful.
Note: I am doing this with Next.js, so I am running all of this from inside
api/chat/router.ts
in an effort to seamlessly integrate it into my test app.
Long shot, but maybe @theo (t3.gg) could do a proper how-to on building a tool or an agent? I think it would be really popular, and given T3Chat, he could easily put together a simple demo.
I ended up solving the problem. The issue was the formatting of the follow-on request for the tool results. Once I got that set up properly, everything smoothed itself out.
After completing the second request, I used the StreamReacher middleware to append the chunks into one data stream.
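The appending step can be illustrated without any middleware. Ollama streams newline-delimited JSON, and each chunk carries a `message.content` fragment; a sketch that splices two collected response payloads into one text (assuming the chunks have already been read into strings) looks like this:

```typescript
// Shape of one streamed chunk from Ollama's /api/chat (sketch).
interface OllamaChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Parse a newline-delimited JSON payload into chunks.
function parseNdjson(payload: string): OllamaChunk[] {
  return payload
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as OllamaChunk);
}

// Append the content of two streamed responses into a single string, in order.
function appendStreams(first: string, second: string): string {
  return [...parseNdjson(first), ...parseNdjson(second)]
    .map((chunk) => chunk.message?.content ?? "")
    .join("");
}
```

In a real route handler you would do this incrementally as chunks arrive rather than after the fact, but the concatenation logic is the same.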