Ash AI prompt wrapper
Moving the question here from the Ash development channel:
Boris - BackedBy — 08:13
Every time I submit an LLM request, I also submit evaluation criteria. After receiving the main response, I have a validation model respond with 'true' or 'false' re whether the response fits the evaluation criteria. The logprob of the 'true' token is then used as a numerical score for how well the LLM responded. What would be the best approach for extending the ash ai prompt-backed action to have optional criteria: [list of strings] and validation_model: string definitions to run the validation step?
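The scoring step Boris describes can be sketched as follows. This is a minimal, illustrative Elixir sketch assuming an OpenAI-style chat completions endpoint that returns per-token logprobs; `post_completion` is a hypothetical HTTP helper (not part of Ash AI) injected by the caller.

```elixir
defmodule ValidationScore do
  @moduledoc """
  Asks a validation model whether a response meets the criteria, then
  converts the logprob of the "true" token into a numeric score.
  """

  def score(response, criteria, validation_model, post_completion) do
    prompt = """
    Response: #{response}

    Criteria:
    #{Enum.map_join(criteria, "\n", &("- " <> &1))}

    Does the response satisfy all criteria? Answer only true or false.
    """

    %{"choices" => [choice | _]} =
      post_completion.(%{
        model: validation_model,
        messages: [%{role: "user", content: prompt}],
        max_tokens: 1,
        logprobs: true
      })

    [%{"token" => token, "logprob" => logprob} | _] =
      get_in(choice, ["logprobs", "content"])

    # :math.exp/1 turns the logprob into a probability in [0, 1].
    # If the sampled token isn't "true", a real implementation might
    # instead look up "true" in top_logprobs; this sketch scores it 0.
    case token |> String.trim() |> String.downcase() do
      "true" -> :math.exp(logprob)
      _ -> 0.0
    end
  end
end
```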
Zach Daniel — 08:15
As it stands right now, your best bet would likely be to create a "wrapper" around the prompt action implementation
Or even just copy it and use its implementation, as it's not a huge thing really
If there is an example of a wrapper around the prompt action available for me to take a look at, that would be very helpful. Thank you!
I don't think there is an example, but what you would essentially do is call the `.run` function of Ash AI's prompt action
although, TBH, I might start by just copying our prompt action
It's possible we could add hooks/callbacks for you
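One possible shape for the wrapper Zach suggests: a generic Ash action whose `run/3` delegates to the prompt implementation and then runs the validation step when `criteria` and `validation_model` are supplied. The module name and the `run_prompt`/`validate` helpers are hypothetical placeholders — the real call would go to Ash AI's prompt action `run` (or a copy of its implementation).

```elixir
defmodule MyApp.ValidatedPrompt do
  # Generic action implementation; Ash invokes run/3 when the action runs.
  use Ash.Resource.Actions.Implementation

  @impl true
  def run(input, opts, context) do
    criteria = opts[:criteria] || []
    validation_model = opts[:validation_model]

    with {:ok, response} <- run_prompt(input, context) do
      if criteria == [] or is_nil(validation_model) do
        {:ok, response}
      else
        score = validate(response, criteria, validation_model)
        {:ok, %{response: response, score: score}}
      end
    end
  end

  # Placeholder: in practice, call the `.run` of Ash AI's prompt action
  # here (or a copied version of its implementation), per the thread.
  defp run_prompt(_input, _context), do: {:error, :not_implemented}

  # Placeholder for the logprob-based scoring step.
  defp validate(_response, _criteria, _validation_model), do: 0.0
end
```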