Non-OpenAI support for prompt-backed-actions has been merged into Ash AI

Zach Daniel (@zachdaniel.dev)
πŸ€–Just merged support for non-OpenAI models in prompt-backed-actions (structured outputs) via adapters. The adapter is inferred from the model, and supplied via an option otherwise. Thanks to @brainlid.bsky.social for LangChain, it's been very useful for Ash AI. #AshFramework #ElixirLang
34 Replies
Matt Beanlandβ€’5mo ago
assert is_hype == false
abeeshake456β€’5mo ago
Why does it need Enumerable?
ZachDanielOPβ€’5mo ago
define :analyze_sentiment, args: [:text]
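For context, `define` lives in a resource's code_interface block; a minimal sketch (the module, domain, and action names here are illustrative, not from the thread):

```elixir
defmodule MyApp.Blog.Post do
  # Hypothetical resource for illustration.
  use Ash.Resource, domain: MyApp.Blog

  code_interface do
    # Without `args: [:text]`, a call like
    # MyApp.Blog.Post.analyze_sentiment("some text") passes the string
    # where a params map/list is expected, which is where the
    # Enumerable error above comes from.
    define :analyze_sentiment, args: [:text]
  end
end
```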
abeeshake456β€’5mo ago
Can we improve the error a little? I have been bitten by not defining args in define. I will send a PR. Can you tell me the module / relevant place where the validation error should be raised?
ZachDanielOPβ€’5mo ago
It's in Ash.CodeInterface I think? Probably line 660:
params =
  if is_list(params),
    do: Enum.map(params, &Map.merge(&1, arg_params)),
    else: Map.merge(params, arg_params)
We should just wrap that in a try/rescue, I think?
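One possible shape for that, raising a friendlier error when the merge fails (the exception list and message wording are just a suggestion):

```elixir
params =
  try do
    if is_list(params),
      do: Enum.map(params, &Map.merge(&1, arg_params)),
      else: Map.merge(params, arg_params)
  rescue
    _e in [Protocol.UndefinedError, BadMapError] ->
      # Point the user at the likely cause instead of a raw merge crash.
      raise ArgumentError,
            "could not merge positional arguments into params. " <>
              "Did you forget to declare `args: [...]` in your `define`?"
  end
```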
abeeshake456β€’5mo ago
Also, how do I define my model here?
run prompt(
  LangChain.ChatModels.ChatOpenAI.new!(%{
    model: "gpt-4o",
    api_key: System.get_env("OPENAI_API_KEY"),
    endpoint: System.get_env("OPENAI_ENDPOINT")
  }),
  # LangChain.ChatModels.ChatAnthropic.new!(%{model: "claude-3-5-haiku-latest"}),
  # setting `tools: true` allows it to use all exposed tools in your app
  tools: false
  # alternatively you can restrict it to only a set of tools
  # tools: [:list, :of, :tool, :names]
  # provide an optional prompt, which is an EEx template
  # prompt: "Analyze the sentiment of the following text: <%= @input.arguments.description %>"
)
Defining the ChatOpenAI model inline like this does not work. How do I run custom Elixir code just before the prompt, so that I can use these environment variables?
ZachDanielOPβ€’5mo ago
You configure those things in your application config in runtime.exs:
config :langchain, :openai_key, System.get_env("OPENAI_API_KEY")
config :langchain, :anthropic_key, System.fetch_env!("ANTHROPIC_API_KEY")
abeeshake456β€’5mo ago
It is there. I want to configure the endpoint as well.
ZachDanielOPβ€’5mo ago
How is that normally configured in langchain?
abeeshake456β€’5mo ago
GitHub: langchain/lib/chat_models/chat_open_ai.ex at 7e0959d894aa229e474679... (brainlid/langchain)
ZachDanielOPβ€’5mo ago
run prompt(fn _, _ -> LangChain.ChatModels.ChatOpenAI.new!(%{model: "gpt-4o", api_key: System.get_env("OPENAI_API_KEY"), endpoint: System.get_env("OPENAI_ENDPOINT")}) end, ...)
abeeshake456β€’5mo ago
Tried this; it does not work:
run prompt(
  fn _, _ ->
    LangChain.ChatModels.ChatOpenAI.new!(%{
      model: "gpt-4o",
      api_key: System.get_env("OPENAI_API_KEY"),
      endpoint: System.get_env("OPENAI_ENDPOINT")
    })
  end,
  tools: false
)
Error:
[error] Trying to process an unexpected response. %{"error" => "Not Found", "message" => "Route POST:/v1 not found", "statusCode" => 404}
[error] Error during chat call. Reason: %LangChain.LangChainError{type: nil, message: "Unexpected response", original: nil}
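For what it's worth, LangChain's ChatOpenAI endpoint defaults to the full https://api.openai.com/v1/chat/completions URL, so a custom endpoint likely needs the full route as well; the 404 on POST:/v1 suggests the path portion was missing. A sketch (the kluster.ai URL is a guess, check their docs):

```elixir
LangChain.ChatModels.ChatOpenAI.new!(%{
  model: "gpt-4o",
  api_key: System.get_env("OPENAI_API_KEY"),
  # Full route, not just the API base; "Route POST:/v1 not found"
  # points at an endpoint value that stopped at /v1.
  endpoint: "https://api.kluster.ai/v1/chat/completions"
})
```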
ZachDanielOPβ€’5mo ago
What is your OpenAI endpoint configured to be? It seems like a misconfiguration to me. You may want to start off just using LangChain directly to figure out the options you're trying to set; the issue seems unrelated to Ash's prompt action.
abeeshake456β€’5mo ago
Thanks. I had messed up my environment variables. The fn _, _ -> wrapper was the trick I could not figure out myself, though. Should I add that to usage_rules.md of ash_ai?
ZachDanielOPβ€’5mo ago
Yes please πŸ˜„ and also in the other docs somewhere. Fun fact: you can clone down ash_ai, paste our back-and-forth into Claude, and ask it to update the docs accordingly, and you'll get a decent result πŸ˜„
abeeshake456β€’5mo ago
I must say, with Ash I always have this feeling that "there is a way this works, I just don't know how" πŸ™‚ Life saver!
ZachDanielOPβ€’5mo ago
It's the nature of the beast with a 0.1 package, TBH. Anyone using it is a bit of a pioneer, but as long as we focus on docs anytime anything is confusing, we can fix these problems.
abeeshake456β€’5mo ago
Totally bullish there. https://github.com/ash-project/ash_ai/pull/48/files There you go, small PR.
ZachDanielOPβ€’5mo ago
Also, out of curiosity, what are you trying to do? Are you trying to use a different OpenAI-compatible model? Because that other model likely does not support JSON schema responses; you'll likely need to use the new completion tool adapter.
abeeshake456β€’5mo ago
Using kluster.ai for now, because I got credits there. I will write a generic adapter which can extract JSON via LangChain's JsonProcessor and then coerce it to Ash types, without tool calls. High-level goal: writing DSPy in Elixir.
ZachDanielOPβ€’5mo ago
We should add a built-in adapter for that. That's the strategy I didn't add yet: "ask nicely for JSON" πŸ˜†
abeeshake456β€’5mo ago
Point me to the relevant place / file!
ZachDanielOPβ€’5mo ago
Basically just copying and modifying the CompletionTool adapter
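Outside of Ash, the raw "ask nicely for JSON" loop can be sketched with LangChain directly; this is a rough sketch, not the adapter itself (the prompt text and regex are illustrative, and the exact return shape of run/2 may differ by LangChain version):

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.Message
alias LangChain.MessageProcessors.JsonProcessor

# Ask for fenced JSON, extract and decode it with JsonProcessor,
# retrying until the model produces parseable output.
result =
  %{llm: ChatOpenAI.new!(%{model: "gpt-4o"})}
  |> LLMChain.new!()
  |> LLMChain.message_processors([JsonProcessor.new!(~r/```json\n(.*?)\n```/s)])
  |> LLMChain.add_message(
    Message.new_system!("Reply ONLY with a ```json fenced object matching the schema.")
  )
  |> LLMChain.add_message(Message.new_user!("Analyze the sentiment of: I love Elixir"))
  |> LLMChain.run(mode: :until_success)

# On success, the decoded map ends up in the last message's processed_content,
# ready to be coerced into Ash types.
```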
abeeshake456β€’5mo ago
BAML is a project which does this: they ask nicely for JSON, then coerce the JSON into their own types.
ZachDanielOPβ€’5mo ago
Yeah, so we have the infra; the adapters are pretty straightforward. But if they support tool calls, then CompletionTool is typically better to use.
abeeshake456β€’5mo ago
Agreed. Not everyone does. πŸ˜…
ZachDanielOPβ€’5mo ago
But we can add a fallback to asking nicely for JSON even on the completion tool adapter. That way, if the agent doesn't call the tool within max runs, we fall back to asking nicely for JSON and trying to parse it.
abeeshake456β€’5mo ago
How would that work? A new FallbackTool which first uses CompletionTool and then NiceJsonTool (if the previous fails)?
ZachDanielOPβ€’5mo ago
https://hexdocs.pm/langchain/0.3.3/LangChain.Chains.LLMChain.html#run_until_tool_used/3 Either with with_fallbacks (something we may need to figure out how to integrate) or by just handling the error case and starting a new chain.
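The error-case version could look roughly like this (everything except run_until_tool_used/3 is hypothetical, and the return shapes are assumed):

```elixir
# Try the tool-call path first; if the model never calls the tool
# within max_runs, fall back to a fresh "ask nicely for JSON" chain.
case LLMChain.run_until_tool_used(chain, "return_result", max_runs: 5) do
  {:ok, _chain, tool_message} ->
    {:ok, tool_message}

  {:error, _chain, _reason} ->
    # ask_nicely_for_json/1 is a hypothetical helper that starts a new
    # chain with a JSON-only prompt and parses the raw completion.
    ask_nicely_for_json(chain)
end
```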
abeeshake456β€’5mo ago
I'll try the error-and-new-chain approach for now and see how far we can go.
ZachDanielOPβ€’5mo ago
I think your first step would be the new adapter, since you know your model doesn't support tools.
abeeshake456β€’5mo ago
Yup:
1. New adapter
2. Fallback adapter
3. DSPy example
In the order 1, 3, 2. Also, a new PR in Ash for the code_interface validation: https://github.com/ash-project/ash/pull/2102/files It even caught two missing args in existing tests πŸ™‚ Posting it here because the discussion context for the bug was here. P.S.: I would need a little help from you regarding the test cases. Claude wrote the tests, and I'm not sure how accurate they are.
ZachDanielOPβ€’5mo ago
Responded on PR
