AshAI json schema usage
Any examples passing in json_schema to:
I keep getting
I tested the same schema in the OpenAI playground and it worked fine
We are almost certainly overriding this:
json_schema: JobListingSchema.schema()
with the empty 'map' schema
To use structured outputs, you have to tell it exactly what the fields are, so you'd need to put in like a custom map type w/ `use Ash.Type.NewType, subtype_of: :map`
(or use the new TypedStruct
thing that I released like 10 minutes ago)
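A custom map type along those lines might look like this (a sketch; the field names are illustrative, not from this thread):

```elixir
# Hypothetical custom map type so AshAi can derive a full JSON schema
# instead of an empty "map" schema. Field names are made up here.
defmodule MyApp.Types.JobListing do
  use Ash.Type.NewType,
    subtype_of: :map,
    constraints: [
      fields: [
        title: [type: :string, allow_nil?: false],
        raw_salary: [type: :string, allow_nil?: true]
      ]
    ]
end
```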
Alternatively you could use a different adapter, like RequestJson
how would StructuredOutput look in this example?
What is this:
JobListingSchema.schema()
?
I think maybe you're missing a piece here which is how much Ash will do for you automatically
Just a module to read in json schema file and convert it to a map:
Probably unnecessary
Ah, so it infers the schema from the return type
which you updated from :map to the new Ash.TypedStruct
Yep!
Ok so I'm thinking of adding this to a types directory in the related domain
Yeah
/domain/types/foo.ex
is a normal pattern
👍 awesome, yeah not many examples for LLMs to explain this lol
updating docs & usage-rules.md
awesome time to write out that TypedStruct
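Something like this in /domain/types/foo.ex (a sketch; the exact `Ash.TypedStruct` DSL may differ slightly from the released version, and the field names are illustrative):

```elixir
# Sketch of a TypedStruct living under /domain/types/; AshAi can derive
# the structured-output JSON schema from this return type.
defmodule MyApp.Domain.Types.JobListing do
  use Ash.TypedStruct

  typed_struct do
    field :title, :string, allow_nil?: false
    field :raw_salary, :string
  end
end
```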
One thing I noticed is I can't set allow_nil? on a field:
In my original json schema I normally set it to two possible types:
But I can still list it under required:
That is a limitation of structured outputs in open ai AFAIK
on the Ash/langchain side of things? Because from my understanding, required only defines what fields have to be present in the response, regardless of whether it is null or not
Or at least that's how it was working in practice for me
Ill take a look at https://github.com/ash-project/ash_ai/blob/4678d6868f828c857b4e34c76ad3a71c571d27f3/lib/ash_ai/actions/prompt/adapter/structured_output.ex#L1
Maybe it is related to
defp add_null_for_non_required(%{required: required} = schema)
: https://github.com/ash-project/ash_ai/blob/4678d6868f828c857b4e34c76ad3a71c571d27f3/lib/ash_ai/open_api.ex#L148
Would there be a way to see the schema that gets passed to the LLM?
There isn't currently but we could make one
Could be as simple as not overwriting a non-nil schema
Right, we should change our json schema implementation to show all fields as required always
but to allow them to have null values
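In other words, the generated schema for an allow_nil? field would need to look roughly like this (illustrative Elixir map of the JSON schema; field names assumed):

```elixir
# OpenAI strict structured outputs: every property must appear in
# "required", and nullability is expressed as a type union instead.
%{
  type: :object,
  additionalProperties: false,
  required: ["title", "raw_salary"],
  properties: %{
    "title" => %{type: "string"},
    # allow_nil? field: still listed as required, but may be null
    "raw_salary" => %{type: ["string", "null"]}
  }
}
```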
I can try to take a look; I'm having trouble understanding where the TypedStruct gets referenced and then converted to a json schema
AI:
yeah so it would be in
AshAi.OpenApi
hmmm
🤔 that doesn't seem right to me
This is giving me a bit more info:
Its probably:
Yep. That can just be all fields
required: Keyword.keys(constraints[:fields])
It's working now:
I think it's also the LLM setting the field to 0 rather than null in the response
🤷♂️ nothing we can do about that part 😆
you can prompt it not to w/ field descriptions
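e.g. something like this on the field (assuming the field options accept a `description` the way attribute definitions do):

```elixir
field :raw_salary, :string,
  description: "Raw salary text. Use null, not 0 or \"\", when absent."
```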
I do 😢
I even tested it:
haha
Try this then
Just to recap, I changed the following in openapi:
```
 required:
   constraints[:fields]
-  |> Enum.filter(fn {_, config} -> !config[:allow_nil?] end)
+  # |> Enum.filter(fn {_, config} -> !config[:allow_nil?] end)
   |> Enum.map(&elem(&1, 0))
```
and
```
 if fields == [] do
   nil
 else
   %{
     type: :object,
     properties: Map.new(fields),
     additionalProperties: false,
-    # required: required (missing?)
+    required: Keyword.keys(constraints[:fields])
   }
   |> with_attribute_description(attribute_or_aggregate)
```
I think thats fine
It should tell the agent that and it can call it again right?
Oh, right nvm.
Yeah i guess if it was agent that would be useful as tool call
Strange though
Well its returning 0
Maybe the description isn't making it in?
I have suspicion the description is not being sent
Yes
I have processed 50k listings before in my previous phoenix app and never hit this issue, it was very smart
You can check by converting your type to json schema
It's not, from checking the request:
That is one way to do it 😄
Looking into it
can you PR your required fields fix?
AI:
I'm working on it, but please keep in mind that if you are responding w/ AI generated content it must be clearly marked as such
Sorry I put it in backticks but I should be more clear
I'm "discussing/exploring" ash_ai with amp
Ok updated my previous comments to reflect that
Thanks, it's no big deal, just letting you know
Looking into the issue
@Shaba please use the latest version of ash core, I just published a fix.
Forked ash_ai, set up postgres with pgvector, and ran the tests; everything is working fine
But making those changes breaks a bunch of tests
I can take a closer look tomorrow
Let me test your changes
Go ahead and open a PR and I can take a look, even w/ the failing tests
GitHub
fix: update nil-typed struct handling in OpenAPI schema generation ...
Warning tests are broken @zachdaniel
Remove filter excluding allow_nil? fields from required list
Properly set required fields based on constraints[:fields] keys
This ensures nullable fields are ...
Why is this file named OpenAPI?
its a relic 😄
We extracted it from
AshJsonApi
which was building open api specs
We're mid-process of reworking it to be a better fit for Ash AI
I'll see if I can contribute some dev setup for this? There aren't really any instructions regarding the DB and the extension needed, if I'm not mistaken
I reused my devenv config but I think this relies on asdf
That would be wonderful 😄
Thank you, confirmed your update:
🥳
I feel like there is some default conversion from null to whatever the nullish value is for the type

Testing in the openai playground im getting the expected result
I don't think there are
We do
""
-> null
by default
Hmmm, even the cheapest nano model is still returning null in the playground

Sorry, what is the issue specifically?
My LLM json responses include null when I make them in the OpenAI playground, but when using AshAI they are set to nullish values
Do you have defaults set?
no
🤔 add
verbose?: true
option to your prompt action opts
so you can see all the requests and responses to see what's up
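i.e. something along these lines (the action name, return type, and model setup are illustrative; the `verbose?: true` option is the relevant part):

```elixir
# Illustrative AshAi prompt action; logs each LLM request/response
# so you can inspect the JSON schema that actually gets sent.
action :extract_listing, MyApp.Types.JobListing do
  argument :text, :string, allow_nil?: false

  run prompt(
    LangChain.ChatModels.ChatOpenAI.new!(%{model: "gpt-4o-mini"}),
    verbose?: true
  )
end
```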
Super strange
I see the problem
the type is only defining 1 type
if allow_nil? is true in the TypedStruct, then we need the type to also be updated to:
The LLM is being instructed to return that type
and I guess for raw_salary, since it's a string, it's returning "" which gets auto-converted to null in Ash
Probably has to do with
add_null_for_non_required
Well, I guess it's not exactly that, but we are only adding null for non-required fields
I'm thinking this would make more sense as add_null_for_allowed_nil
or whatever name
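A rough sketch of what that could look like (a hypothetical helper, not the actual ash_ai code):

```elixir
defmodule SchemaSketch do
  # Hypothetical: add :null to a field's JSON schema type whenever the
  # field is allow_nil?, independent of whether it is required.
  def add_null_for_allow_nil(%{type: type} = schema, field_config) do
    if field_config[:allow_nil?] do
      %{schema | type: List.wrap(type) ++ [:null]}
    else
      schema
    end
  end
end

# SchemaSketch.add_null_for_allow_nil(%{type: :string}, allow_nil?: true)
# => %{type: [:string, :null]}
```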
Oh yeahhhh
It's pretty hacky and doesn't take into account whether the field is allow_nil?, but essentially:
Strange, its supposed to do that by default
Oh, I see.
We need to add null for non-required fields before marking everything as required, effectively
yeah, okay
I'm noticing the constraint is also not being represented in the schema
might be a rabbit hole to implement this
Trying to enumerate all possible constraints, but idk how they map to Ash Type constraints
ie.
The LLM seems to respect it:
I think its a good idea to implement the core constraints like that 👍
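e.g. a sketch of mapping a couple of core constraints to their JSON Schema keywords (hypothetical function; `min`/`max` and `min_length`/`max_length`/`match` are standard Ash integer and string constraints):

```elixir
defmodule ConstraintSketch do
  # Hypothetical mapping from Ash type constraints to JSON Schema keywords.
  def to_json_schema(:integer, constraints) do
    Map.new(
      Enum.flat_map(constraints, fn
        {:min, min} -> [minimum: min]
        {:max, max} -> [maximum: max]
        _other -> []
      end)
    )
  end

  def to_json_schema(:string, constraints) do
    Map.new(
      Enum.flat_map(constraints, fn
        {:min_length, n} -> [minLength: n]
        {:max_length, n} -> [maxLength: n]
        {:match, regex} -> [pattern: Regex.source(regex)]
        _other -> []
      end)
    )
  end
end

# ConstraintSketch.to_json_schema(:integer, min: 0, max: 10)
# => %{minimum: 0, maximum: 10}
```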
So I'm assuming this probably needs to be updated:
Maybe not, since that's the base schema
Honestly, the file is pretty confusing to me
GitHub - jonasschmidt/ex_json_schema: An Elixir JSON Schema validator
GitHub - lud/jsv: Json Schema Validator for Elixir with full spec s...
I don't think so. It should be pretty straightforward to pull the constraints out in those function heads and use them to build the appropriate schema
If you open an issue I can look into it 🙂
Sort of like this? AI generated:
Yep, pretty much
not sure if thats what json schema expects
but otherwise yes
Closed the PR since it's a bit of a mess, but I pushed another change to the branch where I let the LLM run for a bit to see how far it could get. I think there might be some useful stuff to pull from it
I can open up an issue and also reference the PR, but if you're interested and/or willing, I'm available and interested to pair on this
I wouldn't have time to pair unfortunately, either an issue describing what the problem is or a PR w/ a (vetted) fix would need to be the next steps here.
Cool I added an issue: https://github.com/ash-project/ash_ai/issues/95
@Zach Daniel Thanks for implementing this ❤️
Taking a look at your PR, I was overcomplicating my approach to this. But yesterday when testing with Gemini, my JSON schema that was valid for OpenAI was not valid for Gemini. I am guessing these transformations may end up needing to become vendor-specific
Obviously only something to worry about if it actually becomes an issue
Yeah, gemini has different requirements
we need to adapt that code to take options that will explain what to do
And when using Gemini we can pass different options down
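Sketching the idea (a hypothetical module, not ash_ai's API; Gemini's OpenAPI-style schemas use a `nullable` flag rather than a type union):

```elixir
defmodule VendorSketch do
  # Hypothetical: express nullability per vendor. OpenAI strict mode
  # wants a type union; Gemini uses `nullable: true` instead.
  def make_nullable(schema, :open_ai),
    do: Map.update!(schema, :type, &(List.wrap(&1) ++ [:null]))

  def make_nullable(schema, :gemini),
    do: Map.put(schema, :nullable, true)
end
```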
GitHub - piotrmaciejbednarski/structllm: Universal Python library f...
Universal Python library for Structured Outputs with any LLM provider - piotrmaciejbednarski/structllm