I tried creating the following resource to build a simple lookup prompt action. The idea is that someone can pass a search term to it and have the LLM look up NPI registry information using a Google search. The LangChain docs show a usage example here, and I tried recreating it with an Ash resource here:
```elixir
defmodule Prism.NPI.Lookup do
  use Ash.Resource, extensions: [AshAi], domain: Prism.NPI

  actions do
    action :find_person, Prism.NPI.Person do
      description "Look up a person's NPI information for a given query phrase (intended to be a name)"

      argument :query, :string do
        allow_nil? false
      end

      run prompt(
            LangChain.ChatModels.ChatGoogleAI.new!(%{model: "gemini-2.0-flash"}),
            tools: [
              LangChain.NativeTool.new!(%{name: "google_search", configuration: %{}})
            ],
            prompt: """
            Search for NPPES NPI results for the following physician: <%= @input.arguments.query %>

            NPPES NPI information is primarily found on the following website: https://npiregistry.cms.hhs.gov
            """
          )
    end
  end
end
```
This seems to fail the Spark validation here. Is there a way we can support something like this or am I just missing how to do it? I'd be happy to work on adding it but I'm not sure if this is something you'd want to support and I'm not super familiar with defining Spark schemas (though I'd be happy to try!).
Solution
It's slightly more verbose, but converting it to a generic action wasn't so bad:
````elixir
defmodule Prism.NPI.Lookup do
  use Ash.Resource, extensions: [AshAi], domain: Prism.NPI

  alias LangChain.Chains.LLMChain
  alias LangChain.Message
  alias LangChain.NativeTool
  alias LangChain.ChatModels.ChatGoogleAI
  alias LangChain.MessageProcessors.JsonProcessor

  actions do
    action :find_person, Prism.NPI.Person do
      description "Look up a person's NPI information for a given query phrase (intended to be a name)"

      argument :query, :string do
        allow_nil? false
      end

      run fn input, _context ->
        model = ChatGoogleAI.new!(%{temperature: 0, stream: false, model: "gemini-2.0-flash"})
        query = input.arguments.query

        {:ok, updated_chain} =
          %{llm: model, verbose: false, stream: false}
          |> LLMChain.new!()
          |> LLMChain.add_message(
            Message.new_user!("""
            Search for NPPES NPI results for the following physician: #{query}

            NPPES NPI information is primarily found on the following website: https://npiregistry.cms.hhs.gov

            IMPORTANT INSTRUCTIONS:
            You MUST respond with valid JSON that matches the following schema:

            ```json
            {
              "id": "1234567890",
              "last_name": "DOE",
              "first_name": "JOHN",
              "primary_fax_number": "+14075981501",
              "taxonomy": ["taxonomy"]
            }
            ```
            """)
          )
          |> LLMChain.add_tools(NativeTool.new!(%{name: "google_search", configuration: %{}}))
          |> LLMChain.message_processors([JsonProcessor.new!(~r/```json(.*?)```/s)])
          |> LLMChain.run()

        case updated_chain do
          %{last_message: %{processed_content: json}} ->
            Ash.Type.cast_input(Prism.NPI.Person, json)

          _ ->
            {:ok, nil}
        end
      end
    end
  end
end
````
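For what it's worth, here's a sketch of how the generic action can then be called (assuming Ash 3's generic-action API; the query value is just an illustrative example):

```elixir
# Build the action input and run it; on success this returns
# {:ok, %Prism.NPI.Person{}} with the fields cast from the LLM's JSON,
# or {:ok, nil} if no processed JSON came back from the chain.
{:ok, person} =
  Prism.NPI.Lookup
  |> Ash.ActionInput.for_action(:find_person, %{query: "JOHN DOE"})
  |> Ash.run_action()
```

The `run fn input, _context -> ... end` form is what makes this work: generic actions accept an arbitrary function, so the Spark schema for the builtin `prompt/2` never sees the LangChain `NativeTool` structs.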