Ash Elixir•2y ago
kyle

Spark.Dsl.Entity: is an option explicit or inferred?

I'm trying to figure out if there is a way to determine whether an option was explicitly set or inferred. As an example, say I want to write a Transformer that marks every attribute as sensitive unless it has been explicitly set as non-sensitive (not quite my use case, but close). I've found where Entity.build receives that info as opts, and I can get and replace the attribute with the Transformer helpers, but I'm not seeing any way to determine in the Transformer whether it was explicitly set. Is there a built-in way to determine what has been explicitly set? If not, any ideas for easily grabbing that info?
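For concreteness, the kind of transformer described above might look roughly like this. This is a sketch, not a working implementation: the module name is invented, and it assumes the attribute struct exposes a `sensitive?` field and that `Spark.Dsl.Transformer.get_entities/2` and `replace_entity/4` behave as documented.

```elixir
# Hypothetical transformer: mark every attribute sensitive. The catch,
# and the point of this thread, is that by the time transform/1 runs,
# Entity.build has already merged defaults, so an inferred
# sensitive?: false is indistinguishable from an explicit one.
defmodule MyApp.MarkSensitive do
  use Spark.Dsl.Transformer

  alias Spark.Dsl.Transformer

  def transform(dsl_state) do
    dsl_state =
      dsl_state
      |> Transformer.get_entities([:attributes])
      |> Enum.reduce(dsl_state, fn attribute, dsl_state ->
        # Ideally we would skip attributes whose sensitive? was set
        # explicitly, but that information is not tracked.
        updated = %{attribute | sensitive?: true}

        Transformer.replace_entity(
          dsl_state,
          [:attributes],
          updated,
          &(&1.name == attribute.name)
        )
      end)

    {:ok, dsl_state}
  end
end
```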
11 Replies
ZachDaniel
ZachDaniel•2y ago
We don't track that, unfortunately.
kyle
kyleOP•2y ago
😔 maybe someday? seems like it would be useful for overriding default behaviour. hacked this together for now; definitely not complete code (need to check that the path is valid and not accidentally a deep match) but it works for me, for now, and sharing is caring, so:
def is_set?(dsl_state, path, key) do
  {:ok, ast} =
    dsl_state.persist.file
    |> File.read!()
    |> Code.string_to_quoted()

  pre = fn
    {head, _meta, [^key | _args]}, {[head], _} -> {[], true}
    {_form, _meta, [^key | _args]}, {[], _} -> {[], true}
    {head, meta, args}, {[head | left], right} -> {{head, meta, args}, {left, [head | right]}}
    ast, acc -> {ast, acc}
  end

  post = fn
    {head, meta, args}, {left, [head | right]} -> {{head, meta, args}, {[head | left], right}}
    ast, acc -> {ast, acc}
  end

  {_, found?} = Macro.traverse(ast, {path, []}, pre, post)
  found? == true
end
also, not seeing how to mark as solved in the discord app, sorry for leaving open atm
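A stripped-down, dependency-free illustration of the same trick (parse the source to an AST, then walk it looking for the call). The toy DSL snippet and the option name here are invented for the example; the real function above additionally threads the section path through the accumulator.

```elixir
# Self-contained demo: detect whether sensitive? was written explicitly
# in source by walking the quoted AST with Macro.prewalk/3.
source = """
attributes do
  attribute :name, :string do
    sensitive? false
  end
end
"""

{:ok, ast} = Code.string_to_quoted(source)

# Flip the accumulator to true if a sensitive? call appears anywhere.
{_ast, found?} =
  Macro.prewalk(ast, false, fn
    {:sensitive?, _meta, [_ | _]} = node, _acc -> {node, true}
    node, acc -> {node, acc}
  end)

found?
```

Here `found?` comes back `true` because `sensitive? false` appears literally in the source; deleting that line from the snippet would make it `false`, which is exactly the explicit-vs-inferred distinction the DSL state alone cannot give you.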
ZachDaniel
ZachDaniel•2y ago
that's... pretty wild 😆 what's the specific thing you're looking to do, out of curiosity? We could potentially track things that were set by defaults (it wouldn't be all that difficult), or probably the other way around: track everything explicitly set
kyle
kyleOP•2y ago
I'm generating some Ash resources from a JSON definition. I was hoping I could just monkeypatch the entities with the transformer, but I don't want to overwrite any explicitly set options, so...
ZachDaniel
ZachDaniel•2y ago
ah, interesting. Yeah, I see what you're saying. If you're interested, I could advise on how you might add this to Spark to track all explicitly set options?
kyle
kyleOP•2y ago
no promises but sure
ZachDaniel
ZachDaniel•2y ago
no promises necessary 😆 So, ignoring entities for now and just focusing on sections:
opts =
case Spark.OptionsHelpers.validate(
current_config.opts,
Map.get(unquote(Macro.escape(section)), :schema, [])
) do
{:ok, opts} ->
opts

{:error, error} ->
raise Spark.Error.DslError,
module: __MODULE__,
message: error,
path: unquote(section_path)
end
for some reason GitHub is being super weird and I can't link you to the file? anyway, it's around line 956 in spark/dsl/extension.ex in Spark. current_config.opts is everything that was actually set in code. Later on, we store the result of that section being evaluated like this:
Process.put(
{__MODULE__, :spark, unquote(section_path)},
%{
entities: current_config.entities,
opts: opts
}
)
if we added something like set_keys: [:foo, :bar] there, then later, when we do the logic in set_state, we traverse all the sections and get their entities/opts:
spark_dsl_config =
{__MODULE__, :spark_sections}
|> Process.get([])
|> Enum.map(fn {_extension, section_path} ->
{section_path,
Process.get(
{__MODULE__, :spark, section_path},
[]
)}
end)
|> Enum.into(%{})
|> Map.update(
:persist,
persist,
&Map.merge(&1, persist)
)
We could then put something like this into the :persist key:
%{
spark_explicitly_set_keys: %{
[:section, :path] => [:foo, :bar]
}
}
kyle
kyleOP•2y ago
interesting... I gave up poking around L1619 before. I'll take a look at this, thanks
ZachDaniel
ZachDaniel•2y ago
My pleasure 🙂 Spark can be pretty heady, but hopefully with some guidance we can get you there.
kyle
kyleOP•2y ago
pr sent, lmk if that looks right
ZachDaniel
ZachDaniel•2y ago
will review tomorrow 🙂