Polymorphic resources
I have a number of resources that I'm beginning to add things like comments and attachments to, and I'd be interested to know the idiomatic way to do this.
For this case, I can see a macro or something like https://discord.com/channels/711271361523351632/1079057460700123186/1092860018862346300 working as a template, specifying a different resource name and table for each instance.
I would suggest writing an extension for that. Fragments are not meant for sharing behavior among resources, but for splitting up single resources.
I can show you what an extension might look like for something like that when I’m back at my computer.
That would be lovely, thanks mate!
What you might also actually do is use ash_postgres' polymorphic? true option.
This lets you have a single resource, MyApp.Comment for example, but have it create/manage many tables.
So then you'd do:
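A rough sketch of what that might look like (the module, attribute, and table names here are illustrative, not from the original messages):

```elixir
defmodule MyApp.Comment do
  use Ash.Resource,
    data_layer: AshPostgres.DataLayer

  postgres do
    # No table here: each relationship pointing at this resource
    # supplies its own table via the relationship context.
    polymorphic? true
    repo MyApp.Repo
  end

  attributes do
    uuid_primary_key :id
    attribute :text, :string
    # The id of whatever record the comment belongs to
    attribute :resource_id, :uuid
  end
end

defmodule MyApp.Post do
  use Ash.Resource,
    data_layer: AshPostgres.DataLayer

  postgres do
    table "posts"
    repo MyApp.Repo
  end

  attributes do
    uuid_primary_key :id
  end

  relationships do
    has_many :comments, MyApp.Comment do
      destination_attribute :resource_id
      # Tell the data layer which table backs this particular relationship
      relationship_context %{data_layer: %{table: "post_comments"}}
    end
  end
end
```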
The migration generator will find all relationships to polymorphic resources and generate a table for every single one of them.
Ok, interesting.
Looks quite nice, seems like the one major tradeoff would be sacrificing the reverse relationship.
Yeah, that is correct.
Is there a way with Spark to write an extension that is similarly terse to a fragment?
If not, any downsides to combining extension + fragment for that purpose?
Well, the fragment won't have any dynamism
So you couldn't do things like modify the keys
i.e. assuming destination_attribute :post_id on one resource and destination_attribute :other_thing_id on another.
Honestly this is probably not worth it, but:
The alternative, done with an extension, might look something like this:
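A sketch of what that could look like. The module, section, and option names (App.Extension.Commentable, a comments section with a table option) follow what comes up later in the thread, but the transformer body is only an approximation of the idea, and the exact Spark calls (Transformer.eval/3 in particular) are assumptions:

```elixir
defmodule App.Extension.Commentable do
  @comments %Spark.Dsl.Section{
    name: :comments,
    describe: "Configure the comments resource generated for this resource.",
    schema: [
      table: [
        type: :string,
        doc: "The table to store this resource's comments in."
      ]
    ]
  }

  use Spark.Dsl.Extension,
    sections: [@comments],
    transformers: [App.Extension.Commentable.DefineCommentsResource]
end

defmodule App.Extension.Commentable.Info do
  # Generates comments_table/1 and comments_table!/1
  use Spark.InfoGenerator,
    extension: App.Extension.Commentable,
    sections: [:comments]
end

defmodule App.Extension.Commentable.DefineCommentsResource do
  use Spark.Dsl.Transformer

  alias Spark.Dsl.Transformer

  def transform(dsl) do
    parent = Transformer.get_persisted(dsl, :module)
    table = App.Extension.Commentable.Info.comments_table!(dsl)
    comments_resource = Module.concat(parent, Comment)

    # Define the comments resource as part of compiling the parent resource.
    # (Transformer.eval/3 is assumed here; Module.create/3 would be another way.)
    dsl =
      Transformer.eval(
        dsl,
        [],
        quote do
          defmodule unquote(comments_resource) do
            use Ash.Resource, data_layer: AshPostgres.DataLayer

            postgres do
              table unquote(table)
              repo App.Repo
            end

            attributes do
              uuid_primary_key :id
              attribute :text, :string
            end

            relationships do
              # Hard-coded :post for brevity; a real version would derive
              # the name from the parent module.
              belongs_to :post, unquote(parent)
            end
          end
        end
      )

    {:ok, dsl}
  end
end
```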
So that actually defines the comments module for the resource automatically.
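And then something along these lines (again a sketch, using the comments section and table option from above)

```elixir
defmodule App.Post do
  use Ash.Resource,
    data_layer: AshPostgres.DataLayer,
    extensions: [App.Extension.Commentable]

  postgres do
    table "posts"
    repo App.Repo
  end

  attributes do
    uuid_primary_key :id
    attribute :title, :string
  end

  comments do
    table "post_comments"
  end
end
```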
That would be how you configure it in a given resource.
Very cool!
That looks pretty reasonable, but I understand what you mean that it might be a bit overkill.
It really just depends on how bought into the idea of constructing your app from your core domain + extensions 😆
I'd do the above, because I know how it all works
My thought was to hit somewhere in the middle and have an extension for the comments resource itself that would essentially do the same polymorphism as the data layer approach above.
Yeah, you can definitely do that
So you'd do something like this?
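(A guess at the shape; MyApp.Extension.PolymorphicComments is a made-up name.)

```elixir
defmodule MyApp.Comment do
  use Ash.Resource,
    data_layer: AshPostgres.DataLayer,
    # Hypothetical extension on the comment resource itself: it would add the
    # shared attributes (text, resource_id, timestamps) and common actions,
    # leaving each parent resource to point at it with its own table.
    extensions: [MyApp.Extension.PolymorphicComments]

  postgres do
    polymorphic? true
    repo MyApp.Repo
  end
end
```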
If I were to write it as a macro, something like:
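(A sketch; it relies on the same shared MyApp.Comment resource and relationship_context as above, and assumes the options are passed as literal values.)

```elixir
defmodule MyApp.Commentable do
  # Macro version: expands to a has_many pointing at the shared polymorphic
  # comment resource, with the backing table supplied per resource.
  defmacro commentable(opts \\ []) do
    quote do
      relationships do
        has_many :comments, MyApp.Comment do
          destination_attribute :resource_id
          relationship_context %{data_layer: %{table: unquote(opts[:table])}}
        end
      end
    end
  end
end
```

Used as import MyApp.Commentable and then commentable table: "post_comments" inside each resource.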
The extension definitely does a bit more for you.
That will work, up to a point. If you try to use expr in there, i.e. to add actions that do certain things (not that you need to), then you may see some issues. Additionally, you won't be able to benefit from some compile-time optimizations we make available to extensions. It's definitely okay to do what you've shown there. But FWIW, I encourage people to "break the seal", so to speak, on writing extensions as early as possible, because they are the most flexible way to extend your Ash app, and they come with lots of tools to solve common metaprogramming problems.
But it's totally up to you, and the above will likely work just fine 🙂
Sure thing!
I have a number of macros that I would consider converting to Spark, I think it would also benefit error reporting and validation.
Actually, I was curious how one would write the above macro using Spark.
In this particular case, it's not a problem for comments/attachments to be defined in the parent resource (except making sure that GQL doesn't get upset), so the comments section is quite cool. This would also make it easy to add actions on that resource in the parent.
You can do all sorts of things in Spark transformers, like set DSL options, add entities (attributes/relationships), that kind of thing.
It's only partially automated, i.e. you have Ash.Resource.Builder.add_attribute, but for setting the table you'd need to do something like Transformer.set_option(dsl, [:postgres], :table, "...").
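For example, inside a transformer (a sketch; the module name is made up, and add_attribute is assumed to return an ok-tuple):

```elixir
defmodule App.Extension.Commentable.ConfigureStorage do
  use Spark.Dsl.Transformer

  def transform(dsl) do
    # Entities (attributes, relationships, actions) go through the builder...
    {:ok, dsl} = Ash.Resource.Builder.add_attribute(dsl, :text, :string)

    # ...while plain DSL options, like the postgres table, are set directly:
    dsl = Spark.Dsl.Transformer.set_option(dsl, [:postgres], :table, "post_comments")

    {:ok, dsl}
  end
end
```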
No worries, that makes sense. Is there a generic helper like that for appending to a section that supports multiple entries, or is that the behaviour of set_option already?
Starting to test out the Commentable modules, pretty interesting stuff. Were the unquotes intentional?
For your first question, you'd use Transformer.add_entity (or ideally the functions in Ash.Resource.Builder).
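For instance (a sketch; the entity name comes from Ash's own relationships DSL, but the exact build_entity options may differ):

```elixir
defmodule App.Extension.Commentable.AddCommentsRelationship do
  use Spark.Dsl.Transformer

  alias Spark.Dsl.Transformer

  def transform(dsl) do
    # Build an entity for a section that takes many entries, then append it.
    {:ok, relationship} =
      Transformer.build_entity(Ash.Resource.Dsl, [:relationships], :has_many,
        name: :comments,
        destination: App.Post.Comment
      )

    {:ok, Transformer.add_entity(dsl, [:relationships], relationship)}
  end
end
```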
there are sections, entities, and options
Unfortunately, it looks like I may be running into problems mixing my existing macros with the extensions, as there's stuff defined there that this one wants.
The unquote was intentional, yes.
It was erroring out for me on dsl not being defined.
Ah, sorry,
you need:
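The general shape, for illustration only (not the original snippet): anything derived from the dsl state gets bound outside the quote and injected with unquote, since dsl itself is not in scope inside the generated code.

```elixir
# Inside the transformer, for example:
table = App.Extension.Commentable.Info.comments_table!(dsl)

quote do
  postgres do
    table unquote(table)
  end
end
```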
After removing them it seems to be doing better, but I derive my table name from the module in my base resource macro so it doesn't seem to be picked up.
I don't think thats the issue
Ah, right on. That makes more sense.
the table name should be set by the time you are in the transformer
Even when defined in a use macro?
yep
I thought so as well... Then there must be something else going on.
I see other stuff generated by the macro in that struct, though.
That error looks strange
Can you paste the whole transformer in?
Interestingly, if I set table "posts_comments" explicitly and add uuid_primary_key :id, now I'm getting:
App.Extension.Commentable.Info.comments_table!(dsl) raises an error if the configuration is not set, IIRC.
You might need (probably need) the ! for Info.table as well.
Ok, still getting the warning about not being an Ecto schema, but now a ton of GQL complaints about non-unique types.
Fixed the GQL stuff 🙂
Can I discover the registry of the parent with Spark and have the comment resource add itself?
only kind of
Nice, will add that in as well.
Cool!
Is there a more optimal way to detect the comments section on the resource than what I'm doing?
You can do YourExtensionName in Spark.extensions(module).
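For example (the resource name is illustrative):

```elixir
if App.Extension.Commentable in Spark.extensions(App.Post) do
  # App.Post uses the Commentable extension
end
```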
Right on. I thought about just adding the extension to my base resource, so I wanted to check if the section is used. Is there a way to detect the presence of the whole comments section rather than an entry in it? Or is it there implicitly and empty?
Tbh this is probably not the normal pattern 🙂
Yeah, I wouldn't suggest doing that.
There is no good way to detect if a section has been opened
Cool
If at all possible, I'd suggest just adding your extension to each resource, but otherwise you'd need to add some kind of boolean flag like
Well... I guess maybe that isn't true.
I think you can do Map.has_key?(dsl, [:comments])?
still, would be pretty non-idiomatic
For sure, I'll stick with letting the existence of the extension be the indicator.
Thanks a lot, mate!
Will definitely be looking for opportunities to use this going forward 🙂
Wanted to check back in on this one. I'm still seeing this warning:
Any idea what might be the cause?
🤔
Interesting.
So App.Post is the target Ash resource?
App.Post is the resource with the comments block.
🤔 but everything works otherwise?
If so, mind making an issue in Ash? I'm not sure there is anything we can actually do about it but I can look into it
Well, there is probably something we can do 🙂
Yep, will do when I get back to my desk.
Any idea what the issue might be?
The basic issue is that, as part of compiling App.Post, you are compiling App.Post.Comment, which then refers back to App.Post.
Which is fine, except for Ecto providing that warning.
GitHub issue: Resources defined within other resources issue Ecto warnings ("When creating a subresource using an extension, the compiler issues the following warning: warning: invalid association post in schema App.Post.Comment: associated module App.Post is not an Ecto ...")
🙇