`Spark` templates
I have a number of resource templates that are defined as `__using__` macros, primarily for the sake of ergonomics and readability. As some of these have begun to solidify, I have converted them to Spark extensions.
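For context, the `__using__` version of one of these templates looks roughly like this (the module names, the `Ash.Resource` target, and the injected defaults are made up for illustration):

```elixir
defmodule MyApp.ResourceTemplate do
  # A resource "template" as a plain __using__ macro. The module name, the
  # Ash.Resource target, and the injected defaults are illustrative only.
  defmacro __using__(_opts) do
    quote do
      use Ash.Resource, data_layer: Ash.DataLayer.Ets

      # Defaults injected into every resource that uses the template.
      attributes do
        uuid_primary_key :id
        create_timestamp :inserted_at
        update_timestamp :updated_at
      end
    end
  end
end

defmodule MyApp.Accounts.User do
  # Readable at the call site, but nothing here is validated against a schema.
  use MyApp.ResourceTemplate
end
```

This reads well at the call site, but nothing about it is validated or introspectable the way a Spark extension would be.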
What I'm wondering is if we can have the best of both worlds: for example, by allowing extensions to provide a Spark options schema and selecting a syntax to walk in `Spark.Dsl.Fragment` (or a new `Spark.Dsl.Template`) in order to inject options into the fragment/template.
This would be even more interesting if extensions were able to inject other extensions with options taken from their own DSL, but this may be a harder hill to climb.
Would the compile-time checks and error reporting be worth the effort? Are there any big impediments to allowing extensions to take options, even without requiring that they be specified in the DSL? Is this already possible?
A very boring example:
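Something along these lines, where all of the syntax is hypothetical: `Spark.Dsl.Template`, extensions accepting an options keyword list, and the module/option names below are invented to illustrate the idea.

```elixir
# Hypothetical sketch only: Spark.Dsl.Template, extensions taking options, and
# every module/option name below are invented to illustrate the proposal.
defmodule MyApp.ResourceTemplate do
  use Spark.Dsl.Template,
    of: Ash.Resource,
    extensions: [
      # The extension would declare a Spark options schema, validate these
      # options at compile time, and inject the corresponding DSL into the
      # template (and therefore into any resource built from it).
      {MyApp.Extensions.Timestamps, inserted_at: :created_at, updated_at: :modified_at}
    ]
end

defmodule MyApp.Accounts.User do
  use MyApp.ResourceTemplate
end
```

The idea being that misspelling `:inserted_at` or passing the wrong type would fail at compile time with a Spark error rather than silently producing the wrong DSL.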
Related (re: a callback to allow extensions to add other extensions): https://discord.com/channels/711271361523351632/1019647368196534283/threads/1110677874198986852
