InvokeAI to Runpod serverless
Is it possible to link up InvokeAI to a serverless instance? I'm hoping to keep it installed locally but offload generation to an external GPU, and I'm not sure what's required to set that up. Is there a tutorial or guide on doing this?
There's an easy way to figure it out: read up on how RunPod serverless works, check whether there's already a public InvokeAI template for RunPod serverless you can deploy, or build your own serverless worker and then point your local InvokeAI at it.
If the InvokeAI client allows that, of course.
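If you go the build-your-own-worker route, here is a minimal sketch of what a RunPod serverless handler could look like. It assumes the official `runpod` Python SDK is installed in the worker image; `generate_image` is a hypothetical placeholder for whatever InvokeAI or diffusers call you actually wire into the container:

```python
# Minimal RunPod serverless worker sketch.
# Assumes the `runpod` SDK is available in the worker image; generate_image()
# is a hypothetical stand-in for your real generation backend.
import base64
import io

import runpod


def generate_image(prompt: str):
    """Hypothetical placeholder: replace with your actual InvokeAI/diffusers call.
    Expected to return a PIL-style image object with a .save() method."""
    raise NotImplementedError("Wire this up to your generation backend.")


def handler(event):
    # RunPod passes the request payload under event["input"].
    prompt = event["input"].get("prompt", "")
    image = generate_image(prompt)

    # Encode the image as base64 so it survives the JSON response.
    buffer = io.BytesIO()
    image.save(buffer, format="PNG")
    return {"image_base64": base64.b64encode(buffer.getvalue()).decode()}


# Start the serverless worker loop; RunPod calls `handler` for each job.
runpod.serverless.start({"handler": handler})
```

On the local side, calling that endpoint is just an HTTPS request to RunPod's serverless API (the endpoint ID and API key below are placeholders), which is roughly what any "external GPU" hookup would boil down to if the InvokeAI client can be pointed at a remote backend:

```python
import requests

ENDPOINT_ID = "your-endpoint-id"   # placeholder
API_KEY = "your-runpod-api-key"    # placeholder

# Synchronous run: blocks until the worker returns a result.
resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input": {"prompt": "a lighthouse at dawn"}},
    timeout=300,
)
print(resp.json())
```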