What stack should I go with for an LLM wrapper backend?
So let's put it like this: I've already got a Next.js frontend in place.
Now I'm working on some pretty heavy LLM tasks, like evaluating full GitHub repos (something along the lines of OpenAI's Codex) and speech-to-speech pipelines (e.g. OpenAI's Whisper for the speech-to-text leg, plus a TTS model on the way back out).
I'm kinda stuck on what language/stack I should use to build this LLM wrapper part, because Vercel's AI SDK maybe doesn't give me everything I need.
I'm debating between Python and JavaScript for the backend.
I just want something that's scalable and manageable, and that doesn't turn into a mess later on.
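One common pattern for this situation, regardless of which language wins: keep Next.js as the frontend and put the heavy LLM work behind a separate service that the frontend proxies to. Here's a minimal stdlib-only Python sketch of that shape, just to make the split concrete — the route name `/evaluate`, the `repo_url` field, and the queue-and-return-202 behavior are all hypothetical; in practice you'd likely reach for FastAPI plus a real task queue (Celery, ARQ, etc.) since repo evaluation is too slow to run inline in a request.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class LLMHandler(BaseHTTPRequestHandler):
    """Hypothetical backend service the Next.js app would proxy to."""

    def do_POST(self):
        # e.g. POST /evaluate with {"repo_url": "..."} — a long-running
        # repo evaluation would be enqueued here, not executed inline.
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        job = {"status": "queued", "repo_url": payload.get("repo_url")}
        body = json.dumps(job).encode()
        # 202 Accepted: work was queued, result comes later (poll or webhook)
        self.send_response(202)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the sketch quiet

def serve(port: int = 8001) -> HTTPServer:
    """Bind the service on localhost; caller starts serve_forever()."""
    return HTTPServer(("127.0.0.1", port), LLMHandler)
```

The point of the sketch is the boundary, not the library: once the LLM work lives behind its own HTTP surface, the Python-vs-JavaScript question becomes an implementation detail of one service instead of a bet the whole app rides on.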