Hi there. I'd like to implement my own custom streaming routes like the vllm worker (https://github.com/runpod-workers/worker-vllm). That worker supports routes like https://api.runpod.ai/v2/<YOUR ENDPOINT ID>/openai/v1. How is this done? When I look at the source code, I see that the worker gets special keys passed to its rp handler, such as job_input.openai_route. Where does this key come from?
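
For context, this is roughly the pattern I'm trying to replicate. It's a minimal sketch, not the actual worker-vllm code; `openai_route` is the key I see referenced in that worker's source, and everything else (the extra key names, the dispatch logic) is just my guess at how it fits together:

```python
import runpod


def handler(job):
    """Generator handler so results can be streamed back chunk by chunk."""
    job_input = job["input"]

    # In worker-vllm, requests sent to the endpoint's /openai/v1/... path
    # seem to arrive with this extra key already present in the input.
    # Where it gets injected is exactly what I'm asking about.
    openai_route = job_input.get("openai_route")  # e.g. "/v1/chat/completions" (my assumption)

    if openai_route:
        # Dispatch to an OpenAI-compatible code path and stream chunks back.
        yield {"handled_route": openai_route}
    else:
        # Fall back to the plain RunPod input format.
        yield {"echo": job_input}


runpod.serverless.start({
    "handler": handler,
    # Needed so the /run and /runsync endpoints still return a full result
    # even though the handler is a streaming generator.
    "return_aggregate_stream": True,
})
```

What I can't figure out is the other half: how requests to `.../openai/v1/...` get turned into jobs with that `openai_route` key set, and whether I can define my own custom route prefixes the same way.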