People! Gemini 2.5 Flash is available in the prompt playground now!!
Route to the latest models from OpenAI - `o3` & `o4-mini` - through Portkey! Available on both the `/chat/completions` and `/responses` endpoints.
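For anyone wiring this up in code, here's a minimal sketch using the OpenAI Python SDK pointed at the Portkey gateway (the header names follow Portkey's `x-portkey-*` convention; the key values are placeholders to adapt to your own setup):

```python
# Minimal sketch: route o3 / o4-mini through the Portkey gateway.
# Requires a recent `openai` package that includes the Responses API.
from openai import OpenAI

client = OpenAI(
    api_key="not-used-here",  # provider auth comes from the virtual key below
    base_url="https://api.portkey.ai/v1",
    default_headers={
        "x-portkey-api-key": "YOUR_PORTKEY_API_KEY",
        "x-portkey-virtual-key": "YOUR_OPENAI_VIRTUAL_KEY",
    },
)

# /chat/completions
chat = client.chat.completions.create(
    model="o3",
    messages=[{"role": "user", "content": "Give me a one-line release note."}],
)
print(chat.choices[0].message.content)

# /responses
resp = client.responses.create(
    model="o4-mini",
    input="Give me a one-line release note.",
)
print(resp.output_text)
```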
We've redesigned the Portkey sidebar and put all the features under 3 core categories:
🔥 o3-mini is now available on Portkey! Use it across both OpenAI & Azure OpenAI seamlessly
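If you want to see what "seamlessly" looks like in practice, here's a rough sketch: the same o3-mini call routed to either provider just by swapping the virtual key (the key names below are hypothetical placeholders):

```python
# Sketch: identical request body, different provider behind the virtual key.
from openai import OpenAI


def ask_o3_mini(virtual_key: str, prompt: str) -> str:
    client = OpenAI(
        api_key="not-used-here",  # provider auth lives in the virtual key
        base_url="https://api.portkey.ai/v1",
        default_headers={
            "x-portkey-api-key": "YOUR_PORTKEY_API_KEY",
            "x-portkey-virtual-key": virtual_key,
        },
    )
    out = client.chat.completions.create(
        model="o3-mini",  # assumption: the Azure virtual key is configured with an o3-mini deployment
        messages=[{"role": "user", "content": prompt}],
    )
    return out.choices[0].message.content


print(ask_o3_mini("openai-vk-xxxx", "Say hi in five words."))
print(ask_o3_mini("azure-openai-vk-xxxx", "Say hi in five words."))
```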

🚀 Just shipped: **Audit Logs**!!

@everyone We are starting the webinar for LLMs in Prod report now! https://meet.google.com/epk-cctk-
If you're building any AI app that uses RAG, you need to check out the Perplexity API — they have so
🔐 Managing developer access just got easier on Portkey!

The little things: Code blocks in Portkey logs now show their language type. Small upgrade, nicer experience.

We also mapped out the whole MCP ecosystem and put together all available MCP servers, from reference implementations...
@everyone Announcing Portkey's own MCP Client!
The open-source Gateway now spins up a *mini console* that logs all of your requests in one place...

New: Programmatically create new virtual keys that refer to your Azure deployments.
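A rough sketch of what that could look like. The Admin API route and payload fields below are assumptions, so check the API reference for the exact contract before using this:

```python
# Rough sketch only: endpoint path and payload fields are assumptions.
import requests

resp = requests.post(
    "https://api.portkey.ai/v1/virtual-keys",  # assumed Admin API route
    headers={"x-portkey-api-key": "YOUR_PORTKEY_ADMIN_API_KEY"},
    json={
        "name": "azure-gpt4o-prod",            # hypothetical virtual key name
        "provider": "azure-openai",
        "key": "YOUR_AZURE_OPENAI_API_KEY",
        "resourceName": "my-azure-resource",   # hypothetical Azure resource
        "deploymentName": "gpt-4o-prod",       # hypothetical deployment name
        "apiVersion": "2024-06-01",
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```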
New on Portkey prompts: Move multiple prompts to another folder or delete them.
@here Join in for our weekly office hours on https://discord.gg/J2mnVkWddu
**PROMPT FOLDERS ARE NOW LIVE**
Anthropic's prompt caching feature is now supported on the prompt playground! Just use the `Cache Control` setting in the UI.
Special thanks to @rickydickydoo for championing this feature!...
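Under the hood this corresponds to Anthropic's `cache_control` block. Here's a minimal sketch of what the Cache Control setting maps to at the API level (the mapping is an assumption; the model name and placeholder text are illustrative):

```python
# Sketch: marking a large system block as cacheable with Anthropic's prompt caching.
import anthropic

client = anthropic.Anthropic(api_key="YOUR_ANTHROPIC_API_KEY")

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=256,
    system=[
        {
            "type": "text",
            "text": "<a few thousand tokens of reusable context, e.g. your product docs>",
            "cache_control": {"type": "ephemeral"},  # this block gets cached across calls
        }
    ],
    messages=[{"role": "user", "content": "Summarize the docs above in two lines."}],
)
print(message.content[0].text)
```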
Latest production-ready Gemini Pro & Flash models are now available on Portkey: https://new.portkey.
News that people here may like: We are participating in Hacktoberfest and giving away AirPods Pro + ... Join `hacktoberfest` and start contributing!
https://git.new/portkey...
The latest open-source Llama 3.2 models - 1B & 3B Instruct and 11B & 90B Vision - are now live on Portkey!
