✨ New: Portkey's Guardrails can now detect & validate multiple JSON objects within code blocks and in plain text.
We just shipped improved functionality for JSON Keys & JSON Schema Guardrail checks:
:a_tick: Detect JSON within code blocks and in plain text
:a_tick: Improved JSON extraction to handle multiple JSON objects within a single response
:a_tick: Enhanced reporting that returns more informative data objects, including the matched JSON, present/missing keys (JSON Keys plugin), validation errors (JSON Schema plugin), and explanatory messages for both successful and failed validations
If you don't have access to the Guardrails feature, just DM me to enable it for your org!
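To make the behavior concrete, here is a minimal sketch of how multi-object JSON extraction from code blocks and plain text can work, plus a keys report in the spirit of the new output. The function names and report field names are illustrative assumptions, not Portkey's actual implementation.

```python
import json
import re

def extract_json_objects(text: str) -> list:
    """Find every JSON object in a response, whether it appears
    inside a ``` code block or in plain text."""
    # Strip fence markers so code-block JSON and plain-text JSON
    # can be scanned uniformly.
    cleaned = re.sub(r"```(?:json)?", "", text)

    decoder = json.JSONDecoder()
    found, pos = [], 0
    while True:
        start = cleaned.find("{", pos)
        if start == -1:
            break
        try:
            # raw_decode parses one JSON value and tells us where it ends,
            # so multiple objects in a single response are all captured.
            obj, end = decoder.raw_decode(cleaned[start:])
            found.append(obj)
            pos = start + end
        except json.JSONDecodeError:
            pos = start + 1  # not valid JSON here; keep scanning

    return found

def report_keys(obj: dict, required: list) -> dict:
    """Hypothetical report shape for a JSON Keys-style check."""
    return {
        "matchedJSON": obj,
        "presentKeys": [k for k in required if k in obj],
        "missingKeys": [k for k in required if k not in obj],
    }
```

For example, `extract_json_objects('```json\n{"a": 1}\n``` plus inline {"b": 2}')` returns both objects, and `report_keys` then flags which required keys each one is missing.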
New Integration: Inference.net (wholesaler of LLM inference tokens)
Portkey is now integrated with the Inference.net API, which is a "wholesaler" of LLM inference tokens for open source models like Llama 3.
Inference.net is 50-90% cheaper than leading inference providers in the market, and could be especially useful for batch inference jobs!
Check out the integration docs here.
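As a rough sense of what "50-90% cheaper" means for a batch job, here is a back-of-the-envelope estimate. The prices below are placeholders for illustration only; check each provider's current pricing page for real numbers.

```python
# Hypothetical prices, for illustration only.
leading_price_per_mtok = 1.00   # $ per 1M tokens at a leading provider
discount = 0.70                 # Inference.net quotes 50-90% cheaper; assume 70%

batch_tokens = 500_000_000      # a 500M-token batch inference job
baseline_cost = batch_tokens / 1_000_000 * leading_price_per_mtok
discounted_cost = baseline_cost * (1 - discount)

print(f"baseline ${baseline_cost:,.0f} -> discounted ${discounted_cost:,.0f}")
```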

oh btw, OpenAI's new o1-preview and o1-mini models are already supported on Portkey.
Compared to gpt-4o, these models reason for much longer before answering, and handle questions like "how many r's in the word strawberry?" or "how many words are in your output?" exceptionally well.
They work significantly better on math, science, puzzle-solving, and coding tasks.
Any OpenAI user on Usage Tier 5 can already start playing with them!
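For reference, the ground truth for that famous example question is a one-liner to check:

```python
# Count occurrences of "r" in "strawberry" -- the answer models often get wrong.
print("strawberry".count("r"))
```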
Thrilled to share that we are partnering with MongoDB to help companies take their AI apps to production!
This partnership supercharges your AI dev cycle:
1️⃣ MongoDB becomes your one-stop shop for all your LLM logs, telemetry, and embeddings. On your cloud, under your control.
2️⃣ Portkey becomes your central hub for LLM requests and API governance via our open source AI Gateway.
🤝 Together, they give you secure, scalable, and private deployments to propel your AI apps into production.
More on the partnership: https://cloud.mongodb.com/ecosystem/portkey-ai
Portkey + MongoDB docs: https://docs.portkey.ai/docs/product/enterprise-offering/components/log-store/mongodb

The latest open source Llama 3.2 models (1B & 3B Instruct, 11B & 90B Vision) are now live on Portkey 🔥
✅ Switch your OpenAI & Anthropic vision requests to the cheaper & faster Llama models instantly
✅ Mitigate API failures with production-grade fallbacks
✅ Create and deploy Llama prompt templates with ease
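For the fallback point above, here is a sketch of what a routing config could look like. The strategy/targets shape follows Portkey's documented gateway-config format, but treat the virtual keys and model names here as illustrative assumptions, not a copy-paste recipe.

```python
# Try Llama 3.2 Vision first; fall back to gpt-4o if the request fails.
# Virtual keys and model names below are hypothetical placeholders.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {
            "virtual_key": "llama-vision-key",  # hypothetical virtual key
            "override_params": {"model": "llama-3.2-90b-vision-instruct"},
        },
        {
            "virtual_key": "openai-key",  # hypothetical virtual key
            "override_params": {"model": "gpt-4o"},
        },
    ],
}
```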

News that people here may like: We are participating in Hacktoberfest and giving away Airpods Pro + some select Portkey swag to contributors!
Just head over to the Gateway repo, look for issues tagged with hacktoberfest, and start contributing!
https://git.new/portkey
Latest production-ready Gemini Pro & Flash models are now available on Portkey: https://new.portkey.ai/announcements/gemini-002-models-now-available-on-portkey
Key Improvements:
* 50% reduced pricing on 1.5 Pro (for prompts <128K)
* Increased rate limits: 2x higher on 1.5 Flash & 3x higher on 1.5 Pro
* Performance gains: 2x faster output generation & 3x lower latency
Anthropic's prompt caching feature is now supported on the prompt playground!
Set any message to be cached by just toggling the Cache Control setting in the UI.
Special thanks to @rickydickydoo for championing this feature!
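Under the hood, Anthropic's prompt caching works by marking a prompt block with cache_control. Here is a sketch of the request body the UI toggle maps to; the exact payload Portkey sends is an assumption, and the model name is just an example.

```python
# Anthropic Messages API body with a cached system block.
# The "ephemeral" cache type is Anthropic's documented value for prompt caching.
request_body = {
    "model": "claude-3-5-sonnet-20240620",
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": "<a long, reusable system prompt>",
            "cache_control": {"type": "ephemeral"},  # cache this block
        }
    ],
    "messages": [{"role": "user", "content": "Hello!"}],
}
```

Subsequent requests that reuse the same cached block can skip reprocessing those tokens, which is where the latency and cost savings come from.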
PROMPT FOLDERS ARE NOW LIVE
- Organize your prompts in folders
- Move prompts from one folder to another
Try it out now!