Platform usage for jailbreak research
Hi there, I want to build a platform that will allow researchers to conduct jailbreak tests against open-source models in a controlled environment. The results would be used to create datasets that can be shared with AI labs to improve their coverage of jailbreak mitigations. Is this something that would be allowed under your terms of service?
5 Replies
I expect this to be allowed — the "no breaking" clause tends to mean don't hack into their infrastructure. Jailbreaking is common practice in the LLM world
If it wasn't allowed the majority of my users are in violation haha
As long as you're just prompting your stuff and they operate within your environment, it's fine
Confirming :) We love AI/ML Researchers, and anything in this field is permitted.
Makes an AI bot that submits random believable support tickets. DJ be like "maybe not anything" 😛
Brilliant thank you all!