Container proxy deployed to Railway is blocked by OpenAI API, but fine when running on my laptop

I have a container-based proxy that makes simple calls to the OpenAI API. It works fine when I run it locally on my laptop, but when I deploy it to Railway, OpenAI's API seems to block requests from the container, and I see the error below:

```
raise error.Timeout("Request timed out: {}".format(e)) from e
openai.error.Timeout: Request timed out: HTTPConnectionPool(host='whylogs-container.up.railway.app', port=8000): Max retries exceeded with url: /v1/chat/completions (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x10695a9d0>, 'Connection to whylogs-container.up.railway.app timed out. (connect timeout=600)'
```

Any ideas on what may be causing this? I wrote a simple Python script to test it. If I replace the host below with 'localhost', it works fine while the container is running on my laptop, but it doesn't work with the same container deployed to Railway.

```python
import openai

openai.api_key = "<replace with your Open AI API KeyXXXXX>"
openai.api_base = "http://whylogs-container.up.railway.app:8000/v1"

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "hello world"}
    ]
)
print(response)
```
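One way to narrow this down: the traceback shows the TCP connection to the Railway hostname itself timing out, which means the request never reached the proxy (let alone OpenAI). A plain socket check can confirm whether the deployment is reachable at all. This is a minimal sketch; the hostname and port 8000 are taken from the traceback above, and checking 443 as well is an assumption based on Railway exposing deployments over HTTPS rather than a raw port.

```python
import socket


def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failures, refusals, and timeouts
        return False


# Hostname and port taken from the traceback; 443 is the assumed HTTPS edge.
host = "whylogs-container.up.railway.app"
print("port 8000 reachable:", can_connect(host, 8000))
print("port 443 reachable:", can_connect(host, 443))
```

If port 8000 is unreachable but 443 is, the failure is between the client and the Railway deployment (e.g. the proxy is only exposed via HTTPS on the public domain), not an OpenAI-side block.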
5 Replies
Percy · 12mo ago
Project ID: N/A
young90803 · 12mo ago
N/A
Brody · 12mo ago
I'm sorry to say, but if OpenAI is blocking the shared IPs that Railway uses, there's not much Railway can do.
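If shared-IP blocking is the suspicion, one way to check is to run a request from inside the deployed container and see which public egress IP it presents. A minimal sketch, assuming `api.ipify.org` as an example echo service (not anything Railway- or OpenAI-specific); whether OpenAI actually blocks that address can't be verified from this script alone.

```python
import urllib.request


def egress_ip(timeout: float = 10.0) -> str:
    """Return the public IP this process egresses from, as seen by an echo service."""
    with urllib.request.urlopen("https://api.ipify.org", timeout=timeout) as resp:
        return resp.read().decode().strip()


if __name__ == "__main__":
    # Run this inside the Railway container, then compare with the IP
    # seen when the same code runs on your laptop.
    print("egress IP:", egress_ip())
```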
young90803 · 12mo ago
Is this a known issue? Have you seen anyone else run into this? I can't find much detail in public forums.
Brody · 12mo ago
First I've heard of any users reporting this.