galakurpismo3
RunPod
Created by galakurpismo3 on 12/10/2024 in #⚡|serverless
I don't know why my serverless balance goes down
I recommend using a Zendesk ticket, yeah, it's really helpful. I lowered the maximum execution time on that endpoint, so if a worker gets an error and stays running for too long, this shuts it down.
15 replies
Hi, I opened the issue on Zendesk too and I think we solved it. But I have another question. I want to process a video using an AI model. To make this quicker, I split the video into clips and send each clip to a worker, so the clips are processed in parallel. So, for example, if I split the video into 10 clips and send them to 10 workers, is it more expensive than sending it to 5 workers?
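A rough back-of-the-envelope sketch, assuming serverless bills per worker-second at a flat GPU rate (the rate, total compute time, and cold-start numbers below are made-up placeholders, not RunPod's actual pricing): splitting across more workers keeps the total billed compute roughly the same and mainly adds one cold start per extra worker, while cutting wall-clock time.

```python
# Hypothetical cost comparison under per-second billing.
RATE_PER_SEC = 0.00044     # placeholder $/s for one worker
TOTAL_COMPUTE_SEC = 600    # total GPU time the whole video needs
COLD_START_SEC = 10        # placeholder startup overhead per worker

def cost(workers: int) -> float:
    # Work is split evenly; each worker also bills its own cold start.
    per_worker_sec = TOTAL_COMPUTE_SEC / workers + COLD_START_SEC
    return workers * per_worker_sec * RATE_PER_SEC

def wall_time(workers: int) -> float:
    # All workers run in parallel, so wall time is one worker's share.
    return TOTAL_COMPUTE_SEC / workers + COLD_START_SEC

for n in (5, 10):
    print(f"{n} workers: ~${cost(n):.4f}, ~{wall_time(n):.0f}s wall time")
```

Under these assumptions, 10 workers cost only slightly more than 5 (the extra cold starts) but finish in roughly half the time.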
RunPod
Created by galakurpismo3 on 7/15/2024 in #⚡|serverless
Can't use GPU with Jax in serverless endpoint
I tried filtering for CUDA 12.1 and nothing changed
55 replies
But I'll try again with that
Actually, no, sorry, but the logs showed that CUDA 12.1 was running
let me know if you test anything or need anything
hi, here is a simple version of the worker: https://github.com/galakurpi/yekar_coaches_point_tracking_simple

To test it, send the video link I have in this code, in that same format:

```python
import requests

url = 'https://api.runpod.ai/v2/sd1ylpcd55dj12/run'
data = {
    'input': {
        'video_url': 'https://drive.google.com/uc?export=download&id=1SER_MwYt0XyOHOX0UbN30iyMCmeWE-dd'
    }
}
headers = {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer <RUNPOD API KEY MISSING>'  # If authentication is needed
}

response = requests.post(url, json=data, headers=headers)
print(response.json())
```

thank you
I can share it with you but it's not simple to test, I'll try to share a simplified version
I'll try this, I'll tell you if it works. Thanks a lot for helping! `nvidia/cuda:12.1.0-cudnn8-devel-ubuntu20.04`
aah okay, I'll try 11.8 too, thank you
what does this mean?
And for JAX I do this to install it: `RUN pip install --upgrade "jax[cuda12_local]"`
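For what it's worth, a sketch of how those two pieces fit together in a Dockerfile. The `cuda12_local` extra makes JAX use the CUDA and cuDNN libraries already present in the base image, so the image's versions have to match what that JAX release expects. The pip install step and the commented alternative are assumptions on my side, not something confirmed in this thread:

```dockerfile
# Sketch only: this base image bundles CUDA 12.1 and cuDNN 8, which
# "jax[cuda12_local]" will look for at runtime.
FROM nvidia/cuda:12.1.0-cudnn8-devel-ubuntu20.04

# The devel image ships without Python; install pip first (assumption:
# Ubuntu 20.04's stock Python 3 is acceptable here).
RUN apt-get update && apt-get install -y python3-pip && rm -rf /var/lib/apt/lists/*

RUN pip install --upgrade "jax[cuda12_local]"

# Alternative to sidestep version mismatches: let pip install its own
# CUDA/cuDNN wheels instead of relying on the image's copies:
# RUN pip install --upgrade "jax[cuda12]"
```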
Yeah, I tried with all GPUs now
It looks like an issue with VS Code there, I don't know if it would be related. I've tried with all GPUs and I get the same error every time.
Hi, I think that it worked, but there is a new error now, related to cuDNN I think. These are the logs:

```
--- Starting Serverless Worker |  Version 1.6.0 ---
{"requestId": "cbeb73b4-8679-43d1-aaa0-8c68101e76ac-e1", "message": "Started.", "level": "INFO"}
Get inside input_fn
xla_bridge.py       :889  Unable to initialize backend 'rocm': module 'jaxlib.xla_extension' has no attribute 'GpuAllocatorConfig'
xla_bridge.py       :889  Unable to initialize backend 'tpu': INTERNAL: Failed to open libtpu.so: libtpu.so: cannot open shared object file: No such file or directory
inference.py        :172  Found device: cuda:0
inference.py        :176  JAX is not using the GPU. Check your JAX installation and environment configuration.
inference.py        :177  JAX backend: gpu
inference.py        :182  CUDA_VISIBLE_DEVICES: 0,1
inference.py        :183  LD_LIBRARY_PATH: /opt/venv/lib/python3.9/site-packages/cv2/../../lib64:/usr/local/cuda/lib64:/usr/local/cuda/extras/CUPTI/lib64:/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/local/nvidia/lib:/usr/local/nvidia/lib64
inference.py        :187  libcudart.so loaded successfully.
inference.py        :189  libcudnn.so loaded successfully.
inference.py        :143  Read and resized video, number of frames: 107
E0716  cuda_dnn.cc:535 Could not create cudnn handle: CUDNN_STATUS_INTERNAL_ERROR
E0716  cuda_dnn.cc:539 Memory usage: 84536328192 bytes free, 84986691584 bytes total.
E0716  cuda_dnn.cc:535 Could not create cudnn handle: CUDNN_STATUS_INTERNAL_ERROR
E0716  cuda_dnn.cc:539 Memory usage: 84536328192 bytes free, 84986691584 bytes total.
inference.py        :162  Error during processing: FAILED_PRECONDITION: DNN library initialization failed. Look at the errors above for more details.
{"requestId": "cbeb73b4-8679-43d1-aaa0-8c68101e76ac-e1", "message": "Finished.", "level": "INFO"}
```

I've tried with a 24GB GPU and an 80GB GPU. I'm using this base image: `FROM nvidia/cuda:12.0.0-cudnn8-devel-ubuntu20.04`
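In case it helps: a `CUDNN_STATUS_INTERNAL_ERROR` during handle creation, with almost all memory reported free, is sometimes caused by XLA preallocating nearly the whole GPU before cuDNN can grab its workspace. A minimal sketch of one common workaround using JAX's documented XLA environment flags; whether it fixes this particular worker is an assumption, not something verified in this thread:

```python
import os

# These flags only take effect if set before JAX is imported.
os.environ["XLA_PYTHON_CLIENT_PREALLOCATE"] = "false"  # allocate GPU memory on demand
os.environ["XLA_PYTHON_CLIENT_MEM_FRACTION"] = ".7"    # cap at 70% if preallocating

# import jax  # import only after the flags are set, so XLA sees them
```

These could also be set with `ENV` lines in the Dockerfile instead of in Python.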
okay, I'll try yes, thank you
okay, in the dockerfile, right?
OK, I'll run that command from the Python code at the beginning and add your suggestion too