Is Runpod's Faster Whisper Set Up Correctly for CPU/GPU Use?
Runpod • 16mo ago • 1 reply
yccheok
Hi, I'm currently using the Faster Whisper worker provided by Runpod:
https://github.com/runpod-workers/worker-faster_whisper

While reviewing the code, I found something confusing:
https://github.com/runpod-workers/worker-faster_whisper/blob/main/builder/fetch_models.py#L15

Is there a specific reason for using "cpu" instead of "gpu"? Thanks!
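For context, faster-whisper's `WhisperModel` accepts `device="cpu"` or `device="cuda"` (the CTranslate2 backend does not recognize a literal `"gpu"` value), and a build-time fetch script only needs to download model weights, which works fine on CPU since no GPU is typically present during a Docker image build. A minimal sketch of the usual device-selection pattern at inference time (the `pick_device` helper is illustrative, not Runpod's actual handler code):

```python
def pick_device(cuda_available: bool) -> tuple[str, str]:
    """Choose a (device, compute_type) pair for faster_whisper.WhisperModel.

    CTranslate2 expects "cuda" or "cpu"; "gpu" is not a valid value,
    which may be why the fetch script sticks to "cpu" for downloads.
    """
    if cuda_available:
        return "cuda", "float16"  # fast GPU inference
    return "cpu", "int8"          # quantized fallback for CPU-only hosts


# Hedged usage sketch (requires the faster-whisper package and,
# for the availability check, torch):
# import torch
# from faster_whisper import WhisperModel
# device, compute_type = pick_device(torch.cuda.is_available())
# model = WhisperModel("base", device=device, compute_type=compute_type)
```

The split between a CPU-only fetch step at build time and a GPU device at inference time is a common pattern for serverless workers: baking the weights into the image avoids a cold-start download, while the runtime code still selects CUDA when a GPU is attached.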
Similar Threads
- Faster Whisper Latency is High (Runpod / ⚡|serverless, 2y ago)
- 0% GPU utilization and 100% CPU utilization on Faster Whisper quick deploy endpoint (Runpod / ⚡|serverless, 2y ago)
- Can we use serverless faster Whisper for local audio? (Runpod / ⚡|serverless, 2y ago)
- Failed Faster-Whisper task (Runpod / ⚡|serverless, 11mo ago)