Axolotl Fine Tune Error (flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol)
Runpod • 12mo ago • solanotodeschini
Hi! I was using the Axolotl image for fine-tuning successfully, but now I'm getting this error:

flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol

Everything was working normally until yesterday. I'm following the steps in the fine-tuning tutorial:
https://docs.runpod.io/tutorials/pods/fine-tune-llm-axolotl#using-a-hugging-face-dataset
Fine tune an LLM with Axolotl on RunPod | RunPod Documentation
Learn how to fine-tune large language models with Axolotl on RunPod, a streamlined workflow for configuring and training AI models with GPU resources, and explore examples for LLaMA2, Gemma, LLaMA3, and Jamba.
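The thread itself contains no answer, but an "undefined symbol" error from `flash_attn_2_cuda.*.so` is typically an ABI mismatch: the installed flash-attn wheel was built against a different PyTorch/CUDA version than the one now in the image. A minimal first diagnostic step is to compare the installed versions of the two packages. This is a sketch of my own, not from the tutorial; the `installed_version` helper is a hypothetical name:

```python
from importlib import metadata

def installed_version(pkg: str) -> str:
    """Return the installed version of pkg, or 'not installed'."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return "not installed"

# An "undefined symbol" in flash_attn_2_cuda.*.so usually means flash-attn
# was built against a different torch/CUDA than the one currently installed,
# so the first step is simply to see which versions are present.
for pkg in ("torch", "flash-attn"):
    print(pkg, installed_version(pkg))
```

If the versions look mismatched (for example, torch was silently upgraded inside the image), the usual remedy is to rebuild flash-attn against the currently installed torch, e.g. `pip uninstall -y flash-attn` followed by `pip install --no-build-isolation flash-attn` — though the right fix depends on which package in the image actually changed.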