Distributed training

Is it possible to set up a Slurm cluster for distributed training on Runpod?
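
For context, the usual pattern a Slurm-managed training job relies on is that each task reads its rank and world size from Slurm-provided environment variables and joins a process group. The sketch below illustrates that pattern in Python, assuming PyTorch with NCCL is installed on each node and the script is launched via `srun`; whether Slurm itself can be installed across Runpod pods is an open assumption here, not something confirmed by this page.

```python
# Minimal sketch: initialize torch.distributed from Slurm-provided environment
# variables. Assumes launch via `srun python train.py` and that MASTER_ADDR and
# MASTER_PORT were exported in the batch script (hypothetical setup, not a
# confirmed Runpod workflow).
import os

import torch
import torch.distributed as dist


def init_from_slurm() -> None:
    # Slurm exposes each task's global rank, total task count, and node-local ID.
    rank = int(os.environ["SLURM_PROCID"])
    world_size = int(os.environ["SLURM_NTASKS"])
    local_rank = int(os.environ["SLURM_LOCALID"])

    # Join the process group using the rendezvous info from the environment.
    dist.init_process_group(
        backend="nccl",
        init_method="env://",
        rank=rank,
        world_size=world_size,
    )
    # Bind this process to its local GPU.
    torch.cuda.set_device(local_rank)


if __name__ == "__main__":
    init_from_slurm()
    print(f"rank {dist.get_rank()} / {dist.get_world_size()} initialized")
    dist.destroy_process_group()
```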