using-slurm
All output is directed to ''…''.
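By default SLURM writes a batch job's standard output and standard error to a single file named ''slurm-<jobid>.out'' in the directory you submitted from. If you want named files instead, the standard SLURM ''-o'' and ''-e'' directives can be added to the batch script, for example (illustrative filenames):

<code>
# illustrative only: write stdout and stderr to separate files, %j expands to the job ID
#SBATCH -o myjob.%j.out
#SBATCH -e myjob.%j.err
</code>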

==== Run job using a GPU ====
Currently there is only one compute node with GPU capabilities. As of February 2026, compute06 has an NVIDIA Tesla V100 with 32GB of RAM. In order to use it you will need to add an extra "generic resource" request (''--gres'') to your batch script and target compute06 explicitly with ''-w''.

For example, to run ''beast'' (with the BEAGLE library) on the GPU:

<code>
#!/usr/bin/env bash
#SBATCH -p batch
#SBATCH -w compute06
# request one GPU
#SBATCH --gres=gpu:1
#SBATCH -n 8
#SBATCH -J beast-GPU

# load module(s); pin specific versions if you need them
module load beagle
module load beast

beast -beagle_info
</code>
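Submit the script with ''sbatch'' as usual. Once the job starts you can confirm that a GPU was actually allocated, for example (the script filename below is hypothetical):

<code>
# submit the batch script (hypothetical filename)
sbatch beast-gpu.sbatch

# check which node the job landed on
squeue -u $USER

# request a quick interactive GPU allocation and run nvidia-smi to see the Tesla V100
srun -p batch -w compute06 --gres=gpu:1 nvidia-smi
</code>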

==== Check queue status ====