...
Please fill out the following form. Make sure to follow the link on the application confirmation page to request an actual system account.
Submit Tech-Support Ticket
Please submit a tech-support ticket to the admin team.
Join HAL Slack User Group
Please join the HAL Slack user group.
Check System Status
Please visit the following website to monitor real-time system status.
User Guide
Connect to HAL Cluster
There are two methods to log on to the HAL system. The first is to SSH from a terminal:
```
ssh <username>@hal.ncsa.illinois.edu
```
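If you log in frequently, an SSH host alias saves typing. A minimal sketch of a `~/.ssh/config` entry — the `Host` alias and the username `myuser` are placeholders, not part of the HAL documentation:

```
# ~/.ssh/config -- "myuser" is a placeholder for your HAL username
Host hal
    HostName hal.ncsa.illinois.edu
    User myuser
```

With this entry in place, `ssh hal` is equivalent to the full command above.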
...
Submit Jobs Using Slurm Wrapper Suite (Recommended)
Submit an interactive job:

```
swrun -p gpux1
```
Submit a batch job:

```
swbatch run_script.swb
```
...
Submit Jobs Using Native Slurm
Submit an interactive job using Slurm directly:

```
srun --partition=gpux1 --pty --nodes=1 --ntasks-per-node=12 \
     --cores-per-socket=3 --threads-per-core=4 --sockets-per-node=1 \
     --gres=gpu:v100:1 --mem-per-cpu=1500 --time=4:00:00 \
     --wait=0 --export=ALL /bin/bash
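The topology flags in the `srun` command fit together: 1 socket per node × 3 cores per socket × 4 threads per core gives 12 CPUs, which matches `--ntasks-per-node=12`, and at `--mem-per-cpu=1500` the job requests 18000 MB per node in total. A quick sanity check of that arithmetic (plain shell, not a Slurm command):

```shell
# Reproduce the CPU and memory totals implied by the srun topology flags
sockets=1
cores_per_socket=3
threads_per_core=4
mem_per_cpu=1500   # MB, from --mem-per-cpu=1500

cpus=$((sockets * cores_per_socket * threads_per_core))
total_mem=$((cpus * mem_per_cpu))

echo "CPUs per node:   $cpus"           # 12, matches --ntasks-per-node=12
echo "Memory per node: $total_mem MB"   # 18000 MB
```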
Submit a batch job using Slurm directly:

```
sbatch run_script.sb
```
...
Submit Jobs Using HAL OnDemand (New)
Submit Interactive Job to HAL
There are two ways to submit interactive jobs to the HAL system. The first is to use the Slurm Wrapper Suite:
```
swrun -p gpux1
```
...
Log in with your own username and password.
```
srun --partition=gpux1 --pty --nodes=1 --ntasks-per-node=12 \
     --cores-per-socket=3 --threads-per-core=4 --sockets-per-node=1 \
     --gres=gpu:v100:1 --mem-per-cpu=1500 --time=2:00:00 \
     --wait=0 --export=ALL /bin/bash
```
Submit Batch Job to HAL
There are two ways to submit batch jobs to the HAL system. The first is to use the Slurm Wrapper Suite:
```
swbatch run_script.swb
```
The run_script.swb example:
```
#!/bin/bash
#SBATCH --job-name="hostname"
#SBATCH --output="hostname.%j.%N.out"
#SBATCH --error="hostname.%j.%N.err"
#SBATCH --partition=gpux1

srun /bin/hostname  # this is our "application"
```
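In the `--output` and `--error` patterns, `%j` expands to the job id and `%N` to the node name, so each run writes to its own files. A small illustration of the expansion with hypothetical values — Slurm performs this substitution itself; the `sed` call below only mimics it:

```shell
# Mimic Slurm's %j (job id) / %N (node name) filename substitution
jobid=12345        # hypothetical job id
node=hal01         # hypothetical node name
pattern="hostname.%j.%N.out"

outfile=$(printf '%s' "$pattern" | sed -e "s/%j/$jobid/" -e "s/%N/$node/")
echo "$outfile"    # hostname.12345.hal01.out
```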
and the second method is to submit with Slurm directly:

```
sbatch run_script.sb
```
The run_script.sb example:
...
...
```
https://hal.ncsa.illinois.edu:8888
```
Files Apps
Jobs Apps
Clusters Apps
Documentation
...
System Overview
...
| Manufacturer | Software Package | Version |
|---|---|---|
| IBM | RedHat Linux | 7.6 |
| NVidia | CUDA | 10.1.105 |
| NVidia | PGI Compiler | 19.4 |
| IBM | Advance Toolchain | 12.0 |
| IBM | XLC/XLF | 16.1.1 |
| IBM | PowerAI | 1.6.1 |
| SchedMD | Slurm | 19.05.2 |
| OSC | Open OnDemand | 1.6.20 |
Job Management with Slurm
Frequently Asked Questions
...